
WO2025199487A1 - Systems and methods for an augmented reality guidance system - Google Patents

Systems and methods for an augmented reality guidance system

Info

Publication number
WO2025199487A1
Authority
WO
WIPO (PCT)
Prior art keywords
needle
smartphone
augmented reality
virtual space
medical needle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2025/020993
Other languages
French (fr)
Inventor
Ming Li
Nicole Varble
Laetitia Saccenti
Bradford Wood
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
US Department of Health and Human Services
Original Assignee
Koninklijke Philips NV
US Department of Health and Human Services
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips NV and US Department of Health and Human Services
Publication of WO2025199487A1

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361Image-producing devices, e.g. surgical cameras
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101Computer-aided simulation of surgical operations
    • A61B2034/102Modelling of surgical devices, implants or prosthesis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101Computer-aided simulation of surgical operations
    • A61B2034/105Modelling of the patient, e.g. for ligaments or bones
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/107Visualisation of planned trajectories or target regions
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2048Tracking techniques using an accelerometer or inertia sensor
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/363Use of fiducial points
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image

Definitions

  • a needle insertion guidance system includes an anatomical model, the anatomical model having an outer surface defining an internal space having representations of internal organs disposed therein; and a smartphone in visual communication with the anatomical model, the smartphone being coupled to a medical needle through a needle guide structure engaged to the smartphone, the smartphone having a camera for communicating real-time images to a display of the smartphone, the display being in operative communication with a processor, wherein the processor is in communication with a memory, the memory including instructions executable by the processor to: execute, at the smartphone, an augmented reality application for providing an augmented reality representation in a three-dimensional virtual space of the anatomical model having digital organs that correspond to the internal organs of the anatomical model, generate, at the three-dimensional virtual space, a planned trajectory and a planned depth for the medical needle, overlay, by the augmented reality application, the real-time images of the medical needle with the three-dimensional virtual space of the anatomical model, and display, in the three-dimensional virtual space, an orientation of the medical needle relative to the planned trajectory and planned depth within the anatomical model for providing visual feedback to a user.
  • the memory further includes instructions executable by the processor to: calibrate the three-dimensional virtual space representing the anatomical model and digital organs with the medical needle coupled to the needle guide structure engaged to the smartphone, spawn a virtual needle in the three-dimensional virtual space, and manipulate the virtual needle in the three-dimensional virtual space with respect to the digital organs displayed on the smartphone, wherein the planned trajectory of the medical needle is defined in the three-dimensional virtual space by superposing the medical needle with the virtual needle.
  • the augmented reality application is operable for providing visual feedback of the orientation and position of the medical needle relative to the planned trajectory in the three-dimensional virtual space displayed on the smartphone.
  • the augmented reality application is also operable for displaying the planned depth of the medical needle along the planned trajectory to a virtual target that is overlaid on real-time images of the medical needle and an entry point defined along the anatomical model, wherein the augmented reality application is operable for displaying the planned depth defined by a plurality of graduations denoted along the planned trajectory.
  • the needle guide structure is positioned along the smartphone such that the medical needle is visible on a camera of the smartphone to ensure visibility of the medical needle on the display of the smartphone, wherein the needle guide structure fixes in position the medical needle when coupled to the needle guide structure relative to the planned trajectory defined in the three-dimensional virtual space displayed on the smartphone.
  • the smartphone includes one or more sensors for providing real-time tracking of the position and orientation of the smartphone including the position and orientation of the medical needle coupled to the smartphone relative to the anatomical model in the three-dimensional virtual space displayed on the smartphone.
  • the augmented reality application is also operable for planning the planned trajectory of the medical needle between an entry point defined along the anatomical model and a virtual target within the anatomical model in the three-dimensional virtual space displayed on the smartphone.
  • the augmented reality application is operable for planning the planned trajectory of the medical needle by aligning a projected needle with a digital target in the three-dimensional virtual space displayed on the smartphone.
  • the needle guidance system includes a fiducial in visual communication with the smartphone and the anatomical model for calibrating the three-dimensional virtual space with the medical needle, wherein the augmented reality application is operable to provide registration of the actual path of the medical needle with the planned trajectory of the medical needle in the three-dimensional virtual space displayed on the smartphone.
  • the augmented reality application is also operable for generating a virtual target to establish the planned trajectory and the planned depth by the medical needle.
  • the display of the digital organs within the anatomical model by the augmented reality application is generated by inputting three-dimensional scans of the anatomical model to the augmented reality application and segmenting the inputted three-dimensional scans to generate the three-dimensional virtual space displayed on the smartphone.
  • a method for needle insertion guidance includes registering an augmented reality application operable on a smartphone relative to a target location within an anatomical model using a fiducial positioned outside the anatomical model by moving and rotating a three-dimensional virtual space through the augmented reality application and displayed on the smartphone, planning a planned trajectory of a medical needle coupled to the smartphone through a needle guide structure in the three-dimensional virtual space from an entry point defined along the anatomical model to the target location within the anatomical model in the three-dimensional virtual space, planning a planned depth of the medical needle in the three-dimensional virtual space from the entry point to the target location in the three-dimensional virtual space, and adjusting the planned depth of the medical needle along the planned trajectory relative to the target location in the three-dimensional virtual space, wherein planning the planned trajectory of the medical needle includes aligning a projected needle of the medical needle with a virtual needle generated by the augmented reality application in the three-dimensional virtual space and then aligning the virtual needle with a virtual target defined at the target location in the three-dimensional virtual space.
  • the augmented reality application is operable for changing the color of the planned trajectory from one color to another color when aligning the medical needle with the planned trajectory.
  • the medical needle may be decoupled from the needle guide structure once the planned trajectory and planned depth are established within the three-dimensional virtual space, and wherein the planned depth is adjusted by aligning a tip of the virtual needle to the virtual target after the medical needle is detached from the needle guide structure.
  • FIG. 1 is a simplified block diagram of a needle insertion guidance system having an augmented reality application operable on a smartphone that generates a planned trajectory and planned depth for a medical needle handled by a user for insertion into an anatomical model or patient.
  • FIG. 2A is an image of the needle insertion guidance system with the display of the smartphone showing the process for registering the system with the anatomical model using the fiducial; and FIG. 2B is a screenshot of the augmented reality application showing the MOVE and ROTATE sliders used to register the system with the anatomical model.
  • FIG. 3A is an image of the needle insertion guidance system with the display of the smartphone showing the user planning the planned trajectory of the medical needle in three-dimensional virtual space with the projected needle.
  • FIG. 3B is an image of the needle insertion guidance system showing the user moving the smartphone around when planning the planned trajectory using a projected medical needle displayed in 3D space on the smartphone by the augmented reality application.
  • FIG. 3C is a screenshot of the augmented reality application shown on the smartphone displaying the menu and the planned trajectory of a projected medical needle being planned as the user rotates the smartphone to adjust the planned trajectory to the target.
  • FIG. 4 is an image of the needle insertion guidance system showing the procedure for inserting the medical needle along the needle guide structure coupled to the smartphone for insertion into the anatomical model.
  • FIG. 5 is an image of the needle insertion guidance system showing the medical needle being detached from the needle guide structure.
  • FIGS. 6A and 6B are images of the user verifying the planned trajectory displayed on the smartphone by the augmented reality application from different angles relative to the anatomical model once the medical needle is in contact at the entry point.
  • FIG. 7A is a screenshot of the augmented reality application showing the menu for adjusting depth once the medical needle has been inserted through the entry point and into the anatomical model along the planned trajectory; and
  • FIG. 7B is an image of the user adjusting the depth using the augmented reality application.
  • FIG. 8 is an image of a user employing an “entry point method” by first touching the entry point with the tip of the medical needle.
  • FIG. 9 is an image of the user aligning the virtual entry point with the target to define the planned trajectory in the three-dimensional space using the “entry point method”.
  • FIG. 10 is an image of the user defining the depth of the pathway to the target in the three-dimensional space using the “entry point method”.
  • FIG. 11 is an image of the user placing the tip of the medical needle at the entry point and then aligning the medical needle with the planned trajectory displayed by the augmented reality application on the smartphone using the entry point method.
  • FIG. 12 is a flow chart illustrating the workflow for the quick needle placement method.
  • FIG. 13 is a flow chart illustrating the workflow for the entry point method.
  • FIG. 14 is a screenshot displayed on the smartphone by the augmented reality application used for automatic retracking and manual adjustment functionalities.
  • FIG. 15 is a flow chart illustrating a quick needle placement method for a needle trajectory plan and insertion shown in FIG. 12.
  • FIG. 16 is a flow chart illustrating the entry point method for a separate needle trajectory plan and guidance of FIG. 13.
  • FIGS. 20A-20C are screenshots of the augmented reality application in which the user plans the planned trajectory of the medical needle and then approaches the medical needle to the entry point before locking in the planned trajectory of the medical needle when employing the quick needle placement method.
  • FIG. 21 is a screenshot of the augmented reality application in which the user inserts the medical needle when employing the quick needle placement method.
  • FIG. 22 is an image of the needle insertion guidance system showing the user unlocking the medical needle from the needle guide structure when employing the quick needle placement method.
  • FIG. 23 is a screenshot of the augmented reality application in which the user adjusts the depth of the medical needle when employing the quick needle placement method.
  • FIG. 24A is a screenshot of the augmented reality application in which the user registers the digital organs in the three-dimensional virtual space with the fiducial when employing the entry point method; and FIG. 24B is a screenshot of the augmented reality application in which the user selects the entry point when employing the entry point method.
  • FIGS. 25A and 25B are screenshots of the augmented reality application in which the user plans the planned trajectory of the medical needle when employing the entry point method.
  • FIGS. 26A and 26B are screenshots of the augmented reality application in which the user aligns the medical needle with the planned trajectory and then inserts the medical needle when employing the entry point method.
  • FIG. 27 is a screenshot of the augmented reality application in which the user unlocks the needle from the guide and then adjusts the depth of the medical needle when employing the entry point method.
  • FIG. 28 is an illustration showing the accuracy metrics on a post-procedural computed tomography that are evaluated by measuring the distance from the needle tip to the center of the target (A), the shortest distance from the needle shaft to the center of the target (B), and the angular error (C); a brief geometric sketch of these metrics follows this list.
  • FIG. 29 is a graphical representation of the standardization achieved with the augmented reality guidance system used with the smartphone, showing the comparative accuracy of needle placement using freehand insertion versus the smartphone with the augmented reality guidance system.
  • FIG. 30 is an illustration showing an exemplary computing system for effectuating the functionalities of the augmented reality application.
  • FIGS. 31A and 31B are lateral and oblique images, respectively, of the smartphone with the augmented reality guidance system having a needle guide structure coupled to a medical needle.
  • FIG. 32A is an image showing the registration of the anatomical model with the fiducial reference box; and FIG. 32B is a snapshot of the screen of the smartphone showing the three-dimensional virtual space deployed by the augmented reality application.
  • FIG. 33A is an image that shows the location of the entry point on the anatomical model in the 3D virtual space being recorded by touching the entry point with the tip of the medical needle;
  • FIG. 33B is a snapshot of the screen showing the view generated by the augmented reality application for planning the planned trajectory of the medical needle defined by aligning the entry point (blue dot) with the virtual target (red sphere), at the center of the screen (green cross);
  • FIG. 33C is a snapshot of the screen showing the view generated by the augmented reality application for illustrating the adjustment of the depth of the planned needle path by sliding it (white arrow) until it reaches the virtual target.
  • FIGS. 34A-34C are images of the 3D virtual space shown on the screen of the smartphone that illustrate the real-time feedback of the needle angle (blue needle on the screen) compared to the planned needle path (green needle on the screen).
  • Corresponding reference characters indicate corresponding elements among the views of the drawings. The headings used in the figures do not limit the scope of the claims.
  • an augmented reality guidance system for needle insertion guidance during percutaneous intervention using an image analysis and visualization augmented reality application deployed on a smartphone is disclosed herein.
  • the smartphone is coupled to a needle guide structure for holding and guiding a medical needle, with the smartphone being operable for executing the augmented reality application to track and display the position of the smartphone and the medical needle in three-dimensional space relative to a digital target along a planned trajectory and depth of the medical needle displayed on the smartphone.
  • a quick needle placement method may be employed that generates a needle trajectory plan and insertion in a simplified procedure in which the planned trajectory and the interaction of the medical needle with digital organs are illustrated in a three-dimensional virtual space displayed on a screen of the smartphone by the augmented reality application to show the user the planned trajectory and depth of the medical needle to the target.
  • an entry point method may also be employed in which the user first plans the planned trajectory and planned depth for the medical needle at the patient’s bedside using the augmented reality guidance system. In this method, the operator follows the planned trajectory and depth generated by the augmented reality application and inserts the medical needle within an anatomical model or patient.
  • the planned trajectory and planned depth are displayed on the smartphone for visual reference that allows for real-time feedback as the user aligns a projected needle with the planned trajectory during insertion to a planned depth within the patient or anatomical model.
  • both methods employ a registration process that allows the user to use a fiducial as a 3D reference marker to register the three-dimensional virtual space of the augmented reality application of the smartphone relative to the patient or anatomical model. Further details of the augmented reality guidance system are discussed in greater detail below.
  • the augmented reality guidance system 100 includes an augmented reality application 102 operable on a smartphone 10 for generating a planned path or trajectory 108 and planned depth 106 when inserting a medical needle 16 in relation to digital organs 124 (FIG. 3B), such as the internal organs and other anatomical features of a patient or anatomical model 12 displayed in a three-dimensional virtual space 114 on the smartphone 10.
  • the three-dimensional virtual space 114 is generated by the augmented reality application 102 that is overlaid on real-time images 116 of the medical needle 16 received from a camera 117 of the smartphone 10 and displayed on a screen 112 of the smartphone 10.
  • the smartphone 10 includes one or more sensors, for example a gyro and an inertial measurement unit, for providing the augmented reality application 102 with positional information about the position and orientation of the smartphone 10 and medical needle 16.
  • a slider 110 is displayed on the screen 112 by the augmented reality application 102 that is operable for transitioning the screen 112 of the smartphone 10 from displaying only the three-dimensional virtual space 114 to displaying some degree of transparency between the three-dimensional virtual space 114 overlaid with the real-time images 116 provided by the camera 117 of the smartphone 10 to displaying only the real-time images 116 from the camera of the smartphone 10 as the slider 110 is moved laterally from right to left along the screen 112 or vice versa.
  • the anatomical model 12 may be an artificial anatomical model 12 that is constructed to imitate the human anatomy for training.
  • the augmented reality guidance system 100 may be used on a patient when performing a medical procedure.
  • a fiducial 14 is provided and positioned alongside the anatomical model 12 (or patient) that acts as a 3D reference marker for registering the augmented reality application 102 of the smartphone 10 with the anatomical model 12 or patient before initiating the medical procedure as shall be discussed in greater detail below.
  • the process flows 200, 300, and 400 of the augmented reality guidance system 100 illustrate the various respective methods employed by the user to perform needle insertion guidance and are each divided into a pre-operational phase and an intra-operational phase.
  • in the pre-operational phase, 3D scans of the patient are obtained and then segmented before being communicated to the augmented reality application 102, which generates the digital organs 124 (e.g., skin, organs, lesions, etc.) of the anatomy of an anatomical model 12 (or patient) in constructing the three-dimensional virtual space 114 of the augmented reality application 102.
  • the registration, planning, and guidance steps are undertaken by the user to provide needle insertion guidance based on the various process flows 200, 300 and 400 described below.
  • one method for needle insertion guidance, referred to as a quick needle placement method, is described in the workflow 500 shown in FIG. 12 and the process flow 200 of FIG. 15.
  • the user registers the digital organs 124 in the three-dimensional virtual space 114 with the fiducial 14 as shown in the screenshot in FIG. 18A.
  • FIG. 18B shows an image of the anatomical model 12 and fiducial 14 before the registration process
  • FIG. 18C is a screenshot of the augmented reality application 102 once the registration process has been completed after the retracking option 125 and lock option 150 have been actuated.
  • the user calibrates the three-dimensional virtual space 114 by placing the medical needle 16 in the needle guide structure 104 and then verifies calibration as shown in FIG. 19A, in which the user actuates the spawn option 121 to activate and manipulate the virtual needle 115 with respect to the digital organs 124 in the three-dimensional virtual space 114 displayed on screen 112.
  • the user actuates the define path option 129 and aligns the projected maximum needle path by superposing the medical needle 16 imaged by the camera 117 with the virtual needle 115.
  • FIG. 2A shows the screen 112 of the smartphone 10 during the registration process showing the fiducial 14 and anatomical model 12 in the background.
  • the user plans the planned trajectory 108 of the medical needle 16 shown in FIGS.
  • the user adjusts the planned depth 106 of the medical needle 16 within the anatomical model 12 or patient by actuating the depth slider 113 (FIG. 23B) displayed on screen 112 as shown in FIGS. 7A, 7B, and 23.
  • the process flow 200 proceeds back to block 210 for the user to manually fine-tune the registration again.
  • at decision block 212, if the registration is not off, then the process flow 200 enters the plan and guidance portion of the intra-operational phase described below.
  • the process flow 200 returns to block 208 for the user to again spawn the digital organs 124 close to the patient using the fiducial 14. If the registration is not severely off, then the process flow 200 returns to block 210 for the user to register the digital organs 124 again.
  • the process flow 200 proceeds to decision block 230 for the user to determine whether the registration is off. If the registration is off, the process flow 200 proceeds to decision block 232 to determine whether the registration is severely off. If the registration is severely off, the process flow 200 proceeds back to block 208 and if the registration is not severely off, the process flow 200 proceeds back to block 210 for the user to manually fine-tune the registration again.
  • another method for needle insertion guidance, in which the user chooses the “entry point method,” is described in the workflow 600 shown in FIG. 13 and the process flow 300 of FIG. 16.
  • the user registers the digital organs 124 in the three-dimensional virtual space 114 with the fiducial 14.
  • the user actuates the retracking option 125 which allows for automatic retracking by the augmented reality application 102 or a manual adjustment option where the user conducts manual registration.
  • the user calibrates the three-dimensional virtual space 114 by placing the medical needle 16 in the needle guide structure 104 and verifies calibration.
  • FIGS. 24A and 24B show the screen 112 of the smartphone 10 during the registration process showing the fiducial 14 and anatomical model 12 in the background.
  • the user chooses the entry point 134 for the medical needle 16 as illustrated in the screenshots shown in FIGS. 24A and 24B in which the user defines the entry point 134.
  • the user actuates the “skin entry” option 126 to produce a blue dot 137 at the entry point 134.
  • the user plans the planned trajectory 108 and planned depth 106 of the medical needle 16 in the three-dimensional virtual space 114 as illustrated in the screenshots of FIGS.
  • the user aligns the digital target 132 (illustrated as the red dot) with the entry point 134 (illustrated as a blue dot 137) with the digital cross sign 133 to plan the planned depth 106 and planned trajectory 108 of the medical needle 16.
  • the user manually aligns the medical needle 16 along the planned trajectory 108, in which visual feedback of the alignment of the medical needle 16 with the planned trajectory 108 is illustrated as a large red circle that transitions to a small green circle as the alignment improves.
  • the user inserts the medical needle 16 into the anatomical model 12 or patient.
  • the user decouples the medical needle 16 from the needle guide structure 104.
  • FIG. 16 shows the process flow 300 for the “entry point method” of needle insertion guidance described above in relation to workflow 600 illustrated in FIG. 13.
  • 3D scans are obtained, for example CT scans, at block 302.
  • these 3D scans are segmented and at block 306 the digital organs 124 from the 3D scans are manually imported into the three-dimensional virtual space 114 generated by the augmented reality application 102.
  • the intra-operational phase is initiated by the user conducting the registration process described above in which the user performs registration between the digital organs 124 and the fiducial 14 that acts as a 3D reference marker.
  • digital organs 124 are manually imported in the three-dimensional virtual space 114 and the user spawns these digital organs 124 close to the anatomical model 12 using the fiducial 14.
  • the plan portion of process flow 300 is initiated at block 316, wherein the user defines an entry point 134 along the skin of the anatomical model 12 or patient by first placing the medical needle 16 at an initial clip position such that the medical needle 16 is inserted into the needle guide structure 104 and the tip 18 of the medical needle 16 is overlapped with the projected needle tip 119 (FIG. 19C) of the projected medical needle 118 displayed in the three-dimensional virtual space 114 on the screen 112 of the smartphone 10. Once the entry point 134 is defined, the process flow 300 proceeds to decision block 318 for the user to determine whether the registration is off or not.
  • the process flow 300 proceeds to decision block 320 for the user to determine whether the registration is severely off or not. If the registration is severely off, the process flow 300 proceeds back to block 308 for the user to again spawn the digital organs 124 close to the anatomical model 12 or patient using the fiducial 14. If the registration is not severely off, the process flow 300 proceeds back to block 310 for the user to again register the digital organs 124 to the patient or anatomical model 12.
  • once the planned depth 106 is defined at block 330, the process flow 300 proceeds to decision block 334 for the user to determine if the registration is off or not.
  • the process flow 300 proceeds to decision block 332 for the user to determine if the registration is severely off or not. If the registration is severely off, the process flow 300 proceeds back to block 308 for the user to again spawn the digital organs 124 close to the patient using the fiducial 14. If the registration is not severely off, the process flow 300 proceeds back to block 310 for the user to again register the digital organs 124 to the patient or anatomical model 12.
  • the process flow 300 proceeds to block 336 for the user to display the planned trajectory 108, the needle advance path, and the real-time discrepancy of the planned trajectory 108 and the needle path in the three-dimensional virtual space 114 as well as the initial needle insertion guidance.
  • the needle advance path represents the planned trajectory 108 when the medical needle 16 is fully inserted into the needle guide structure 104 as shown in FIG. 11.
  • the user aligns and inserts the medical needle 16 through the needle guide structure 104 and into the anatomical model 12 or patient based on the above needle insertion guidance.
  • the process flow 300 proceeds to decision block 338 for the user to determine whether the registration is off or not.
  • the process flow 300 proceeds to decision block 340 for the user to determine whether the registration is severely off or not. If the registration is severely off, the process flow proceeds back to block 308 for the user to again spawn the digital organs 124 close to the patient or anatomical model 12 using the fiducial 14. If the registration is not severely off, the process flow 300 proceeds back to block 310 for the user to again register the digital organs 124 to the patient or anatomical model 12.
  • the process flow 300 proceeds to block 342 for the user to display the needle insertion guidance for the medical needle 16 once the needle 16 is detached.
  • the process flow 300 displays the planned trajectory 108 for the user to further insert and adjust the medical needle 16, in which the user detaches the medical needle 16 from the needle guide structure 104 as shown in FIG. 5 and adjusts the planned trajectory 108 by comparing the real-time image 116 of the medical needle 16 and the planned trajectory 108 displayed from different viewing angles on the screen 112 of the smartphone 10 as shown in FIGS. 6A and 6B.
  • the process flow 300 proceeds to decision block 344 for the user to determine whether the registration is off or not.
  • the process flow 300 proceeds to decision block 346 for the user to determine whether the registration is severely off or not. If the registration is severely off, the process flow 300 proceeds back to block 308 for the user to again spawn the digital organs 124 close to the patient or anatomical model 12 using the fiducial 14. If the registration is not severely off, the process flow 300 proceeds back to block 310 to again register the digital organs 124 to the patient or anatomical model 12.
  • FIG. 17 shows the process flow 400 for a method of needle insertion guidance that combines aspects of both the entry point method and the quick needle placement method described above.
  • the 3D scans are obtained, for example CT scans, at block 402.
  • these 3D scans are segmented and at block 406 the digital organs from the 3D scans are manually imported into the three-dimensional virtual space 114 generated by the augmented reality application 102.
  • the intra-operational phase is initiated by conducting the registration process described above in which the user performs registration between the digital organs 124 and the fiducial 14.
  • the digital organs 124 are manually imported in the three-dimensional virtual space 114 and the user spawns these digital organs 124 close to the patient or anatomical model 12 using a fiducial 14.
  • the user fine-tunes the registration of the digital organs 124 with the fiducial 14 by using sliders 122 (FIG. 2B) displayed in the screen 112 of the smartphone 10 to move and/or translate the three-dimensional virtual space 114.
  • the user determines whether the registration between the fiducial 14 and the digital organs 124 is off. If the registration is off, then at decision block 414 the user determines whether the registration is severely off.
  • the process flow 400 returns to block 408 for the user to again spawn the digital organs 124 close to the patient using the fiducial 14. If the registration is not severely off, the process flow 400 returns to block 410 for the user to fine-tune the registration again manually.
  • the process flow 400 enters the planned portion of the intra-operational phase.
  • in the planned portion of process flow 400 initiated at block 416, the user defines and initiates the planned trajectory 108 by simultaneously defining the planned trajectory 108 and inserting the medical needle 16 into the patient or anatomical model 12.
  • the user places the medical needle 16 in the initial clip position in which the medical needle 16 is inserted into the needle guide structure 104 and the tip 18 of the medical needle 16 is then overlapped with the projected needle 118 displayed in the three-dimensional virtual space 114 on the screen 112.
  • the smartphone 10 is maneuvered by the user until the needle advance path (e.g., the planned trajectory 108 when the medical needle 16 is fully inserted into the needle guide structure 104) that is displayed on screen 112 passes through the center of the digital target 132 illustrated in the three-dimensional virtual space 114.
  • the process flow 400 proceeds to decision block 418 for the user to determine whether the registration is severely off or not.
  • the process flow 400 returns to block 408 for the user to again spawn the digital organs 124 close to the patient using the fiducial 14. If the registration is not severely off, the process flow 400 returns to block 410 for the user to fine-tune the registration again manually. At decision block 412, if the registration is not off, the process flow 400 proceeds to decision block 422 for the user to determine whether the planned trajectory 108 is safe or not.
  • the process flow 400 proceeds to block 438 for the user to initiate the planned trajectory 108 and evaluate the target depth using the graduation rings 123 defined along the length of the planned depth 106 shown in the three-dimensional virtual space 114 and then the user partially inserts the medical needle 16 into the patient or anatomical model 12 in which the estimated target depth may be, for example, 1 cm.
  • the process flow 400 proceeds to decision block 440 for the user to determine whether the registration is off or not. If the registration is off, the process flow proceeds to decision block 442 for the user to determine whether the registration is severely off or not. If the registration is severely off, the process flow 400 returns to block 408 for the user to again spawn the digital organs 124 close to the patient using the fiducial 14. If the registration is not severely off, the process flow 400 returns to block 410 for the user to fine-tune the registration again manually. At decision block 440, if the registration is not off, the process flow 400 proceeds to decision block 444 for the user to determine whether the planned depth 106 is defined or not.
  • the process flow 400 proceeds directly to block 452 for the user to adjust the planned trajectory 108 displayed in the three-dimensional virtual space 114 and to also adjust the planned depth 106 displayed in the three-dimensional virtual space 114 by comparing the overlaid real-time image 116 of the medical needle 16 with the planned trajectory 108 displayed on the screen 112.
  • the process flow 400 proceeds to decision block 454 for the user to determine if the registration is off. If the registration is off, the process flow 400 proceeds to decision block 456 for the user to determine whether the registration is severely off. If the registration is severely off, the process flow 400 returns to block 408 for the user to again spawn the digital organs close to the patient using the fiducial 14. If the registration is not severely off, the process flow 400 returns to block 410 for the user to fine-tune the registration again manually.
  • the process flow 400 proceeds to block 446 to define the planned depth 106.
  • the planned depth 106 is fine-tuned by detaching the medical needle 16 from the needle guide structure 104, moving the smartphone 10 away from the patient or anatomical model 12 at the center of the digital target 132 by using the slider 110 displayed on the screen 112.
  • the process flow 400 proceeds to decision block 448 for the user to determine whether the registration is off or not. If the registration is off, the process flow proceeds to decision block 450 for the user to determine whether the registration is severely off or not. If the registration is severely off, the process flow 400 returns to block 408 for the user to again spawn the digital organs 124 close to the patient or anatomical model 12 using the fiducial 14. If the registration is not severely off, the process flow 400 returns to block 410 for the user to fine-tune the registration again manually. At decision block 448, if the registration is not off, the process flow 400 proceeds to block 452 for the user to adjust the planned trajectory 108 as described above.
  • the user may optionally proceed to block 424 to verify the planned trajectory 108 in the three-dimensional virtual space 114 by moving the smartphone 10 around the planned trajectory 108. Once verified, the process flow 400 proceeds to decision block 426 for the user to determine whether the verified planned trajectory 108 is safe. If the verified planned trajectory 108 is not safe, the process flow 400 proceeds back to block 416 for the user to again define and initiate the planned trajectory 108. At decision block 422, if the user is sure that the planned trajectory 108 is safe, then the process flow 400 proceeds to block 428 for the user to fine-tune the planned depth 106 of the planned trajectory 108.
  • the process flow 400 proceeds to decision block 430 for the user to determine whether the registration is off or not. If the registration is off, the process flow proceeds to decision block 450 for the user to determine whether the registration is severely off or not. If the registration is severely off, the process flow 400 returns to block 408 for the user to again spawn the digital organs 124 close to the patient or anatomical model 12 using the fiducial 14. If the registration is not severely off, the process flow 400 returns to block 410 for the user to fine-tune the registration again manually. At decision block 430, if the registration is not off, then the process flow 400 proceeds to block 435 for the user to align the medical needle 16 with the planned trajectory 108.
  • the process flow 400 displays on the screen 112 the planned trajectory 108, the needle advance plan, and the real-time discrepancy between the planned trajectory 108 and the needle advance plan in the three- dimensional virtual space 114.
  • the initial needle insertion guidance is also displayed.
  • the user aligns the medical needle 16 which is inserted through the needle guide structure 104 using the initial needle insertion guidance.
  • the process flow 400 proceeds to decision block 434 for the user to determine whether the registration is off or not. If the registration is off, the process flow 400 proceeds to decision block 436 for the user to determine whether the registration is severely off or not. If the registration is severely off, the process flow 400 returns to block 408 for the user to again spawn the digital organs 124 close to the patient or anatomical model 12 using the fiducial 14. If the registration is not severely off, the process flow 400 returns to block 410 for the user to fine-tune the registration again manually. At decision block 434, if the registration is not off, then the process flow 400 proceeds to block 438 for the user to initiate the planned trajectory 108.
  • FIG. 30 is a schematic block diagram of an example device 10, such as smartphone 10, that may be used with one or more embodiments described herein, e.g., as a component of the augmented reality guidance system 100.
  • the example device 10 comprises one or more network interfaces 710 (e.g., wired, wireless, PLC, etc.), at least one processor 720, and a memory 740 interconnected by a system bus 750, as well as a power supply 760 (e.g., battery, plug-in, etc.).
  • Network interface(s) 710 include the mechanical, electrical, and signaling circuitry for communicating data over the communication links coupled to a communication network.
  • Network interfaces 710 are configured to transmit and/or receive data using a variety of different communication protocols. As illustrated, the box representing network interfaces 710 is shown for simplicity, and it is appreciated that such interfaces may represent different types of network connections such as wireless and wired (physical) connections.
  • Network interfaces 710 are shown separately from power supply 760; however, it is appreciated that the interfaces that support PLC protocols may communicate through power supply 760 and/or may be an integral component coupled to power supply 760.
  • Memory 740 includes a plurality of storage locations that are addressable by processor 720 and network interfaces 710 for storing software programs and data structures associated with the embodiments described herein.
  • the example device 10 may have limited memory or no memory (e.g., no memory for storage other than for programs/processes operating on the device and associated caches).
  • Processor 720 comprises hardware elements or logic adapted to execute the software programs (e.g., instructions) and manipulate data structures 745.
  • An operating system 742, portions of which are typically resident in memory 740 and executed by the processor, functionally organizes example device 10 by, inter alia, invoking operations in support of software processes and/or services executing on the device.
  • These software processes and/or services may include augmented reality application processes/services 790 described herein. Note that while augmented reality application processes/services 790 is illustrated in centralized memory 740, alternative embodiments provide for the process to be operated within the network interfaces 710, such as a component of a MAC layer, and/or as part of a distributed computing network environment.
  • the processes described herein may be embodied as modules or engines configured to operate in accordance with the techniques herein (e.g., according to the functionality of a similar process).
  • the terms module and engine may be used interchangeably.
  • the term module or engine refers to a model or an organization of interrelated software components/functions.
  • while the augmented reality application processes/services 790 is shown as a standalone process, those skilled in the art will appreciate that this process may be executed as a routine or module within other processes.
  • a smartphone 10 cover for iPhone 14 Pro with an integrated needle guide structure 104 (Verza needle guide, Civco) was designed and 3D-printed (UltiMaker S3, Ultimaker B.V.) for effecting the functionalities of the augmented reality guidance system 100.
  • This needle guide structure 104 design was chosen because of its ability to hold different sizes of medical needles 16, easily modify the needle angle, and quickly detach the medical needle 16 from the needle guide structure 104 after placement.
  • An augmented reality application 102 for percutaneous interventions was developed on Unity.
  • the fixed needle guide structure 104 enables the projected needle path to be virtually implemented into the augmented reality application 102.
  • the augmented reality application 102 used a smartphone camera 117, embedded gyroscope, and IMU sensors to register the pre-operative CT scans with the target body and track the location of the smartphone 10 relative to the patient or phantom in real time. This enabled the operator to plan the needle path at the bedside of the patient and provided needle alignment feedback from multiple viewing angles, using both visual and gyroscopic confirmation.
  • the abdominal phantom CT was segmented and modeled using an open-source software (3D Slicer, https://www.slicer.org/).
  • the CT was acquired on a Philips IQon Spectral CT (Best, The Netherlands) with a slice thickness of 0.8 mm and a volume of 512 x 512 x 467 voxels.
  • Registration of the 3D model on the phantom was made using a fiducial box 14 included in the preprocedural CT (FIGS. 32A and 32B), as previously described.
  • Prior to the experiment all operators had a training session that consisted of 1 to 4 needle placements using the augmented reality guidance system 100, until the operator felt confident.
  • the median distance from the needle shaft to the center of the target 132 was 4 mm [IQR 2-5 mm] using the smartphone 10 versus 13 mm [IQR 7-22 mm] for freehand (p < 0.001). Results are detailed in Table 1 and FIG. 29.
  • the user can hold both the medical needle 16 and smartphone 10 at the same time.
  • the fixed path acts as a connector from the real world into the augmented reality world, enabling new possibilities and workflows with the augmented reality application 102.
  • the needle guide structure 104 on the smartphone 10 allows planning the path/trajectory on the augmented reality application 102, at the patient’s bedside, via the smartphone’s IMU and gyroscope to track the position and orientation of the medical needle 16.
  • the novel smartphone 10 iteration provides the operator with more robust “sensor-based” real-time feedback. In this phantom study, planning the path on the augmented reality application 102 was feasible and rapid, with a median time of 91 seconds, while the insertion itself took 68 seconds.
  • the augmented reality application 102 required the upload and segmentation of the 3D model of the phantom from preprocedural CT images, which can be time-consuming without semi-automatic algorithms. Hence, deep learning-based segmentation software will help with this task in the near future. However, further refinements and research and development are still needed to fully investigate the potential impact of smartphone AR technology in clinical practice.
  • This augmented reality application 102 deployed on a smartphone 10 with an integrated needle guide structure 104 was found to be a novel low-resource guidance tool that allows path planning at the patient’s bedside and accurate needle placement (4 mm error) with real time angular feedback in less than 3 minutes, regardless of operator experience.
  • This technology does not require extensive training, may reduce the influence of experience on accuracy of percutaneous interventions, and provides a novel way to unite the mixed reality world with IR without goggles and without coordinating 2 hands.
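The accuracy metrics referenced in the description of FIG. 28 (distance from the needle tip to the target center, shortest distance from the needle shaft to the target center, and angular error) are straightforward geometric quantities. The Python/NumPy sketch below shows one way they could be computed from needle and target coordinates measured on the post-procedural CT; the function name, example coordinates, and units are illustrative assumptions, not details taken from the disclosure.

```python
# Geometric sketch of the three FIG. 28 accuracy metrics:
# (A) tip-to-target distance, (B) shortest shaft-to-target distance,
# (C) angular error between the inserted needle and the planned trajectory.
import numpy as np

def accuracy_metrics(needle_tip, needle_entry, target, planned_entry, planned_target):
    needle_tip, needle_entry = np.asarray(needle_tip, float), np.asarray(needle_entry, float)
    target = np.asarray(target, float)
    shaft = needle_tip - needle_entry
    shaft_dir = shaft / np.linalg.norm(shaft)
    # (A) distance from the needle tip to the target center
    tip_error = np.linalg.norm(target - needle_tip)
    # (B) shortest distance from the needle shaft (as a line) to the target center
    shaft_error = np.linalg.norm(np.cross(target - needle_entry, shaft_dir))
    # (C) angle between the actual needle axis and the planned trajectory
    planned_dir = np.asarray(planned_target, float) - np.asarray(planned_entry, float)
    planned_dir /= np.linalg.norm(planned_dir)
    angular_error = np.degrees(np.arccos(np.clip(np.dot(shaft_dir, planned_dir), -1.0, 1.0)))
    return tip_error, shaft_error, angular_error

# Illustrative values only (mm): a needle ending 2 mm lateral and 2 mm short of the target.
a, b, c = accuracy_metrics(needle_tip=[2.0, 0.0, 88.0], needle_entry=[0.0, 0.0, 0.0],
                           target=[0.0, 0.0, 90.0],
                           planned_entry=[0.0, 0.0, 0.0], planned_target=[0.0, 0.0, 90.0])
print(f"tip error {a:.1f} mm, shaft error {b:.1f} mm, angular error {c:.1f} deg")
```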

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Robotics (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

Various embodiments of an augmented reality guidance system for needle insertion guidance using an image analysis and visualization augmented reality application deployed on a smartphone are disclosed herein. The smartphone is coupled to a needle guide structure for holding and guiding a medical needle and is operable for executing the augmented reality application to track and display the position of the smartphone and the medical needle in three-dimensional space of the augmented reality application relative to a target along a planned trajectory.

Description

SYSTEMS AND METHODS FOR AN AUGMENTED REALITY GUIDANCE SYSTEM
FIELD
[0001] The present disclosure generally relates to an augmented reality guidance system, and in particular to an augmented reality guidance system for needle insertion guidance using an image analysis and visualization augmented reality application deployed on a smartphone that is coupled to a needle guide structure engaged to a medical needle to track and display the position of the smartphone and the medical needle in three-dimensional virtual space relative to the position of a digital target along a planned trajectory.
BACKGROUND
[0002] Safe and effective diagnostic and therapeutic outcomes in interventional radiology are dependent on accurate percutaneous needle placement. Needle placement accuracy is affected by preprocedural planning as well as intraoperative navigation, which is dependent on intraprocedural image guidance. The positioning of the needle largely depends on the physician’s visuospatial ability and hand-to-eye coordination, which are learned, variable, and a combination of innate and acquired skills. This variability contributes toward the lack of standardization of percutaneous needle interventions. A variety of needle guidance systems employing different technologies, such as electromagnetic tracking, laser, and robotics, have been developed to improve upon conventional or manual methods of needle placement. However, such tools have not been widely adopted due to various limitations including cost, availability, ergonomics, and additional procedural time or workflow complexity. Although speculative, augmented reality (AR) tools could help reduce variabilities in skill levels and techniques and alleviate some of the limitations of standard needle guidance technologies.
[0003] AR describes a technologically enhanced version of reality that superimposes digital information onto the real world through stationary, handheld, or head-mounted devices. Handheld AR devices, including the smartphone and smart tablet, enable increased mobility compared to stationary devices. The recent development of the head-mounted smartglass further increases mobility by anchoring digital 3D objects in physical space rather than overlaying them on a 2D screen. Compared to CT image guidance on a peripheral 2D monitor, AR can enable the interventional radiologist to have improved localization of the patient anatomy and a more intuitive understanding of the treatment plan within the direct line of sight of the patient at the bedside. While handheld and head-mounted applications of AR for percutaneous needle navigation guidance have been described, a direct comparison of the needle placement performance between both types of AR devices remains undefined in the current literature.
[0004] It is with these observations in mind, among others, that various aspects of the present disclosure were conceived and developed.
SUMMARY
[0005] A needle insertion guidance system includes an anatomical model, the anatomical model having an outer surface defining an internal space having representations of internal organs disposed therein; and a smartphone in visual communication with the anatomical model, the smartphone being coupled to a medical needle through a needle guide structure engaged to the smartphone, the smartphone having a camera for communicating real-time images to a display of the smartphone, the display being in operative communication with a processor, wherein the processor is in communication with a memory, the memory including instructions executable by the processor to: execute, at the smartphone, an augmented reality application for providing an augmented reality representation in a three-dimensional virtual space of the anatomical model having digital organs that correspond to the internal organs of the anatomical model, generate, at the three-dimensional virtual space, a planned trajectory and a planned depth for the medical needle, overlay, by the augmented reality application, the real-time images of the medical needle with the three-dimensional virtual space of the anatomical model, and display, in the three-dimensional virtual space, an orientation of the medical needle relative to the planned trajectory and planned depth within the anatomical model for providing visual feedback to a user.
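Paragraph [0005] above describes, in claim language, the core guidance computation: a planned trajectory and planned depth are generated in the virtual space, and the needle's orientation relative to them is displayed as visual feedback. The disclosure does not specify an implementation; the following Python/NumPy sketch illustrates one plausible form of that comparison, with all function names, inputs, and units chosen here purely for illustration.

```python
# Minimal sketch of the guidance comparison: given the planned trajectory
# (entry point, target) and the tracked needle pose, compute quantities an AR
# display could render as feedback. Names and values are illustrative only.
import numpy as np

def plan_trajectory(entry_point, target):
    """Return (unit direction, planned depth) from the entry point to the target."""
    entry_point, target = np.asarray(entry_point, float), np.asarray(target, float)
    path = target - entry_point
    depth = np.linalg.norm(path)
    return path / depth, depth

def needle_feedback(needle_tip, needle_dir, entry_point, target):
    """Angular error (deg) and tip-to-target distance for on-screen feedback."""
    planned_dir, planned_depth = plan_trajectory(entry_point, target)
    needle_dir = np.asarray(needle_dir, float)
    needle_dir = needle_dir / np.linalg.norm(needle_dir)
    cos_angle = np.clip(np.dot(needle_dir, planned_dir), -1.0, 1.0)
    angular_error_deg = np.degrees(np.arccos(cos_angle))
    tip_to_target = np.linalg.norm(np.asarray(target, float) - np.asarray(needle_tip, float))
    return angular_error_deg, tip_to_target, planned_depth

# Example: needle tilted a few degrees off a 90 mm planned path.
err_deg, dist_mm, depth_mm = needle_feedback(
    needle_tip=[0.0, 0.0, 0.0], needle_dir=[0.05, 0.0, 1.0],
    entry_point=[0.0, 0.0, 0.0], target=[0.0, 0.0, 90.0])
print(f"angular error {err_deg:.1f} deg, tip-to-target {dist_mm:.1f} mm, planned depth {depth_mm:.1f} mm")
```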
[0006] In one aspect, the memory further includes instructions executable by the processor to: calibrate the three-dimensional virtual space representing the anatomical model and digital organs with the medical needle coupled to the needle guide structure engaged to the smartphone, spawn a virtual needle in the three-dimensional virtual space, and manipulate the virtual needle in the three-dimensional virtual space with respect to the digital organs displayed on the smartphone, wherein the planned trajectory of the medical needle is defined in the three-dimensional virtual space by superposing the medical needle with the virtual needle.
[0007] In a further aspect, the augmented reality application is operable for providing visual feedback of the orientation and position of the medical needle relative to the planned trajectory in the three-dimensional virtual space displayed on the smartphone. The augmented reality application is also operable for displaying the planned depth of the medical needle along the planned trajectory to a virtual target that is overlaid on real-time images of the medical needle and an entry point defined along the anatomical model, wherein the augmented reality application is operable for displaying the planned depth defined by a plurality of graduations denoted along the planned trajectory.
[0009] In one embodiment, the needle guide structure is positioned along the smartphone such that the medical needle is visible to a camera of the smartphone, ensuring visibility of the medical needle on the display of the smartphone, wherein the needle guide structure fixes the medical needle in position, when coupled to the needle guide structure, relative to the planned trajectory defined in the three-dimensional virtual space displayed on the smartphone.
[0009] In addition, the smartphone includes one or more sensors for providing real-time tracking of the position and orientation of the smartphone including the position and orientation of the medical needle coupled to the smartphone relative to the anatomical model in the three-dimensional virtual space displayed on the smartphone. The augmented reality application is also operable for planning the planned trajectory of the medical needle between an entry point defined along the anatomical model and a virtual target within the anatomical model in the three-dimensional virtual space displayed on the smartphone.
[0010] The augmented reality application is operable for planning the planned trajectory of the medical needle by aligning a projected needle with a digital target in the three-dimensional virtual space displayed on the smartphone.
[0011] In another aspect, the needle guidance system includes a fiducial in visual communication with the smartphone and the anatomical model for calibrating the three-dimensional virtual space with the medical needle, wherein the augmented reality application is operable to provide registration of the actual path of the medical needle with the planned trajectory of the medical needle in the three-dimensional virtual space displayed on the smartphone.
[0012] The augmented reality application is also operable for generating a virtual target to establish the planned trajectory and the planned depth by the medical needle.
[0013] In yet another aspect, the display of the digital organs within the anatomical model by the augmented reality application is generated by inputting three-dimensional scans of the anatomical model to the augmented reality application and segmenting the inputted three-dimensional scans to generate the three-dimensional virtual space displayed on the smartphone.
[0014] In one aspect, a method for needle insertion guidance includes registering an augmented reality application operable on a smartphone relative to a target location within an anatomical model using a fiducial positioned outside the anatomical model by moving and rotating a three-dimensional virtual space through the augmented reality application and displayed on the smartphone, planning a planned trajectory of a medical needle coupled to the smartphone through a needle guide structure in the three-dimensional virtual space from an entry point defined along the anatomical model to the target location within the anatomical model in the three-dimensional virtual space, planning a planned depth of the medical needle in the three-dimensional virtual space from the entry point to the target location in the three-dimensional virtual space, and adjusting the planned depth of the medical needle along the planned trajectory relative to the target location in the three-dimensional virtual space, wherein planning the planned trajectory of the medical needle includes aligning a projected needle of the medical needle with a virtual needle generated by the augmented reality application in the three-dimensional virtual space and then aligning the virtual needle with a virtual target defined at the target location in the three-dimensional virtual space, and wherein planning the planned depth for insertion of the medical needle comprises aligning the digital target and the entry point with a digital cross sign of the virtual needle.
[0015] In a further aspect, the augmented reality application is operable for changing the color of the planned trajectory from one color to another color when aligning the medical needle with the planned trajectory.
[0016] In yet another aspect, the medical needle may be decoupled from the needle guide structure once the planned trajectory and planned depth are established within the three-dimensional virtual space, and wherein the planned depth is adjusted by aligning a tip of the virtual needle to the virtual target after the medical needle is detached from the needle guide structure.
BRIEF DESCRIPTION OF THE DRAWINGS
[0017] FIG. 1 is a simplified block diagram of a needle insertion guidance system having an augmented reality application operable on a smartphone that generates a planned trajectory and planned depth for a medical needle handled by a user for insertion into an anatomical model or patient.
[0018] FIG. 2A is an image of the needle insertion guidance system with the display of the smartphone showing the process for registering the system with the anatomical model using the fiducial; and FIG. 2B is a screenshot of the augmented reality application showing the MOVE and ROTATE sliders used to register the system with the anatomical model.
[0019] FIG. 3A is an image of the needle insertion guidance system with the display of the smartphone showing the user planning the planned trajectory of the medical needle in three-dimensional virtual space with the projected needle; FIG. 3B is an image of the needle insertion guidance system showing the user moving the smartphone around when planning the planned trajectory using a projected medical needle displayed in 3D space on the smartphone by the augmented reality application; and FIG. 3C is a screenshot of the augmented reality application shown on the smartphone displaying the menu and the planned trajectory of a projected medical needle being planned as the user rotates the smartphone to adjust the planned trajectory to the target.
[0020] FIG. 4 is an image of the needle insertion guidance system showing the procedure for inserting the medical needle along the needle guide structure coupled to the smartphone for insertion into the anatomical model.
[0021] FIG. 5 is an image of the needle insertion guidance system showing the medical needle being detached from the needle guide structure.
[0022] FIGS. 6A and 6B are images of the user verifying the planned trajectory displayed on the smartphone by the augmented reality application from different angles relative to the anatomical model once the medical needle is in contact at the entry point. [0023] FIG. 7A is a screenshot of the augmented reality application showing the menu for adjusting depth once the medical needle has been inserted through the entry point and into the anatomical model along the planned trajectory; and FIG. 7B is an image of the user adjusting the depth using the augmented reality application.
[0024] FIG. 8 is an image of a user employing an “entry point method” by first touching the entry point with the tip of the medical needle.
[0025] FIG. 9 is an image of the user aligning the virtual entry point with the target to define the planned trajectory in the three-dimensional space using the “entry point method”.
[0026] FIG. 10 is an image of the user defining the depth of the pathway to the target in the three-dimensional space using the “entry point method”.
[0027] FIG. 11 is an image of the user placing the tip of the medical needle at the entry point and then aligning the medical needle with the planned trajectory displayed by the augmented reality application on the smartphone using the entry point method.
[0028] FIG. 12 is a flow chart illustrating the workflow for the quick needle placement method.
[0029] FIG. 13 is a flow chart illustrating the workflow for the entry point method.
[0030] FIG. 14 is a screenshot displayed on the smartphone by the augmented reality application used for automatic retracking and manual adjustment functionalities.
[0031] FIG. 15 is a flow chart illustrating a quick needle placement method for a needle trajectory plan and insertion shown in FIG. 12.
[0032] FIG. 16 is a flow chart illustrating the entry point method for a separate needle trajectory plan and guidance of FIG. 13.
[0033] FIG. 17 is a flow chart illustrating a needle placement method that combines the needle placement methods illustrated in FIGS. 12 and 13.
[0034] FIG. 18A is a screenshot of the augmented reality application in which the user registers the digital organs in the three-dimensional virtual space with the fiducial when employing the quick needle placement method; FIG. 18B is an image showing the anatomical model and fiducial before the registration process; and FIG. 18C is a screenshot of the augmented reality application of the anatomical model and fiducial as seen through the screen of the smartphone after the registration process has occurred.
[0035] FIG. 19A is a screenshot of the augmented reality application in which the user places the medical needle in the needle guide structure and then verifies calibration when employing the quick needle placement method; FIG. 19B is a screenshot of the augmented reality application showing the projected needle being aligned with the virtual needle when employing the quick needle placement method; and FIG. 19C is a screenshot of the augmented reality application showing the projected needle extending from the virtual needle when verifying calibration.
[0036] FIGS. 20A-20C are screenshots of the augmented reality application in which the user plans the planned trajectory of the medical needle and then approaches the medical needle to the entry point before locking in the planned trajectory of the medical needle when employing the quick needle placement method.
[0037] FIG. 21 is a screenshot of the augmented reality application in which the user inserts the medical needle when employing the quick needle placement method.
[0038] FIG. 22 is an image of the needle insertion guidance system showing the user unlocking the medical needle from the needle guide structure when employing the quick needle placement method.
[0039] FIG. 23 is a screenshot of the augmented reality application in which the user adjusts the depth of the medical needle when employing the quick needle placement method.
[0040] FIG. 24A is a screenshot of the augmented reality application in which the user registers the digital organs in the three-dimensional virtual space with the fiducial when employing the entry point method; and FIG. 24B is a screenshot of the augmented reality application in which the user selects the entry point when employing the entry point method.
[0041] FIGS. 25A and 25B are screenshots of the augmented reality application in which the user plans the planned trajectory of the medical needle when employing the entry point method.
[0042] FIGS. 26A and 26B are screenshots of the augmented reality application in which the user aligns the medical needle with the planned trajectory and then inserts the medical needle when employing the entry point method. [0043] FIG. 27 is a screenshot of the augmented reality application in which the user unlocks the needle from the guide and then adjusts the depth of the medical needle when employing the entry point method.
[0044] FIG. 28 is an illustration showing the accuracy metrics on a post-procedural computed tomography that are evaluated by measuring the distance from the needle tip to the center of the target (A), the shortest distance from the needle shaft to the center of the target (B), and the angular error (C).
[0045] FIG. 29 is a graphical representation of the standardization achieved with use of the augmented reality guidance system on the smartphone, shown with the comparative accuracy of needle placement using freehand guidance versus the smartphone with the augmented reality guidance system.
[0046] FIG. 30 is an illustration showing an exemplary computing system for effectuating the functionalities of the augmented reality application.
[0047] FIGS. 31A and 31B are lateral and oblique images, respectively, of the smartphone with the augmented reality guidance system having a needle guidance structure coupled to a medical needle.
[0048] FIG. 32A is an image showing the registration of the anatomical model with the fiducial reference box; and FIG. 32B is a snapshot of the screen of the smartphone showing the three-dimensional virtual space deployed by the augmented reality application.
[0049] FIG. 33A is an image that shows the location of the entry point on the anatomical model in the 3D virtual space being recorded by touching the entry point with the tip of the medical needle; FIG. 33B is a snapshot of the screen showing the view generated by the augmented reality application for planning the planned trajectory of the medical needle defined by aligning the entry point (blue dot) with the virtual target (red sphere), at the center of the screen (green cross); FIG. 33C is a snapshot of the screen showing the view generated by the augmented reality application for illustrating the adjustment of the depth of the planned needle path by sliding it (white arrow) until it reaches the virtual target.
[0050] FIGS. 34A-34C are images of the 3D virtual space shown on the screen of the smartphone that illustrate the real-time feedback of the needle angle (blue needle on the screen) compared to the planned needle path (green needle on the screen). [0051] Corresponding reference characters indicate corresponding elements among the views of the drawings. The headings used in the figures do not limit the scope of the claims.
DETAILED DESCRIPTION
[0052] Various embodiments of an augmented reality guidance system for needle insertion guidance during percutaneous intervention using an image analysis and visualization augmented reality application deployed on a smartphone are disclosed herein. In one aspect, the smartphone is coupled to a needle guide structure for holding and guiding a medical needle, with the smartphone being operable for executing the augmented reality application to track and display the position of the smartphone and the medical needle in three-dimensional space relative to a digital target along a planned trajectory and depth of the medical needle displayed on the smartphone. In another aspect, a quick needle placement method may be employed that generates a needle trajectory plan and insertion in a simplified procedure in which the planned trajectory and the interaction of the medical needle with digital organs are illustrated in a three-dimensional virtual space displayed on a screen of the smartphone by the augmented reality application to show the user the planned trajectory and depth of the medical needle to the target. In another aspect, an entry point method may also be employed in which the user first plans the planned trajectory and planned depth for the medical needle at the patient’s bedside using the augmented reality guidance system. In this method, the operator follows the planned trajectory and depth generated by the augmented reality application and inserts the medical needle within an anatomical model or patient. The planned trajectory and planned depth are displayed on the smartphone for visual reference, which allows for real-time feedback as the user aligns a projected needle with the planned trajectory during insertion to a planned depth within the patient or anatomical model. In a further aspect, both methods employ a registration process that allows the user to use a fiducial as a 3D reference marker to register the three-dimensional virtual space of the augmented reality application of the smartphone relative to the patient or anatomical model. Further details of the augmented reality guidance system are discussed in greater detail below.
[0053] Referring to FIG. 1, an embodiment of the augmented reality guidance system, designated 100, that is operable for needle insertion guidance during a percutaneous intervention is illustrated. The augmented reality guidance system 100 includes an augmented reality application 102 operable on a smartphone 10 for generating a planned path or trajectory 108 and planned depth 106 when inserting a medical needle 16 in relation to digital organs 124 (FIG. 3B), such as the internal organs and other anatomical features of a patient or anatomical model 12 displayed in a three-dimensional virtual space 114 on the smartphone 10. The three-dimensional virtual space 114 is generated by the augmented reality application 102 and is overlaid on real-time images 116 of the medical needle 16 received from a camera 117 of the smartphone 10 and displayed on a screen 112 of the smartphone 10. In one aspect, the smartphone 10 includes one or more sensors, for example a gyroscope and an inertial measurement unit, for providing the augmented reality application 102 with positional information about the position and orientation of the smartphone 10 and medical needle 16. A slider 110 is displayed on the screen 112 by the augmented reality application 102 and is operable for transitioning the screen 112 of the smartphone 10 from displaying only the three-dimensional virtual space 114, to displaying some degree of transparency between the three-dimensional virtual space 114 overlaid with the real-time images 116 provided by the camera 117 of the smartphone 10, to displaying only the real-time images 116 from the camera of the smartphone 10 as the slider 110 is moved laterally from right to left along the screen 112 or vice versa. In some embodiments, the anatomical model 12 may be an artificial anatomical model 12 that is constructed to imitate the human anatomy for training. Although the augmented reality guidance system 100 is described herein for use with the anatomical model 12, the augmented reality guidance system 100 may be used on a patient when performing a medical procedure. As shown, a fiducial 14 is provided and positioned alongside the anatomical model 12 (or patient) that acts as a 3D reference marker for registering the augmented reality application 102 of the smartphone 10 with the anatomical model 12 or patient before initiating the medical procedure, as shall be discussed in greater detail below.
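As a non-limiting illustration of the transparency behavior of the slider 110 described above, the following sketch shows one way a displayed frame could be produced by linearly blending a rendered view of the three-dimensional virtual space 114 with the real-time image 116 from the camera 117. The function name, frame representation, and slider range are assumptions made for this example and are not taken from the disclosed application.

```python
import numpy as np

def blend_overlay(virtual_frame: np.ndarray,
                  camera_frame: np.ndarray,
                  slider_value: float) -> np.ndarray:
    """Blend a rendered view of the virtual space with the camera image.

    slider_value = 0.0 -> only the three-dimensional virtual space is shown
    slider_value = 1.0 -> only the real-time camera image is shown
    Intermediate values show the virtual space with partial transparency.
    """
    alpha = float(np.clip(slider_value, 0.0, 1.0))
    # Per-pixel linear blend of two same-sized RGB frames.
    blended = (1.0 - alpha) * virtual_frame.astype(np.float32) \
              + alpha * camera_frame.astype(np.float32)
    return blended.astype(np.uint8)
```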
[0054] As shown in FIGS. 15-17, the process flows 200, 300, and 400 of the augmented reality guidance system 100 illustrate the various respective methods employed by the user to perform needle insertion guidance and are each divided into a pre-operational phase and an intra-operational phase. In the pre-operational phase, 3D scans of the patient are obtained and then segmented before being communicated to the augmented reality application 102, which generates the digital organs 124 (e.g., skin, organs, lesions, etc.) of the anatomy of an anatomical model 12 (or patient) in constructing the three-dimensional virtual space 114 of the augmented reality application 102. In the intra-operational phase, the registration, planning, and guidance steps are undertaken by the user to provide needle insertion guidance based on the various process flows 200, 300 and 400 described below.
[0055] One method for needle insertion guidance, referred to as a quick needle placement method, is described in the workflow 500 shown in FIG. 12 and the process flow 200 of FIG. 15. Referring specifically to workflow 500 of FIG. 12, at block 502, the user registers the digital organs 124 in the three-dimensional virtual space 114 with the fiducial 14 as shown in the screenshot of FIG. 18A. FIG. 18B shows an image of the anatomical model 12 and fiducial 14 before the registration process, while FIG. 18C is a screenshot of the augmented reality application 102 once the registration process has been completed after the retracking option 125 and lock option 150 have been actuated. At block 504, the user calibrates the three-dimensional virtual space 114 by placing the medical needle 16 in the needle guide structure 104 and then verifies calibration as shown in FIG. 19A, in which the user actuates the spawn option 121 to activate and manipulate the virtual needle 115 with respect to the digital organs 124 in the three-dimensional virtual space 114 displayed on screen 112. As shown in FIG. 19C, the user actuates the define path option 129 and aligns the projected maximum needle path by superposing the medical needle 16 imaged by the camera 117 with the virtual needle 115. FIG. 2A shows the screen 112 of the smartphone 10 during the registration process showing the fiducial 14 and anatomical model 12 in the background. At block 506, the user plans the planned trajectory 108 of the medical needle 16 shown in FIGS. 20A and 20B by aligning the projected needle 118 with the digital target 132. At block 508, the user approaches the medical needle 16 to the entry point 134 along the skin of the patient or surface of the anatomical model 12 as the approximation of target depth is illustrated through graduations 123 along the planned trajectory 108, and then the user locks in the planned trajectory 108 for the medical needle 16 as shown in FIG. 20C. At block 510, as shown in FIG. 21, the user inserts the medical needle 16 into the anatomical model 12 or patient. At block 512, the user decouples the medical needle 16 from the needle guide structure 104 as illustrated in FIG. 22. At block 514, the user adjusts the planned depth 106 of the medical needle 16 within the anatomical model 12 or patient by actuating the depth slider 113 (FIG. 23B) displayed on screen 112 as shown in FIGS. 7A, 7B, and 23.
[0056] FIG. 15 shows the process flow 200 for the quick needle placement method described above in relation to workflow 500 of FIG. 12. In the pre-operational phase for constructing the three-dimensional virtual space 114, the 3D scans of an anatomy are obtained, for example computed tomography (CT) scans, at block 202. At block 204, these 3D scans are segmented, and at block 206 the digital organs 124 (FIG. 2B) from the 3D scans are manually imported into the three-dimensional virtual space 114 generated by the augmented reality application 102. Once the pre-operational phase is completed, the intra-operational phase is initiated by conducting the registration process described above in which the user performs registration between the digital organs 124 in the three-dimensional virtual space 114 and the overlaid real-time image 116 of the fiducial 14 positioned proximate to the anatomical model 12 displayed on the screen 112. At block 208, the digital organs 124 are manually imported in the three-dimensional virtual space 114 and the user spawns these digital organs 124 close to the patient using the fiducial 14. At block 210, the user fine-tunes the registration of the digital organs 124 with the fiducial 14 by using sliders 122 displayed on the screen 112 of the smartphone 10 to move and/or translate the three-dimensional virtual space 114 on the screen 112 as shown in FIG. 2B. At decision block 212, the user determines whether the registration between the fiducial 14 and the digital organs 124 is off. If the registration is off, then at decision block 214 the user determines whether the registration is severely off. If the registration is severely off, the process flow 200 returns to block 208 for the user to again spawn the digital organs 124 close to the patient using the fiducial 14. If the registration is not severely off, the process flow 200 proceeds back to block 210 for the user to manually fine-tune the registration again. At decision block 212, if the registration is not off, then the process flow 200 enters the plan and guidance portion of the intra-operational phase described below.
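The fine-tuning at block 210 can be thought of as applying small rigid-body offsets to the registered pose of the digital organs 124. The following sketch assumes the pose is represented as a 4x4 homogeneous matrix and illustrates how MOVE and ROTATE slider values might be applied; the axis convention and parameter names are illustrative only and do not describe the application's actual implementation.

```python
import numpy as np

def fine_tune_registration(pose: np.ndarray,
                           move_mm: np.ndarray,
                           rotate_deg: float) -> np.ndarray:
    """Nudge a 4x4 pose of the virtual space by slider offsets (illustrative)."""
    theta = np.radians(rotate_deg)
    # ROTATE slider: rotation about the vertical (z) axis of the fiducial frame.
    rot_z = np.array([[np.cos(theta), -np.sin(theta), 0.0, 0.0],
                      [np.sin(theta),  np.cos(theta), 0.0, 0.0],
                      [0.0,            0.0,           1.0, 0.0],
                      [0.0,            0.0,           0.0, 1.0]])
    # MOVE sliders: translation offsets along x, y, z in millimetres.
    trans = np.eye(4)
    trans[:3, 3] = move_mm
    return trans @ rot_z @ pose
```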
[0057] At block 216 of the plan and guidance portion, the user defines and initiates the planned trajectory 108 of the medical needle 16 by actuating the define path option 129 on screen 112 and then defining the planned trajectory 108 and maneuvering the medical needle 16 simultaneously as shown in FIG. 3A. In particular, the user places the medical needle 16 at an initial clip position within the needle guide structure 104 coupled to the smartphone 10 and then maneuvers the smartphone 10 as shown in FIG. 3A until the needle advance path displayed on the screen 112 passes through the digital target 132 in the three-dimensional virtual space 114 as shown in FIG. 3B. The user may insert the medical needle 16 1-2 cm into the patient as shown in FIG. 4 to define and display the planned trajectory 108. At decision block 218, if the registration is not off, the process flow 200 proceeds to block 222 for the user to define a planned depth 106 from the entry point 134, such that the planned depth 106 is fine-tuned by the user detaching the medical needle 16 from the needle guide structure 104 (FIG. 5), moving the smartphone 10 away from the patient or anatomical model 12, and adjusting the planned depth 106 to align the tip 18 of the medical needle 16 at the center of the digital target 132 by using the displayed slider 110. At decision block 218, if the registration is off, the process flow 200 proceeds to decision block 220 to determine whether the registration is severely off. If the registration is severely off, then the process flow 200 returns to block 208 for the user to again spawn the digital organs 124 close to the patient using the fiducial 14. If the registration is not severely off, then the process flow 200 returns to block 210 for the user to register the digital organs 124 again.
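One plausible way to formalize the check at block 216 that the needle advance path "passes through" the digital target 132 is to compare the perpendicular distance from the target center to the projected needle line against the target radius. The sketch below is illustrative; the names and tolerance are assumptions rather than the application's actual logic.

```python
import numpy as np

def path_hits_target(needle_origin: np.ndarray,
                     needle_direction: np.ndarray,
                     target_center: np.ndarray,
                     target_radius_mm: float) -> bool:
    """Return True if the projected needle line passes within the target radius."""
    d = needle_direction / np.linalg.norm(needle_direction)
    to_target = target_center - needle_origin
    # Point on the needle line closest to the target centre.
    closest_point = needle_origin + np.dot(to_target, d) * d
    miss_distance_mm = np.linalg.norm(target_center - closest_point)
    return bool(miss_distance_mm <= target_radius_mm)
```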
[0058] Once the user completes defining the planned depth 106 at block 222, the process flow 200 proceeds to decision block 224 for the user to determine whether the registration is off. If the registration is not off, then the process flow 200 proceeds to block 228 for the user to adjust the depth of the medical needle 16. Specifically, the augmented reality application 102 displays the planned trajectory 108 to the user for further physical adjustment of the medical needle 16 in which the user further adjusts the depth of the medical needle 16 by comparing the actual depth of the medical needle 16 with the planned trajectory 108 displayed on screen 112 as shown in FIGS. 7A and 7B. At decision block 224, if the registration is off, the process flow 200 proceeds to decision block 226 for the user to determine whether the registration is severely off. If the registration is severely off, the process flow 200 proceeds back to block 208 for the user to again spawn the digital organs 124 close to the patient using the fiducial 14 and if the registration is not severely off, the process flow 200 proceeds back to block 210 for the user to manually fine-tune the registration again.
[0059] Once the user completes the adjustment of the planned trajectory 108 at block 228, the process flow 200 proceeds to decision block 230 for the user to determine whether the registration is off. If the registration is off, the process flow 200 proceeds to decision block 232 to determine whether the registration is severely off. If the registration is severely off, the process flow 200 proceeds back to block 208 and if the registration is not severely off, the process flow 200 proceeds back to block 210 for the user to manually fine-tune the registration again.
[0060] Another method for needle insertion guidance, in which the user chooses the “entry point method,” is described in the workflow 600 shown in FIG. 13 and the process flow 300 of FIG. 16. Referring specifically to FIG. 13, at block 602, the user registers the digital organs 124 in the three-dimensional virtual space 114 with the fiducial 14. As shown in FIG. 14, the user actuates the retracking option 125, which allows for automatic retracking by the augmented reality application 102, or a manual adjustment option where the user conducts manual registration. At block 604, the user calibrates the three-dimensional virtual space 114 by placing the medical needle 16 in the needle guide structure 104 and verifies calibration. FIG. 2A shows the screen 112 of the smartphone 10 during the registration process showing the fiducial 14 and anatomical model 12 in the background. At block 606, the user chooses the entry point 134 for the medical needle 16 as illustrated in the screenshots shown in FIGS. 24A and 24B in which the user defines the entry point 134. Once the medical needle 16 is placed at the entry point 134, the user actuates the “skin entry” option 126 to produce a blue dot 137 at the entry point 134. At block 608, the user plans the planned trajectory 108 and planned depth 106 of the medical needle 16 in the three-dimensional virtual space 114 as illustrated in the screenshots of FIGS. 25A and 25B, in which the user aligns the digital target 132 (illustrated as the red dot) and the entry point 134 (illustrated as a blue dot 137) with the digital cross sign 133 to plan the planned depth 106 and planned trajectory 108 of the medical needle 16. Referring to FIGS. 26A and 26B, at block 610 the user manually aligns the medical needle 16 along the planned trajectory 108, in which visual feedback of the alignment of the medical needle 16 compared with the planned trajectory 108 is illustrated as a big red circle that transitions to a small green circle. At block 612, the user inserts the medical needle 16 into the anatomical model 12 or patient. At block 614, the user decouples the medical needle 16 from the needle guide structure 104. At block 616, the user then adjusts the depth of the medical needle 16 within the anatomical model 12 or patient as shown in FIG. 27. [0061] FIG. 16 shows the process flow 300 for the “entry point method” of needle insertion guidance described above in relation to workflow 600 illustrated in FIG. 13. In the pre-operational phase for constructing the three-dimensional virtual space 114, 3D scans are obtained, for example CT scans, at block 302. At block 304, these 3D scans are segmented, and at block 306 the digital organs 124 from the 3D scans are manually imported into the three-dimensional virtual space 114 generated by the augmented reality application 102. Once the pre-operational phase is completed, the intra-operational phase is initiated by the user conducting the registration process described above in which the user performs registration between the digital organs 124 and the fiducial 14 that acts as a 3D reference marker. At block 308, digital organs 124 are manually imported in the three-dimensional virtual space 114 and the user spawns these digital organs 124 close to the anatomical model 12 using the fiducial 14. At block 310, the user fine-tunes the registration of the digital organs 124 with the fiducial 14 by using sliders 122 (FIG.
2B) displayed on the screen 112 of the smartphone 10 to move and/or translate the three-dimensional virtual space 114. At decision block 312, the user determines whether the registration between the fiducial 14 and the digital organs 124 is off. If the registration is off, then at decision block 314 the user determines whether the registration is severely off. If the registration is severely off, the process flow 300 proceeds back to block 308 for the user to again spawn the digital organs 124 close to the patient using the fiducial 14. If the registration is not severely off, the process flow 300 returns to block 310 for the user to manually fine-tune the registration again. At decision block 312, if the registration is not off, then the process flow 300 proceeds to decision block 315 to determine whether a needle plan exists. If the needle plan does not exist, then the process flow 300 proceeds to the plan portion of the intra-operational phase at block 316. If the needle plan does exist, then the process flow 300 proceeds to the guidance portion of the intra-operational phase at block 336 as described in greater detail below.
[0062] Referring to FIGS. 8 and 16, the plan portion of process flow 300 is initiated at block 316, wherein the user defines an entry point 134 along the skin of the anatomical model 12 or patient by first placing the medical needle 16 at an initial clip position such that the medical needle 16 is inserted into the needle guide structure 104 and the tip 18 of the medical needle 16 is overlapped with the projected needle tip 119 (FIG. 19C) of the projected medical needle 118 displayed in the three-dimensional virtual space 114 on the screen 112 of the smartphone 10. Once the entry point 134 is defined, the process flow 300 proceeds to decision block 318 for the user to determine whether the registration is off or not. If the registration is not off, then the process flow 300 proceeds to block 322 to define a planned trajectory 108, in which the user removes the medical needle 16 from the needle guide structure 104 (FIG. 5) and defines the planned trajectory 108 of the medical needle 16 by maneuvering the smartphone 10 to cause a digital cross sign 133, entry point 134, and digital target 132 displayed in the three-dimensional virtual space 114 to be in an overlapping arrangement on screen 112 as shown in FIG. 9. At decision block 324, if the registration is not off, the process flow 300 proceeds to decision block 328 where the user determines whether the planned trajectory 108 is safe or not. If the planned trajectory 108 is determined to be safe, then the process flow 300 proceeds to block 330 for the user to define a planned depth 106 of the medical needle 16 (FIG. 10) during insertion, in which the planned depth 106 of the digital needle tip 119 is fine-tuned. In particular, the user may adjust the planned depth 106 by aligning the digital needle tip 119 at the center of the digital target 132 by using the slider 110 displayed on the screen 112. At decision block 328, if the planned trajectory is determined not to be safe, then the process flow 300 returns to block 316 for the user to again define the entry point 134.
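Once the entry point 134 and digital target 132 are brought into an overlapping arrangement, the planned trajectory 108 and planned depth 106 are, geometrically, the direction and distance from the entry point to the target center. A minimal sketch of that geometry, with assumed variable names, is given below.

```python
import numpy as np

def plan_from_entry_point(entry_point: np.ndarray, target_center: np.ndarray):
    """Planned direction (unit vector) and planned depth (mm) from entry to target."""
    vector = target_center - entry_point
    planned_depth_mm = float(np.linalg.norm(vector))
    planned_direction = vector / planned_depth_mm
    return planned_direction, planned_depth_mm
```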
[0063] At decision block 324, if the user determines that the registration is off, then the process flow 300 proceeds to decision block 326 for the user to determine whether the registration is severely off or not. If the registration is severely off, then the process flow 300 proceeds back to block 308 for the user to again spawn the digital organs 124 close to the anatomical model 12 or patient using the fiducial 14. If not, the process flow 300 returns to block 310 for the user to again register the digital organ 124 to the patient or anatomical model 12.
[0064] At decision block 318, if the registration is off, the process flow 300 proceeds to decision block 320 for the user to determine whether the registration is severely off or not. If the registration is severely off the process flow 300 proceeds back to block 308 for the user to again spawn the digital organs 124 close to the anatomical model 12 or patient using the fiducial 14. If the registration is not severely off, the process flow 300 proceeds back to block 310 for the user to again register the digital organs 124 to the patient or anatomical model 12. [0065] Once the planned depth 106 is defined at block 330, the process flow 300 proceeds to decision block 334 for the user to determine if the registration is off or not. If the registration is off, then the process flow 300 proceeds to decision block 332 for the user to determine if the registration is severely off or not. If the registration is severely off, the process flow 300 proceeds back to block 308 for the user to again spawn the digital organs 124 close to the patient using the fiducial 14. If the registration is not severely off, the process flow 300 proceeds back to block 310 for the user to again register the digital organs 124 to the patient or anatomical model 12.
[0066] At decision block 334, if the registration is not off, the process flow 300 proceeds to block 336 for the user to display the planned trajectory 108, the needle advance path, and the real-time discrepancy between the planned trajectory 108 and the needle path in the three-dimensional virtual space 114, as well as the initial needle insertion guidance. The needle advance path represents the planned trajectory 108 when the medical needle 16 is fully inserted into the needle guide structure 104 as shown in FIG. 11. In particular, the user aligns and inserts the medical needle 16 through the needle guide structure 104 and into the anatomical model 12 or patient based on the above needle insertion guidance. Once accomplished, the process flow 300 proceeds to decision block 338 for the user to determine whether the registration is off or not. If the registration is off, the process flow 300 proceeds to decision block 340 for the user to determine whether the registration is severely off or not. If the registration is severely off, the process flow 300 proceeds back to block 308 for the user to again spawn the digital organs 124 close to the patient or anatomical model 12 using the fiducial 14. If the registration is not severely off, the process flow 300 proceeds back to block 310 for the user to again register the digital organs 124 to the patient or anatomical model 12.
[0067] At decision block 338, if the registration is not off, the process flow 300 proceeds to block 342 for the user to display the needle insertion guidance for the medical needle 16 once the needle 16 is detached. In particular, the process flow 300 displays the planned trajectory 108 for the user to further insert and adjust the medical needle 16, in which the user detaches the medical needle 16 from the needle guide structure 104 as shown in FIG. 5 and adjusts the planned trajectory 108 by comparing the real-time image 116 of the medical needle 16 and the planned trajectory 108 displayed from different viewing angles on the screen 112 of the smartphone 10 as shown in FIGS. 6A and 6B. Once completed, the process flow 300 proceeds to decision block 344 for the user to determine whether the registration is off or not. If the registration is off, the process flow 300 proceeds to decision block 346 for the user to determine whether the registration is severely off or not. If the registration is severely off, the process flow 300 proceeds back to block 308 for the user to again spawn the digital organs 124 close to the patient or anatomical model 12 using the fiducial 14. If the registration is not severely off, the process flow 300 proceeds back to block 310 to again register the digital organs 124 to the patient or anatomical model 12.
[0068] Referring to FIG. 17, the process flow 400 for a method of needle insertion guidance that combines aspects of both the entry point method and the quick needle placement method described above is shown. In the pre-operational phase for constructing the three-dimensional virtual space 114, the 3D scans are obtained, for example CT scans, at block 402. At block 404, these 3D scans are segmented, and at block 406 the digital organs from the 3D scans are manually imported into the three-dimensional virtual space 114 generated by the augmented reality application 102. Once the pre-operational phase is completed, the intra-operational phase is initiated by conducting the registration process described above in which the user performs registration between the digital organs 124 and the fiducial 14. At block 408, the digital organs 124 are manually imported in the three-dimensional virtual space 114 and the user spawns these digital organs 124 close to the patient or anatomical model 12 using a fiducial 14. At block 410, the user fine-tunes the registration of the digital organs 124 with the fiducial 14 by using sliders 122 (FIG. 2B) displayed on the screen 112 of the smartphone 10 to move and/or translate the three-dimensional virtual space 114. At decision block 412, the user determines whether the registration between the fiducial 14 and the digital organs 124 is off. If the registration is off, then at decision block 414 the user determines whether the registration is severely off. If the registration is severely off, the process flow 400 returns to block 408 for the user to again spawn the digital organs 124 close to the patient using the fiducial 14. If the registration is not severely off, the process flow 400 returns to block 410 for the user to fine-tune the registration again manually. At decision block 412, if the registration is not off, then the process flow 400 enters the planned portion of the intra-operational phase. [0069] In the planned portion of process flow 400 initiated at block 416, the user defines and initiates the planned trajectory 108 by simultaneously defining the planned trajectory 108 and inserting the medical needle 16 into the patient or anatomical model 12. Specifically, the user places the medical needle 16 in the initial clip position in which the medical needle 16 is inserted into the needle guide structure 104 and the tip 18 of the medical needle 16 is then overlapped with the projected needle 118 displayed in the three-dimensional virtual space 114 on the screen 112. Once in the initial clip position, the smartphone 10 is maneuvered by the user until the needle advance path (e.g., the planned trajectory 108 when the medical needle 16 is fully inserted into the needle guide structure 104) that is displayed on screen 112 passes through the center of the digital target 132 illustrated in the three-dimensional virtual space 114. Once completed, the process flow 400 proceeds to decision block 418 for the user to determine whether the registration is severely off or not. If the registration is severely off, the process flow 400 returns to block 408 for the user to again spawn the digital organs 124 close to the patient using the fiducial 14. If the registration is not severely off, the process flow 400 returns to block 410 for the user to fine-tune the registration again manually. At decision block 412, if the registration is not off, the process flow 400 proceeds to decision block 422 for the user to determine whether the planned trajectory 108 is safe or not.
If the planned trajectory 108 is determined to be safe, the process flow 400 proceeds to block 438 for the user to initiate the planned trajectory 108 and evaluate the target depth using the graduation rings 123 defined along the length of the planned depth 106 shown in the three-dimensional virtual space 114, and then the user partially inserts the medical needle 16 into the patient or anatomical model 12, in which the estimated target depth may be, for example, 1 cm.
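The graduation rings 123 used to estimate the target depth can be modeled as markers spaced at fixed increments along the planned trajectory 108. The following sketch assumes a 10 mm (1 cm) spacing purely for illustration; the actual spacing and representation in the application may differ.

```python
import numpy as np

def graduation_positions(entry_point: np.ndarray,
                         planned_direction: np.ndarray,
                         planned_depth_mm: float,
                         spacing_mm: float = 10.0):
    """Positions of depth graduations along the planned trajectory (illustrative)."""
    d = planned_direction / np.linalg.norm(planned_direction)
    depths = np.arange(spacing_mm, planned_depth_mm + 1e-6, spacing_mm)
    # One marker per fixed depth increment between the entry point and planned depth.
    return [entry_point + depth * d for depth in depths]
```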
[0070] Once the planned trajectory 108 has been defined and initiated, the process flow 400 proceeds to decision block 440 for the user to determine whether the registration is off or not. If the registration is off, the process flow 400 proceeds to decision block 442 for the user to determine whether the registration is severely off or not. If the registration is severely off, the process flow 400 returns to block 408 for the user to again spawn the digital organs 124 close to the patient using the fiducial 14. If the registration is not severely off, the process flow 400 returns to block 410 for the user to fine-tune the registration again manually. At decision block 440, if the registration is not off, the process flow 400 proceeds to decision block 444 for the user to determine whether the planned depth 106 is defined or not. If the planned depth 106 is defined, the process flow 400 proceeds directly to block 452 for the user to adjust the planned trajectory 108 displayed in the three-dimensional virtual space 114 and to also adjust the planned depth 106 displayed in the three-dimensional virtual space 114 by comparing the overlaid real-time image 116 of the medical needle 16 with the planned trajectory 108 displayed on the screen 112. Once the adjustment step is completed, the process flow 400 proceeds to decision block 454 for the user to determine if the registration is off. If the registration is off, the process flow 400 proceeds to decision block 456 for the user to determine whether the registration is severely off. If the registration is severely off, the process flow 400 returns to block 408 for the user to again spawn the digital organs close to the patient using the fiducial 14. If the registration is not severely off, the process flow 400 returns to block 410 for the user to fine-tune the registration again manually.
[0071] At decision block 444, if the planned depth 106 is not defined, the process flow 400 proceeds to block 446 to define the planned depth 106. Specifically, the planned depth 106 is fine-tuned by detaching the medical needle 16 from the needle guide structure 104, moving the smartphone 10 away from the patient or anatomical model 12, and aligning the tip 18 of the medical needle 16 at the center of the digital target 132 by using the slider 110 displayed on the screen 112.
[0072] Once the planned depth 106 is defined, the process flow 400 proceeds to decision block 448 for the user to determine whether the registration is off or not. If the registration is off, the process flow 400 proceeds to decision block 450 for the user to determine whether the registration is severely off or not. If the registration is severely off, the process flow 400 returns to block 408 for the user to again spawn the digital organs 124 close to the patient or anatomical model 12 using the fiducial 14. If the registration is not severely off, the process flow 400 returns to block 410 for the user to fine-tune the registration again manually. At decision block 448, if the registration is not off, the process flow 400 proceeds to block 452 for the user to adjust the planned trajectory 108 as described above.
[0073] At decision block 422, if the user is not sure that the planned trajectory 108 is safe, then the user may optionally proceed to block 424 to verify the planned trajectory 108 in the three-dimensional virtual space 114 by moving the smartphone 10 around the planned trajectory 108. Once verified, the process flow 400 proceeds to decision block 426 for the user to determine whether the verified planned trajectory 108 is safe. If the verified planned trajectory 108 is not safe, the process flow 400 proceeds back to block 416 for the user to again define and initiate the planned trajectory 108. At decision block 422, if the user is sure that the planned trajectory 108 is safe, then the process flow 400 proceeds to block 428 for the user to fine-tune the planned depth 106 of the planned trajectory 108. In particular, the user may adjust the planned depth 106 to align the projected needle tip 119 at the center of the digital target 132 by using the slider 110 displayed on screen 112. Once the planned depth 106 is fine-tuned, the process flow 400 proceeds to decision block 430 for the user to determine whether the registration is off or not. If the registration is off, the process flow 400 proceeds to decision block 450 for the user to determine whether the registration is severely off or not. If the registration is severely off, the process flow 400 returns to block 408 for the user to again spawn the digital organs 124 close to the patient or anatomical model 12 using the fiducial 14. If the registration is not severely off, the process flow 400 returns to block 410 for the user to fine-tune the registration again manually. At decision block 430, if the registration is not off, then the process flow 400 proceeds to block 435 for the user to align the medical needle 16 with the planned trajectory 108.
[0074] At block 435, the process flow 400 displays on the screen 112 the planned trajectory 108, the needle advance plan, and the real-time discrepancy between the planned trajectory 108 and the needle advance plan in the three-dimensional virtual space 114. In addition, the initial needle insertion guidance is also displayed. In operation, the user aligns the medical needle 16 which is inserted through the needle guide structure 104 using the initial needle insertion guidance.
[0075] Once the guidance portion at block 435 is completed, the process flow 400 proceeds to decision block 434 for the user to determine whether the registration is off or not. If the registration is off, the process flow 400 proceeds to decision block 436 for the user to determine whether the registration is severely off or not. If the registration is severely off, the process flow 400 returns to block 408 for the user to again spawn the digital organs 124 close to the patient or anatomical model 12 using the fiducial 14. If the registration is not severely off, the process flow 400 returns to block 410 for the user to fine-tune the registration again manually. At decision block 434, if the registration is not off, then the process flow 400 proceeds to block 438 for the user to initiate the planned trajectory 108.
Computing System
[0076] FIG. 30 is a schematic block diagram of an example device 10, such as smartphone 10, that may be used with one or more embodiments described herein, e.g., as a component of the augmented reality guidance system 100.
[0077] In some embodiments, the example device 10 comprises one or more network interfaces 710 (e.g., wired, wireless, PLC, etc.), at least one processor 720, and a memory 740 interconnected by a system bus 750, as well as a power supply 760 (e.g., battery, plug-in, etc.).
[0078] Network interface(s) 710 include the mechanical, electrical, and signaling circuitry for communicating data over the communication links coupled to a communication network. Network interfaces 710 are configured to transmit and/or receive data using a variety of different communication protocols. As illustrated, the box representing network interfaces 710 is shown for simplicity, and it is appreciated that such interfaces may represent different types of network connections such as wireless and wired (physical) connections. Network interfaces 710 are shown separately from power supply 760, however it is appreciated that the interfaces that support PLC protocols may communicate through power supply 760 and/or may be an integral component coupled to power supply 760.
[0079] Memory 740 includes a plurality of storage locations that are addressable by processor 720 and network interfaces 710 for storing software programs and data structures associated with the embodiments described herein. In some embodiments, the example device 10 may have limited memory or no memory (e.g., no memory for storage other than for programs/processes operating on the device and associated caches).
[0080] Processor 720 comprises hardware elements or logic adapted to execute the software programs (e.g., instructions) and manipulate data structures 745. An operating system 742, portions of which are typically resident in memory 740 and executed by the processor, functionally organizes example device 10 by, inter alia, invoking operations in support of software processes and/or services executing on the device. These software processes and/or services may include augmented reality application processes/services 790 described herein. Note that while augmented reality application processes/services 790 is illustrated in centralized memory 740, alternative embodiments provide for the process to be operated within the network interfaces 710, such as a component of a MAC layer, and/or as part of a distributed computing network environment.
[0081] It will be apparent to those skilled in the art that other processor and memory types, including various computer-readable media, may be used to store and execute program instructions pertaining to the techniques described herein. Also, while the description illustrates various processes, it is expressly contemplated that various processes may be embodied as modules or engines configured to operate in accordance with the techniques herein (e.g., according to the functionality of a similar process). In this context, the terms module and engine may be interchangeable. In general, the term module or engine refers to a model or an organization of interrelated software components/functions. Further, while the augmented reality application processes/services 790 is shown as a standalone process, those skilled in the art will appreciate that this process may be executed as a routine or module within other processes.
TESTING
Materials and Methods
Device Overview: Smartphone cover with needle guide for AR application
[0082] As shown in FIGS. 31A and 31B, a smartphone 10 cover for an iPhone 14 Pro with an integrated needle guide structure 104 (Verza needle guide, Civco) was designed and 3D-printed (UltiMaker S3, Ultimaker B.V.) for effecting the functionalities of the augmented reality guidance system 100. This needle guide structure 104 design was chosen because of its ability to hold different sizes of medical needles 16, easily modify the needle angle, and quickly detach the medical needle 16 from the needle guide structure 104 after placement. An augmented reality application 102 for percutaneous interventions was developed on Unity. The fixed needle guide structure 104 enables the projected needle path to be virtually implemented into the augmented reality application 102. The augmented reality application 102 used a smartphone camera 117, embedded gyroscope, and IMU sensors to register the pre-operative CT scans with the target body and track the location of the smartphone 10 relative to the patient or phantom in real time. This enabled the operator to plan the needle path at the bedside of the patient and provided needle alignment feedback from multiple viewing angles, using both visual and gyroscopic confirmation.
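Because the needle guide structure 104 is rigidly fixed to the smartphone 10, the needle axis occupies a constant line in the smartphone's coordinate frame, so tracking the smartphone pose is sufficient to project the needle path into the patient or phantom frame. The sketch below illustrates this pose composition; the guide offset values are placeholders rather than measured parameters of the 3D-printed cover.

```python
import numpy as np

# Placeholder offsets of the needle guide in the smartphone's coordinate frame.
GUIDE_HUB_PHONE = np.array([0.03, -0.07, 0.0])    # hub position in metres (assumed)
GUIDE_AXIS_PHONE = np.array([0.0, -0.94, -0.34])  # needle axis direction (assumed)

def projected_needle_in_world(phone_pose_world: np.ndarray):
    """Project the fixed needle-guide line into the patient/phantom frame.

    phone_pose_world: 4x4 transform from the smartphone frame to the world frame,
    e.g., as provided by the device's real-time tracking.
    """
    rotation = phone_pose_world[:3, :3]
    translation = phone_pose_world[:3, 3]
    origin_world = rotation @ GUIDE_HUB_PHONE + translation
    axis_world = rotation @ GUIDE_AXIS_PHONE
    return origin_world, axis_world / np.linalg.norm(axis_world)
```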
Phantom experiment
[0083] An anthropomorphic abdominal phantom (Model 057A, Sun Nuclear, Melbourne, FL) with pre-existing target lesions was used. Operators of varying levels of expertise were recruited to perform smartphone AR-assisted punctures using the augmented reality guidance system 100 followed by freehand punctures, using a 17 G needle. Six targets (liver: n=5, kidney: n=1) were designated in a random order and shuffled for each operator. The median diameter of the target was 9 mm (range 7-13 mm). The entry points were pre-defined for variable complexity, to have only out-of-plane paths, with various insertion angles. The median length of the needle paths was 82 mm (range 52-99 mm).
Smartphone AR guidance
[0084] The abdominal phantom CT was segmented and modeled using open-source software (3D Slicer, https://www.slicer.org/). The CT (slice thickness: 0.8 mm, voxel size: 512 x 512 x 467; Philips IQon Spectral CT, Best, The Netherlands) was displayed and projected on the smartphone 10 through the augmented reality application 102. Registration of the 3D model on the phantom was made using a fiducial box 14 included in the preprocedural CT (FIGS. 32A and 32B), as previously described. Prior to the experiment, all operators had a training session that consisted of 1 to 4 needle placements using the augmented reality guidance system 100, until the operator felt confident.
Smartphone AR workflow
[0085] Planning needle path: A fixed entry point 134 for the needle 16 was selected by the operator and recorded in the three-dimensional virtual space 114 by touching the position of the entry point 134 with the tip of the medical needle 16 (FIG. 33A). The operator then defined the needle path by aligning the entry point 134 with the target on the screen (FIG. 33B). The needle path length was predefined by the application as the length of the physical needle. The needle path depth was adjusted using a slider bar on the screen of the smartphone 10 to position the virtual needle 115 on its planned trajectory until the distal extremity reached the target (FIG. 33C). Therefore, the virtual needle 115, as a planned path, was overlaid onto the phantom. [0086] Needle guidance: The operator placed the tip of the medical needle 16 on the entry point 134 and aligned the medical needle 16 with the planned needle path, with the help of angular feedback from the gyroscope of the smartphone 10 (FIG. 34A). While the medical needle 16 was aligned with the planned path, the operator inserted the medical needle 16 through the needle guide structure 104 towards the pre-defined target 132 (FIG. 34B). After detaching the medical needle 16 from the needle guide structure 104, the operator adjusted the depth of the medical needle 16 by matching the hub-needle shaft interface with the virtual needle 115 (FIG. 34C).
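The angular feedback during needle guidance can be understood as the angle between the current needle axis (obtained from the tracked smartphone and fixed guide) and the planned path direction. The following sketch, with an assumed tolerance, illustrates that computation; in the application the feedback is presented visually rather than as a boolean.

```python
import numpy as np

def angular_error_deg(current_axis: np.ndarray, planned_axis: np.ndarray) -> float:
    """Angle, in degrees, between the current needle axis and the planned path."""
    a = current_axis / np.linalg.norm(current_axis)
    b = planned_axis / np.linalg.norm(planned_axis)
    cos_angle = np.clip(np.dot(a, b), -1.0, 1.0)
    return float(np.degrees(np.arccos(cos_angle)))

def alignment_ok(current_axis, planned_axis, tolerance_deg: float = 2.0) -> bool:
    # Assumed tolerance for illustration only; the application shows this
    # feedback graphically (e.g., a shrinking circle) rather than as True/False.
    return angular_error_deg(current_axis, planned_axis) <= tolerance_deg
```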
Freehand guidance
[0087] The operators had access to preprocedural CT with marked entry points 134 and multiplanar reconstruction to plan needle insertions. Interval CT imaging during needle placement was not permitted, to equalize radiation exposure between cohorts.
Outcomes
[0088] Planning time and puncture time were recorded. Using the augmented reality guidance system 100, planning time was the time to plan the needle path using the augmented reality application 102, and puncture time was the time to align the medical needle 16 with the planned trajectory 108, insert the medical needle 16, and adjust its depth. Using freehand guidance, planning time was the time to prepare the needle insertion by analyzing CT images with multiplanar reconstruction, and puncture time was the time to insert the needle into the phantom.
[0089] A post-procedural CT was performed after every three medical needle 16 insertions to avoid clutter. A target 132 was "reached" by the medical needle 16 if the medical needle 16 touched the target 132. Accuracy was evaluated by measuring the Euclidean distance from the tip of the medical needle 16 to the center of the target 132. The angular error was also measured, and the shortest orthogonal distance from the needle shaft to the center of the target was calculated, as shown in FIG. 28.
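The tip-to-target and shaft-to-target error metrics can be illustrated with the following sketch, assuming the needle tip, a point on the needle hub, and the target center are expressed as 3D coordinates in a common frame; the angular error can be computed with the same angle function shown in the earlier planning sketch. All names are illustrative.

```python
import numpy as np

def tip_to_target_distance(tip, target) -> float:
    """Euclidean distance from the needle tip to the target center (accuracy)."""
    return float(np.linalg.norm(np.asarray(target, float) - np.asarray(tip, float)))

def shaft_to_target_distance(tip, hub, target) -> float:
    """Shortest orthogonal distance from the needle shaft (the line through the
    hub and the tip) to the target center."""
    tip, hub, target = (np.asarray(p, float) for p in (tip, hub, target))
    axis = tip - hub
    axis /= np.linalg.norm(axis)
    v = target - hub
    # Remove the component of v along the shaft; the remainder is orthogonal.
    return float(np.linalg.norm(v - np.dot(v, axis) * axis))
```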
Statistics
[0090] The normality of the data was assessed using the Shapiro-Wilk test. Results are presented as median (interquartile range [IQR]). Continuous variables were compared using the Wilcoxon rank-sum test, and categorical data were compared using Pearson's chi-squared test. Comparisons across operator experience levels or target 132 locations were performed using the Kruskal-Wallis rank-sum test for continuous data and Fisher's exact test for categorical data. All statistical tests were computed in the open-source software R (R Foundation for Statistical Computing, Vienna, Austria; https://www.R-project.org/).
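Although the analysis was run in R, an equivalent sketch of these comparisons using Python's scipy.stats is given below for illustration; the accuracy arrays are placeholders rather than the study measurements, while the 2x2 table uses the reached/missed counts reported in the Results.

```python
import numpy as np
from scipy import stats

# Placeholder per-puncture accuracy values (millimeters), not the study data
ar_accuracy = np.array([4.0, 3.0, 6.0, 5.0, 4.0, 3.0])
fh_accuracy = np.array([18.0, 9.0, 27.0, 13.0, 22.0, 11.0])

_, p_normality = stats.shapiro(ar_accuracy)               # Shapiro-Wilk normality test
_, p_accuracy = stats.ranksums(ar_accuracy, fh_accuracy)  # Wilcoxon rank-sum test

# Pearson's chi-squared test on targets reached vs. missed (AR: 42/54, freehand: 13/54)
reached_table = np.array([[42, 12], [13, 41]])
_, p_reached, _, _ = stats.chi2_contingency(reached_table)

# Kruskal-Wallis rank-sum test across groups (placeholder split by experience level)
_, p_experience = stats.kruskal(ar_accuracy[:2], ar_accuracy[2:4], ar_accuracy[4:])

# Fisher's exact test for a 2x2 categorical comparison
_, p_fisher = stats.fisher_exact(reached_table)

print(p_normality, p_accuracy, p_reached, p_experience, p_fisher)
```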
Results
[0091] During testing, a total of 108 needle placements (54 for each guidance technique) were performed by 9 operators with widely varying IR experience: 2 medical students, 2 residents or trainees, 3 attendings with 5 to 10 years' experience, and 2 senior attendings with more than 20 years' experience.
Accuracy
[0092] Using the augmented reality application 102 deployed on the smartphone 10, 78% (42/54) of the targets 132 were successfully reached by the medical needles 16, versus 24% (13/54) using freehand guidance (p<0.001). The median accuracy (from the needle tip to the center of the target) was 4 mm [IQR 3-6 mm] using the smartphone 10 versus 18 mm [IQR 9-27 mm] for freehand (p<0.001). The median angular error was 2.7° [IQR 1.6-4.6°] using the smartphone 10 versus 9° [IQR 5.2-17.1°] for freehand (p<0.001). The median distance from the needle shaft to the center of the target 132 was 4 mm [IQR 2-5 mm] using the smartphone 10 versus 13 mm [IQR 7-22 mm] for freehand (p<0.001). Results are detailed in Table 1 and FIG. 29.
Timing
[0093] Using the augmented reality application 102 deployed on the smartphone 10, planning time was 91 seconds [IQR 71-151 s] and puncture time was 68 seconds [IQR 57-77 s]. Both were slightly longer than for freehand placement, which had a planning time of 73 seconds ([IQR 46-115 s], p=0.01) and a puncture time of 31 seconds ([IQR 20-46 s], p<0.001).
Smartphone AR learning curve
[0094] When comparing the first set of 3 randomized needle placements to the second set of 3 randomized needle placements using the augmented reality application 102, no difference was found in terms of planning time (first set: 104 seconds [IQR 77-147 s] versus second set: 87 seconds [IQR 71-164 s], p=0.80), puncture time (first set: 72 seconds [IQR 57-88 s] versus second set: 67 seconds [IQR 57-73 s], p=0.24), or accuracy (first set: 4 mm [IQR 3-7 mm] versus second set: 4 mm [IQR 4-6 mm], p=0.85).
Operator’s experience
[0095] Using the augmented reality guidance system 100 deployed on the smartphone 10, no difference was found according to operator experience in terms of accuracy (p=0.81), planning time (p=0.69), or puncture time (p=0.06; Table 2). Using freehand guidance, no difference was found according to operator experience in terms of accuracy (p=0.20), planning time (p=0.22), or puncture time (p=0.31).
Target location
[0096] Using the augmented reality guidance system 100 deployed on the smartphone 10, differences in accuracy (p=0.024) and angular error (p<0.001) were found according to target location. Using freehand guidance, no differences in accuracy (p=0.37) or angular error (p=0.63) were found according to target location (Table 3). The augmented reality application 102 was less accurate for certain pathways than for others, whereas freehand error varied regardless of obliquity (supplemental material).
Discussion
[0097] The integration of a needle guide structure 104 with the augmented reality application 102 of the smartphone 10 facilitated the workflow for needle 16 placement and allowed needle path planning directly in the application, with a fixed needle-guide-smartphone geometry. With the needle guide structure 104, the smartphone 10 is used to give the operator real-time feedback. In this constrained phantom study, the accuracy of the augmented reality guidance system 100 deployed on a smartphone 10 outperformed freehand insertion.
[0098] With the needle guide structure 104 fixed on the smartphone 10, the user can hold both the medical needle 16 and the smartphone 10 at the same time. The fixed path acts as a connector from the real world into the augmented reality world, enabling new possibilities and workflows with the augmented reality application 102. The needle guide structure 104 on the smartphone 10 allows the path/trajectory to be planned in the augmented reality application 102 at the patient's bedside, using the smartphone's IMU and gyroscope to track the position and orientation of the medical needle 16. This smartphone 10 iteration provides the operator with more robust, sensor-based real-time feedback. In this phantom study, planning the path in the augmented reality application 102 was feasible and rapid, with a median time of 91 seconds, while the insertion itself took a median of 68 seconds.
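Because the guide fixes the needle relative to the smartphone, the needle pose follows directly from the tracked phone pose and a constant offset. The following is a minimal sketch of that relationship, assuming a known phone-to-guide transform derived from the 3D-printed cover geometry; the transform values and function names are illustrative assumptions.

```python
import numpy as np

def pose_matrix(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Constant phone-to-guide transform (illustrative values, in meters)
T_phone_guide = pose_matrix(np.eye(3), np.array([0.03, -0.07, 0.01]))

def needle_pose_world(T_world_phone: np.ndarray) -> np.ndarray:
    """Needle guide pose in world space, given the AR-tracked phone pose."""
    return T_world_phone @ T_phone_guide

# The needle axis in world space is then a fixed column of the guide frame,
# e.g. its local z-axis:
# needle_axis_world = needle_pose_world(T_world_phone)[:3, 2]
```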
[0099] In this abdominal phantom study, the accuracy (median distance from the tip of the medical needle 16 to the center of the target 132) using the augmented reality guidance system 100 was 4 mm, which was similar to the distance from the shaft of the needle 16 to the center of the target 132. Therefore, depth and angle error both contributed to the overall error. The accuracy of the augmented reality guidance system 100 greatly outperformed freehand accuracy, which was 18 mm. Only out-of-plane needle paths were chosen, including a lateral entry point to reach a kidney target, but no difference in accuracy by target location was found using freehand guidance. Interestingly, some targets were more difficult to reach using the augmented reality guidance system 100, with significant differences in accuracy (p=0.024) and angular error (p<0.001). Those inaccuracies suggest that the position of the smartphone 10 relative to the needle path could have some impact on the precision and accuracy of the guidance. Indeed, some sensors, like the smartphone gyroscope, may work better when the smartphone 10 is held vertically with respect to gravity. However, the accuracy of the augmented reality guidance system 100 was similar to previously published accuracy ranges in phantoms for both goggle-based AR guidance (3.6-5.2 mm) and smartphone-based augmented reality guidance (2.6-4 mm), although those other devices did not include path planning in the workflow.
[00100] No difference was found using the augmented reality guidance system 100 between the first and the second set of 3 needle placements in terms of planning time (p=0.80), puncture time (p=0.24), or accuracy (p=0.85). Therefore, no training effect was detected after the first set of punctures, implying that the short training provided before the experiment (1 to 3 needle placements, until the user felt confident) was sufficient. Finally, no difference was found according to the operator's experience in interventional radiology, confirming the potential of the augmented reality application 102 to promote standardization, reduce the influence of user experience, and potentially enable less experienced operators to perform more like experienced operators for specific needle-based tasks.
[00101] This study presents several limitations. The accuracy was measured on a phantom, which allows reliable registration thanks to the fiducial box 14, without breathing motion and without changes in the anatomy or the fiducial position between the preprocedural CT and needle placement. In addition, the accuracy of the augmented reality guidance system 100 was compared to freehand guidance without intermediate CT controls, to equalize radiation exposure among cohorts; however, this may not reproduce actual clinical workflows, such as CT gantry tilt, ultrasound use, CT fluoroscopy, or step-and-shoot CT techniques, especially for out-of-plane paths. In this study, experienced users did not obtain better accuracy than less experienced operators using freehand guidance, which may not recapitulate reality in patients. It is also difficult to reproduce past conclusions whereby less experienced users improved more than experienced users, since both groups had improved performance. Finally, the augmented reality application 102 required the upload and segmentation of the 3D model of the phantom from preprocedural CT images, which can be time-consuming without semi-automatic algorithms; deep learning-based segmentation software may help with this task in the near future. However, further refinements and research and development are still needed to fully investigate the potential impact of smartphone AR technology in clinical practice.
Conclusion
This augmented reality application 102 deployed on a smartphone 10 with an integrated needle guide structure 104 was found to be a novel low-resource guidance tool that allows path planning at the patient's bedside and accurate needle placement (4 mm error) with real-time angular feedback in less than 3 minutes, regardless of operator experience. This technology does not require extensive training, may reduce the influence of experience on the accuracy of percutaneous interventions, and provides a novel way to unite the mixed reality world with IR without goggles and without the need to coordinate two hands.
[00102] It should be understood from the foregoing that, while particular embodiments have been illustrated and described, various modifications can be made thereto without departing from the spirit and scope of the invention as will be apparent to those skilled in the art. Such changes and modifications are within the scope and teachings of this invention as defined in the claims appended hereto.

CLAIMS

What is claimed is:
1. A needle insertion guidance system comprising: an anatomical model, the anatomical model including an outer surface defining an internal space having representations of internal organs disposed therein; and a smartphone in visual communication with the anatomical model, the smartphone being coupled to a medical needle through a needle guide structure engaged to the smartphone, the smartphone having a camera for communicating real-time images to a display of the smartphone, the display being in operative communication with a processor, wherein the processor is in communication with a memory, the memory including instructions executable by the processor to: execute, at the smartphone, an augmented reality application for providing an augmented reality representation in a three-dimensional virtual space of the anatomical model having digital organs that correspond to the internal organs of the anatomical model; generate, at the three-dimensional virtual space, a planned trajectory and a planned depth for the medical needle; overlay, by the augmented reality application, the real-time images of the medical needle with the three-dimensional virtual space of the anatomical model; and display, in the three-dimensional virtual space, an orientation of the medical needle relative to the planned trajectory and the planned depth within the anatomical model for providing visual feedback to a user.
2. The needle insertion guidance system of claim 1, wherein the memory further includes instructions executable by the processor to: calibrate the three-dimensional virtual space representing the anatomical model and digital organs with the medical needle coupled to the needle guide structure engaged to the smartphone.
3. The needle insertion guidance system of claim 1, wherein the memory further includes instructions executable by the processor to: spawn a virtual needle in the three-dimensional virtual space; and manipulate the virtual needle in the three-dimensional virtual space with respect to the digital organs displayed on the smartphone, wherein the planned trajectory of the medical needle is defined in the three-dimensional virtual space by superposing the medical needle with the virtual needle.
4. The needle insertion guidance system of claim 1, wherein the augmented reality application is operable for providing visual feedback of the orientation and position of the medical needle relative to the planned trajectory in the three-dimensional virtual space displayed on the smartphone.
5. The needle insertion guidance system of claim 4, wherein the augmented reality application is operable for displaying the planned depth of the medical needle along the planned trajectory to a virtual target that is overlaid on real-time images of the medical needle and an entry point defined along the anatomical model.
6. The needle insertion guidance system of claim 5, wherein the augmented reality application is operable for displaying the planned depth defined by a plurality of graduations denoted along the planned trajectory.
7. The needle insertion guidance system of claim 1, wherein the needle guide structure is positioned along the smartphone such that the medical needle is visible on a camera of the smartphone to ensure visibility of the medical needle on the display of the smartphone.
8. The needle insertion guidance system of claim 1, wherein the needle guide structure fixes in position the medical needle when coupled to the needle guide structure relative to the planned trajectory defined in the three-dimensional virtual space displayed on the smartphone.
9. The needle insertion guidance system of claim 1, wherein the smartphone includes one or more sensors for providing real-time tracking of the position and orientation of the smartphone including the position and orientation of the medical needle coupled to the smartphone relative to the anatomical model in the three-dimensional virtual space displayed on the smartphone.
10. The needle insertion guidance system of claim 1, wherein the augmented reality application is operable for planning the planned trajectory of the medical needle between an entry point defined along the anatomical model and a virtual target within the anatomical model in the three-dimensional virtual space displayed on the smartphone.
11. The needle insertion guidance system of claim 1, wherein the augmented reality application is operable for planning the planned trajectory of the medical needle by aligning a projected needle with a digital target in the three-dimensional virtual space displayed on the smartphone.
12. The needle insertion guidance system of claim 1, further comprising: a fiducial in visual communication with the smartphone and the anatomical model, for calibrating the three-dimensional virtual space with the medical needle, wherein the augmented reality application is operable to provide registration of the actual path of the medical needle with the planned trajectory of the medical needle in the three-dimensional virtual space displayed on the smartphone.
13. The needle insertion guidance system of claim 1, wherein the augmented reality application is operable for generating a virtual target to establish the planned trajectory and the planned depth by the medical needle.
14. The needle insertion guidance system of claim 13, wherein the display of the digital organs within the anatomical model by the augmented reality application is generated by inputting three-dimensional scans of the anatomical model to the augmented reality application and segmenting the inputted three-dimensional scans to generate the three-dimensional virtual space displayed on the smartphone.
15. A method for needle insertion guidance comprising: registering an augmented reality application operable on a smartphone relative to a target location within an anatomical model using a fiducial positioned outside the anatomical model by moving and rotating a three-dimensional virtual space through the augmented reality application and displayed on the smartphone; planning a planned trajectory of a medical needle coupled to the smartphone through a needle guide structure in the three-dimensional virtual space from an entry point defined along the anatomical model to the target location within the anatomical model in the three-dimensional virtual space; planning a planned depth of the medical needle in the three-dimensional virtual space from the entry point to the target location in the three-dimensional virtual space; and adjusting the planned depth of the medical needle along the planned trajectory relative to the target location in the three-dimensional virtual space.
16. The method of claim 15, wherein planning the planned trajectory of the medical needle comprises aligning a projected needle of the medical needle with a virtual needle generated by the augmented reality application in the three-dimensional virtual space and then aligning the virtual needle with a virtual target defined at the target location in the three-dimensional virtual space.
17. The method of claim 15, wherein planning the planned depth for insertion of the medical needle comprises aligning the digital target with the entry point with a digital cross sign of the virtual needle.
18. The method of claim 15, wherein the augmented reality application is operable for changing the color of the planned trajectory from one color to another color when aligning the medical needle with the planned trajectory.
19. The method of claim 15, wherein the medical needle is decoupled from the needle guide structure once the planned trajectory and planned depth are established within the three-dimensional virtual space.
20. The method of claim 19, wherein the planned depth is adjusted by aligning a tip of the virtual needle to the virtual target after the medical needle is detached from the needle guide structure.