
WO2024165521A1 - Guidance solution for interventional medical procedures - Google Patents

Guidance solution for interventional medical procedures

Info

Publication number
WO2024165521A1
WO2024165521A1 (PCT/EP2024/052829)
Authority
WO
WIPO (PCT)
Prior art keywords
handheld apparatus
holder device
imaging system
patient
real
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2024/052829
Other languages
English (en)
Inventor
Ivan Shimon FRITSCH
Yossi Cohen
Shlomo Gotman
Aviran FRANCO
Inon Berent
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips NV filed Critical Koninklijke Philips NV
Publication of WO2024165521A1 publication Critical patent/WO2024165521A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/02Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B6/03Computed tomography [CT]
    • A61B6/032Transmission computed tomography [CT]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/12Arrangements for detecting or locating foreign bodies
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/46Arrangements for interfacing with the operator or the patient
    • A61B6/461Displaying means of special interest
    • A61B6/465Displaying means of special interest adapted to display user selection data, e.g. graphical user interface, icons or menus
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/46Arrangements for interfacing with the operator or the patient
    • A61B6/461Displaying means of special interest
    • A61B6/466Displaying means of special interest adapted to display 3D data
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/46Arrangements for interfacing with the operator or the patient
    • A61B6/467Arrangements for interfacing with the operator or the patient characterised by special input means
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/46Arrangements for interfacing with the operator or the patient
    • A61B6/467Arrangements for interfacing with the operator or the patient characterised by special input means
    • A61B6/469Arrangements for interfacing with the operator or the patient characterised by special input means for selecting a region of interest [ROI]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods
    • A61B2017/00017Electrical control of surgical instruments
    • A61B2017/00221Electrical control of surgical instruments with wireless transmission of data, e.g. by infrared radiation or radiowaves
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/107Visualisation of planned trajectories or target regions
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2048Tracking techniques using an accelerometer or inertia sensor
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2055Optical tracking systems
    • A61B2034/2057Details of tracking cameras
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/376Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/376Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A61B2090/3762Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy using computed tomography systems [CT]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3937Visible markers
    • A61B2090/395Visible markers with marking agent for marking skin or other tissue
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25User interfaces for surgical systems

Definitions

  • the invention relates to the field of medical imaging and, in particular, to a method and apparatus for needle guidance in an interventional radiology imaging procedure.
  • the needle path or trajectory is usually planned on the display of an imaging system, for example a display of a Computed Tomography (CT) imaging system.
  • a user, such as a radiologist, attempts to position and insert the needle according to the planned trajectory. Marking the needle entry point on a patient’s body requires either disposable accessories (e.g., a disposable grid placed on the patient’s body) that appear in the CT image and have limited accuracy, or a complex laser- or camera-based image processing system connected to the imaging system, which is expensive.
  • the trajectory is usually planned on the display of a medical imaging device (CT imaging system).
  • the radiologist may try to insert the needle at a small depth and check whether the needle is in the correct direction/position by performing a scan on the patient.
  • Such a trial and error approach might expose the patient and medical personnel to unnecessary radiation, cause unnecessary pain and injury to the patient, and lengthen the duration of the procedure.
  • the object of the present invention is to provide techniques for marking an insertion point on a patient and providing a real-time display of an actual trajectory of an inserted apparatus relative to a planned trajectory for the inserted apparatus.
  • the techniques may be applied to virtually any interventional radiology imaging system including, but not limited to, CT, C-arm, Single Photon Emission Computed Tomography CT (SPECT-CT), Magnetic Resonance CT (MR-CT), and Positron Emission Tomography CT (PET-CT) imaging systems.
  • a method of performing an interventional procedure guided by an imaging device includes planning a trajectory of a handheld apparatus to be inserted in a patient, marking an insertion point for the handheld apparatus guided by an imaging device display of real-time images of the patient, and placing the handheld apparatus in a holder device, where the holder device communicates with a radiology imaging system via a docking station.
  • the method includes showing, on the imaging device display, the patient’s images and an orientation and direction of the handheld apparatus relative to the planned trajectory, as measured by a three-dimensional (3D) angle sensor in the holder device, and inserting the handheld apparatus into the patient according to the marked insertion point, the planned trajectory, and the 3D angle of the holder device.
  • the method further includes updating continuously, on the imaging device display, the orientation and direction of the handheld apparatus relative to the planned trajectory, and providing a real-time visual reference to guide the handheld apparatus along the planned trajectory to a desired location.
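  • As an illustration only (not part of the disclosure), the continuous update described above could be realized as a simple loop that reads the holder’s 3D angle sensor, computes the deviation from the planned trajectory, and refreshes the on-screen reference; the function names below are hypothetical placeholders.

```python
import time
import numpy as np

def angular_deviation_deg(measured_dir: np.ndarray, planned_dir: np.ndarray) -> float:
    """Angle in degrees between the measured and planned insertion directions."""
    m = measured_dir / np.linalg.norm(measured_dir)
    p = planned_dir / np.linalg.norm(planned_dir)
    return float(np.degrees(np.arccos(np.clip(np.dot(m, p), -1.0, 1.0))))

def guidance_loop(read_holder_orientation, render_overlay, planned_dir, period_s=0.05):
    """Continuously refresh the display with the holder orientation versus the plan."""
    while True:
        measured_dir = read_holder_orientation()              # unit vector from the 3D angle sensor
        deviation = angular_deviation_deg(measured_dir, planned_dir)
        render_overlay(measured_dir, planned_dir, deviation)  # real-time visual reference on the display
        time.sleep(period_s)
```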
  • Other embodiments of this aspect may include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
  • Implementations of the method may include one or more of the following features.
  • the method where the handheld apparatus is at least one of a needle, a probe, a guidewire, or a drainage catheter.
  • the method may include: controlling software of the radiology imaging system via command buttons on the holder device; and navigating the software of the radiology imaging system via navigation command buttons on the holder device.
  • the method may include generating an audio indicator and a visual display on the holder device, either alone or in combination, indicating whether the trajectory of the handheld apparatus matches the planned trajectory.
  • the method where marking an insertion point for the handheld apparatus includes overlaying an arrow on the display of real-time images generated by the camera, the arrow pointing to a planned insertion point, and marking the insertion point on the patient’s skin, the marked insertion point matching a tip of the overlaid arrow.
  • the method further may include: performing a new imaging scan of the patient while guiding the handheld apparatus along the planned trajectory is paused, commanding the radiology imaging system to perform a reimaging scan via the holder device, determining whether the planned trajectory should be revised according to the new imaging scan, and continuing to provide the real-time visual reference to guide the handheld apparatus along the planned trajectory or a revised trajectory.
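  • A minimal sketch of the audio/visual match indication mentioned above, assuming a configurable angular tolerance and hypothetical beep()/set_led() helpers on the holder device (illustrative only, not a documented interface):

```python
def indicate_match(deviation_deg: float, tolerance_deg: float, beep, set_led) -> bool:
    """Signal on the holder device whether the current trajectory matches the plan."""
    matched = deviation_deg <= tolerance_deg
    set_led("green" if matched else "red")  # visual indicator on the holder display
    if not matched:
        beep()                              # audio indicator when off the planned trajectory
    return matched
```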
  • an interventional radiology imaging system is provided.
  • the interventional radiology imaging system including a memory configured to store a plurality of instructions and processor circuitry coupled to the memory and configured to execute the plurality of instructions to: receive a radiology imaging scan of a patient, plan a trajectory of a handheld apparatus to be inserted in the patient according to input from a user, mark an insertion point for the handheld apparatus on an interventional radiology imaging system display of real-time images of the patient, communicate between a holder device and the interventional radiology imaging system via a docking station, wherein the handheld apparatus is placed in the holder device, show, on the interventional radiology imaging system display, the patient’s images and an orientation and direction of the handheld apparatus relative to the planned trajectory, as measured by a three-dimensional (3D) angle sensor in the holder device, and provide a real-time visual guide to aid insertion of the handheld apparatus into the patient according to the marked insertion point, the planned trajectory, and the 3D angle of the holder device.
  • the system includes continuously updating, on the interventional radiology imaging device display, the orientation and direction of the handheld apparatus relative to the planned trajectory, and providing a real-time visual reference of the actual trajectory of the handheld apparatus relative to the planned trajectory as the handheld apparatus is guided to a desired location.
  • Implementations of the interventional radiology imaging system may include one or more of the following features.
  • the interventional radiology imaging system where the handheld apparatus is at least one of a needle, a probe, a guidewire, or a drainage catheter.
  • the interventional radiology imaging system where the processor circuitry is further configured to execute the plurality of instructions to: receive software control commands via command buttons on the holder device, receive software navigation commands via navigation command buttons on the holder device, and generate swipe indications.
  • the processor circuitry may be further configured to execute the plurality of instructions to generate an audio indicator and a visual display on the holder device, either alone or in combination, indicating whether a real-time trajectory of the handheld apparatus corresponds with the planned trajectory.
  • the interventional radiology imaging system where the marked insertion point for the handheld apparatus on the interventional radiology imaging system display of real-time images of the patient generated by the camera is an arrow overlaid on the display of real-time images, the arrow pointing to a planned insertion point.
  • An implementation may include marking, by a user, the insertion point on the patient’s skin, the marked insertion point matching a tip of the overlaid arrow.
  • the interventional radiology imaging system where the processor receives the displayed real-time images of the patient from a camera connected to the interventional radiology imaging system where a scanner position of the interventional radiology imaging system is registered with the camera.
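  • For illustration, registering the scanner position with the camera could be applied as a precomputed homogeneous transform between the two coordinate frames; the matrix name T_cam_from_scanner below is an assumption used only for this sketch.

```python
import numpy as np

def scanner_to_camera(point_scanner_mm: np.ndarray, T_cam_from_scanner: np.ndarray) -> np.ndarray:
    """Map a 3D point from scanner coordinates into the camera frame using a 4x4 transform."""
    p = np.append(point_scanner_mm, 1.0)  # homogeneous coordinates
    return (T_cam_from_scanner @ p)[:3]
```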
  • the processor circuitry may be further configured to execute the plurality of instructions to: perform a new imaging scan of the patient while guiding the handheld apparatus along the planned trajectory is paused; determine whether the planned trajectory should be revised according to the new imaging scan; and continue to provide the real-time visual reference to guide the handheld apparatus along the planned trajectory or a revised trajectory.
  • Implementations of the above described techniques may include hardware, a method or process, or a tangible computer-readable medium.
  • a non-transitory computer-readable medium is provided.
  • the non-transitory computer-readable medium having stored thereon instructions for causing processing circuitry to execute a process, the process including: planning a trajectory of a handheld apparatus to be inserted in a patient, marking an insertion point for the handheld apparatus guided by an interventional imaging system display of real-time images of the patient, placing the handheld apparatus in a holder device, the holder device communicating with an interventional radiology imaging system via a docking station, showing, on the interventional imaging system display, the patient’s images and an orientation and direction of the handheld apparatus relative to the planned trajectory, as measured by a three-dimensional (3D) angle sensor in the holder device, guiding the insertion of the handheld apparatus into the patient by a user according to the marked insertion point, the planned trajectory, and the 3D angle of the holder device, updating continuously, on the interventional radiology imaging device display of real-time images of the patient, the orientation and direction of the handheld apparatus relative to the planned trajectory, and providing a real-time visual reference to guide the handheld apparatus along the planned trajectory to a desired location.
  • Fig. 1A is a diagram of an exemplary interventional radiology imaging system according to some embodiments.
  • Fig. 1B is a diagram of an exemplary C-arm interventional radiology imaging system according to some embodiments.
  • Fig. 2 illustrates a display of scan images from an imaging device and a display of real-time images of a patient from a camera according to some embodiments.
  • Fig. 3 illustrates a display of an image of a patient’s scan, an orientation and direction of a handheld apparatus, and a planned trajectory;
  • Fig. 4A is a diagram of a holder device and a docking station with the holder device removed from the docking station;
  • Fig. 4B is a diagram of a holder device and a docking station with the holder placed in the docking station;
  • Fig. 4C is a diagram of a holder device in an open position inserted in a sterile bag;
  • Fig. 4D is a diagram of a holder device in the closed position inserted in a sterile bag; and Fig. 5 is a flowchart of an example process for performing an interventional procedure guided by an imaging device.
  • Fig. 1A is a diagram of an exemplary interventional radiology imaging system 100.
  • the imaging system 100 includes an imaging scanning device, such as a computed tomography (CT) imaging system 102A, a patient support 104, an overhead camera 106, an examination subject 108, a docking station 110, a holder device 114, displays 118 and 122, and a handheld apparatus (needle or the like) 126.
  • Patient support 104, such as a couch, supports an object or examination subject, such as a human patient 108.
  • Patient support 104 is configured to move the object or human patient 108 for loading, scanning, and unloading, as well as supporting the human patient 108 during an interventional radiology imaging procedure.
  • the system includes an overhead camera 106, which is coupled to the scanning device 102A, and may be configured to provide real-time images of the patient 108 during the interventional radiology imaging procedure.
  • Camera 106 may be configured to aid a user (a radiologist) in marking and guiding the handheld apparatus 126 during the procedure, the handheld apparatus corresponding to a needle, probe, guide wire, drainage catheter, or the like.
  • Camera 106 is focused on patient 108 and patient support 104.
  • Camera 106 is connected to CT imaging system 102A and calibrated to the patient support 104 via calibration software.
  • Camera 106 generates real-time images of patient 108 and displays on at least one of displays 118 and 122 a pre-planned needle 126 insertion point as an overlay.
  • the system 100 includes a holder device (smart needle holder) 114.
  • a needle or similar device 126 may be placed in the holder device 114 during the interventional radiology imaging procedure.
  • the description refers to a needle 126.
  • any reference to needle 126 may be equally applied to at least a “probe”, a “guide wire”, and a “drainage catheter.”
  • any one of a needle, probe, guide wire, drainage catheter, or the like may be placed in the smart needle holder 114.
  • the holder device 114 may be placed in a sterile bag such that there is no direct contact between the holder device 114 and the needle, probe, guide wire, drainage catheter, etc.
  • the holder device 114 may be configured to communicate with the CT imaging system 102A via a docking station 110.
  • the docking station 110 may be configured to communicate with the CT imaging system 102A via communication link 112.
  • the holder device 114 may be configured to communicate with the docking station 110 via communication link 116.
  • Communication links 112 and 116 may be wired or wireless communication links.
  • a wireless communication link may be preferred, so that no wire or wires interfere with a user’s positioning and guidance of handheld apparatus 126 during an interventional radiology imaging procedure.
  • the holder device 114 may provide the capability of drawing a real-time position of a selected needle along a planned trajectory in both coronal and sagittal planes.
  • Display 118 is a component of CT imaging system 102A and displays scans derived from CT imaging system 102A. Display 118 may be connected to imaging device 102A via communication link 120.
  • Display 122 displays real-time images of patient 108 generated by camera 106.
  • Display 122 may receive the real-time images directly from camera 106 via communication link 124.
  • Communication link 124 may be a direct link between display 122 and camera 106.
  • communication link 124 may be a direct link between display 122 and camera 106 passing through display 118.
  • Communication link 124 may be a USB connection, an Ethernet connection, or any compatible computer-to-display interface, for example HDMI, Mini HDMI, DisplayPort, DVI, or the like, to a computer console of the CT imaging system 102A, or any computer-to-display communication.
  • the camera 106 may have a resolution of at least 25 megapixels (MP).
  • Camera 106 may be a fixed-mount camera calibrated to image the patient support 104.
  • the camera image may cover 500-750 mm of the patient support 104 length (Z direction). This may be optionally achieved using a zoom lens on camera 106.
  • Camera 106 may identify marks on the patient support 104 on the X, Y, and Z axes, where the marks may be used as reference position values.
  • the reference position value may be in absolute patient support 104 units on the Z axis, with around 0.1 mm resolution.
  • Another reference position value may be given in around 0.1 mm units relative to the patient support 104 tabletop center (X axis).
  • a software algorithm may translate the reference values to pixels, resulting in a pixel-to-millimeter mapping.
  • Software applications may be used to connect and display a reference point on the calibrated camera images.
  • Coordinate translation rules may be provided by a camera software package.
  • the camera software may provide information including: patient support 104 absolute position on the Z axis, X-axis position relative to patient support 104 center axis, needle 126 ID, and commands to show and remove indication of the needle 126 entry point.
  • the camera software package may be able to translate the needle entry position to X/Z coordinates and also the depth (Y axis) on an image generated by CT imaging system 102A.
  • the camera software package may include additional features, such as displaying an arrow or cross at a requested X/Z position, optional depth recognition on the camera images, and dynamic correction of the Y deviation with support from software of the CT imaging system 102A.
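  • As a rough, illustrative sketch of such a pixel-to-millimeter mapping (assuming two reference marks with known positions along one axis of the patient support; the function names are hypothetical):

```python
def pixels_per_mm(mark_a_px: float, mark_b_px: float,
                  mark_a_mm: float, mark_b_mm: float) -> float:
    """Scale factor derived from two reference marks along one axis of the support."""
    return abs(mark_a_px - mark_b_px) / abs(mark_a_mm - mark_b_mm)

def pixel_to_table_mm(px: float, origin_px: float, origin_mm: float,
                      scale_px_per_mm: float) -> float:
    """Translate a camera pixel coordinate into absolute patient-support millimeters."""
    return origin_mm + (px - origin_px) / scale_px_per_mm
```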
  • Fig. 1B is a diagram of an exemplary interventional radiology imaging system.
  • the imaging system includes an imaging scanning device such as a C-arm imaging system/device 102B, a patient support 104, an overhead camera 106, a docking station 110, a holder device 114, display 118, and handheld apparatus 126.
  • a difference between a CT imaging system and a C-arm imaging system is the manner in which the radiating elements and camera 106 are arranged.
  • the patient support 104, an overhead camera 106, an examination subject 108, a docking station 110, a holder device 114, displays 118 and 122, and a handheld apparatus 126 are essentially identical in Figs. 1A and 1B. However, the arrangement and orientation may vary between Figs. 1A and 1B.
  • a C-arm imaging system/device 102B may include a computer workstation 128. Workstation 128 may be used to view, manipulate, store and transfer images.
  • Patient support 104 supports an examination subject, such as a human patient 108 (not shown in Fig. 1B).
  • the C-arm imaging system/device 102B includes an overhead camera 106, which is coupled to the C-arm imaging system/device 102B and may be configured to provide real-time images of a patient during the interventional radiology imaging procedure.
  • Camera 106 may be configured to aid a user, such as a radiologist, in marking and guiding a needle, probe, guide wire, drainage catheter, or the like during the procedure.
  • Camera 106 is connected to the C-arm imaging system/device 102B and calibrated to image the patient support 104 via calibration software.
  • Camera 106 generates real-time images of a patient and displays a pre-planned insertion point of a handheld apparatus 126 as an overlay on display 118.
  • the system further includes a holder device 114.
  • a handheld apparatus 126 may be placed in the holder device 114 during the interventional radiology imaging procedure.
  • the description refers to a handheld apparatus or needle 126.
  • any reference to handheld apparatus 126 may be equally applied to at least a “probe”, a “guide wire”, and a “drainage catheter.”
  • any one of a needle, probe, guide wire, drainage catheter, or the like may be placed in the holder device 114.
  • the holder device 114 may be placed in a sterile bag, such that there is no direct contact between the holder device 114 and the handheld apparatus (needle, probe, guide wire, drainage catheter etc.).
  • the holder device 114 may be configured to communicate with the C-arm imaging system/device 102B via a docking station 110.
  • the docking station 110 may be configured to communicate with the C-arm imaging system/device 102B via communication link 112.
  • the holder device 114 may be configured to communicate with the docking station 110 via communication link 116.
  • Communication links 112 and 116 may be wired or wireless communication links.
  • a wireless communication link may be preferred, so that no wire or wires interfere with a user’s positioning and guidance of handheld apparatus 126 during an interventional radiology imaging procedure.
  • the holder device 114 may provide a real-time position of a selected handheld apparatus along a planned trajectory in both coronal and sagittal planes.
  • Display 118 may be a component of C-arm imaging system/device 102B and displays images derived from C-arm imaging system/device 102B. Display 118 may be connected to C-arm imaging system/device 102B via communication link 120. Display 118 may display real-time images of a patient generated by camera 106. An optional display 122 may be present in some C-arm systems, and may receive the real-time images directly from camera 106 via communication link 124. Communication link 124 may be a direct link between display 122 and camera 106. In addition, communication link 124 may be a direct link between optional display 122 and camera 106 passing through display 118.
  • Camera 106 may have a resolution of at least 25 megapixels (MP). Camera 106 may be a fixed-mount camera calibrated to image patient support 104. The camera image may cover 500-750 mm of the patient support 104 length (Z direction). This may optionally be achieved using a zoom lens on camera 106.
  • Camera 106 may identify marks on the patient support 104 on the X, Y, and Z axes, where the marks may be used as reference position values.
  • the reference position value may be in absolute patient support 104 units on the Z axis, with around 0.1 mm resolution. Another reference position value may be given in around 0.1 mm units relative to the patient support 104 tabletop center (X axis).
  • a software algorithm may translate the reference values to pixels, resulting in a pixel-to-millimeter mapping. Software applications may be used to connect and display a reference point on the calibrated camera images. Coordinate translation rules may be provided by a camera software package.
  • the camera software may provide information including: patient support 104 absolute position on the Z axis, X axis position relative to patient support 104 center axis, handheld apparatus 126 ID, and commands to show and remove indication of the handheld apparatus 126 entry point.
  • the camera software package may be able to translate the needle entry position to X/Z coordinates and also the depth (Y axis) on an image generated by C-arm imaging system/device 102B.
  • the camera software package may include additional features, such as displaying an arrow or cross at a requested X/Z position, optional depth recognition on the camera images, and dynamic correction of the Y deviation with support from software of the C-arm imaging system/device 102B.
  • a CT imaging system may generally be used for diagnostic and interventional procedures.
  • a C-arm imaging system/device 102B may typically be a mobile imaging system and may primarily be used for interventional and surgical procedures.
  • Fig. 2 illustrates a display of scan images from an imaging device (imaging device 102A or 102B) and a display of real-time images of a patient from a camera (camera 106) according to some embodiments.
  • the process of marking a needle entry point may move the image recognition task from the camera to a user, such as a radiologist. Instead of simply showing a scan image of the patient on a display, a real-time image of the patient generated by an overhead camera may also be shown.
  • An arrow 204 pointing to the needle entry point on the real-time image of the patient is displayed. A user may mark the needle entry point on the patient’s skin to match the tip of the arrow shown on the display.
  • On a display (display 118) of the imaging device (imaging device 102A or 102B), an image of the patient’s scan may be displayed.
  • an orientation and direction of a holder device (holder device 114) along with a planned trajectory is shown in real-time at 202. Planned vs real-time positions may be shown in the sagittal and coronal planes.
  • the real-time orientation and direction of the holder device (holder device 114) is provided by a three-dimensional (3D) angle sensor in the holder device (holder device 114).
  • shown on the display (display 118) of the imaging device (imaging device 102A or 102B) are the patient’s images, and overlaid on the images are the orientation and direction of the holder device relative to the planned trajectory, as measured by a three-dimensional (3D) angle sensor in the holder device.
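  • Purely as an illustration of comparing the planned and real-time directions within a given display plane (such as the sagittal or coronal view), the projections of both directions into that plane can be compared; the plane-normal convention and helper below are assumptions, not part of the disclosure.

```python
import numpy as np

def in_plane_angle_deg(planned: np.ndarray, measured: np.ndarray, plane_normal: np.ndarray) -> float:
    """Angle between two directions after projecting both into a display plane."""
    n = plane_normal / np.linalg.norm(plane_normal)
    project = lambda v: v - np.dot(v, n) * n  # drop the out-of-plane component
    p, m = project(planned), project(measured)
    p, m = p / np.linalg.norm(p), m / np.linalg.norm(m)
    return float(np.degrees(np.arccos(np.clip(np.dot(p, m), -1.0, 1.0))))

# Example with an assumed axis convention (X: left-right, Y: anterior-posterior, Z: head-foot):
sagittal_error = in_plane_angle_deg(np.array([0.0, -1.0, 0.2]),
                                    np.array([0.1, -1.0, 0.3]),
                                    np.array([1.0, 0.0, 0.0]))  # sagittal plane normal along X
```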
  • the patient’s scan on the display (display 118) of the imaging device (102A or 102B) may be moved to a display of real-time images of the patient (display 122) and shown as a split screen. That is, one display (display 118 or display 122) may show a split screen with the patient’s scan image on one side and the real-time images of the patient on the other side.
  • the holder device (holder device 114) may interact with imaging device (imaging device 102A or 102B) interventional software using a needle trajectory guide and sensor as a workflow control device.
  • the integration between the holder device (holder device 114) and the imaging device (imaging device 102A or 102B) interventional software may be enhanced by adding navigation features to the holder device (holder device 114). This enables the user or radiologist to control screens of the displays, select the needle, etc.
  • the holder device (holder device 114) may include command buttons to control the imaging device interventional software and may include navigation buttons to navigate the imaging device interventional software.
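  • A minimal sketch of how such command and navigation buttons might be routed to the interventional software (the command names, button identifiers, and send_to_imaging_system() transport are assumptions for illustration, not a documented protocol):

```python
from enum import Enum, auto

class HolderCommand(Enum):
    NEXT_SCREEN = auto()
    PREVIOUS_SCREEN = auto()
    SELECT_NEEDLE = auto()
    REQUEST_RESCAN = auto()

def handle_button_press(button_id: str, send_to_imaging_system) -> None:
    """Map a physical button on the holder device to an interventional-software command."""
    mapping = {
        "nav_right": HolderCommand.NEXT_SCREEN,
        "nav_left": HolderCommand.PREVIOUS_SCREEN,
        "command": HolderCommand.SELECT_NEEDLE,
    }
    command = mapping.get(button_id)
    if command is not None:
        send_to_imaging_system(command)  # e.g. forwarded over the docking-station link
```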
  • Fig. 3 illustrates a display of an image of a patient’s scan, an orientation and direction of a holder device, and a planned trajectory.
  • a patient’s imaging scan is shown at 302.
  • a planned trajectory is shown at 304, and a real-time image of a needle (hand-held apparatus 126) inserted into the patient is shown at 306.
  • Integration between the holder device (holder device 114) and interventional software of an imaging device (imaging device 102A or 102B) allows a user to control screens or displays in an imaging room. The user has control of the orientation and direction of the holder device (holder device 114) and guides a needle or the like (hand-held apparatus 126) held in the holder device (holder device 114) along a planned trajectory to the desired end point.
  • Fig. 4A is a diagram of a holder device and docking station, where the holder device is physically detached from the docking station.
  • a holder device 404 may be removed from the docking station 402 when performing an interventional radiology procedure.
  • Fig. 4A shows holder device 404 with the needle clamp in the open position.
  • Docking station 402 is configured to communicate with an interventional radiology imaging device or imaging system via communication link 424 (shown in Fig. 4B), and holder device 404 is configured to communicate with docking station 402 via communication link 422.
  • Holder device 404 is configured with an internal battery (not shown) providing power to holder device 404.
  • Communication links 422 and 424 may be wired or wireless communication links.
  • a wireless communication link may be preferred to allow a user to manipulate holder device 404 without interference from any wires.
  • Holder device 404 is configured to communicate with the interventional radiology imaging device/system via docking station 402.
  • Docking station 402 may include a holder device locator control 410, LEDs 412, battery charge indicator 414, and charging port 416.
  • Holder device locator control 410 may be a push button or similar on/off control, and may be used to initiate a sounding beacon to aid in locating the holder device 404.
  • locator control 410 may be used to control a visual display to aid in locating the holder device 404.
  • Locator control 410 may be activated to aid the user in locating holder device 404.
  • LEDs 412 may be used to indicate power on or off, indicate whether communication link 422 is active or not active, indicate the type of connection between holder device 404 and docking station 402, indicate the type of connection between docking station 402 and an imaging system or device (imaging device 102A or 102B), or similar information. While Figs. 4A-4D show three LEDs, it should be appreciated that more or fewer LEDs may be used without deviating from the scope of the disclosure.
  • Battery charge indicator 414 may provide a visual display of the internal battery status for holder device 404.
  • Charging port 416 may be a coupling port for holder device 404. When holder device 404 is in a closed position and placed in docking station 402, charging port 416 may be used to couple holder device 404 with docking station 402 enabling power transfer from docking station 402 to the internal battery of holder device 404 thereby charging the internal battery of holder device 404.
  • holder device locator control 410, LEDs 412, and battery charge indicator 414 are optional elements that may be implemented alone or in any combination without deviating from the scope of the disclosure.
  • Fig. 4B is a diagram of docking station 402 with holder device 404 in the closed position and placed or seated in docking station 402. Power transfer from docking station 402 to holder device 404 for charging the internal battery of holder device 404 may be performed while holder device 404 is seated in docking station 402.
  • Fig. 4B shows the holder device 404 with the needle clamp in the closed position. In the closed position, it can be seen that holder device 404 includes a display 406 and a command button 408.
  • Fig. 4C is a diagram of a holder device 404 in an open position and inserted in a sterile bag.
  • the illustration shows a left hand side 404A of the holder device and a right hand side 404B of the holder device.
  • the holder device 404 may be placed in a sterile bag 418 to avoid direct contact between the holder device 404 and needle 420 (hand-held apparatus).
  • needle 420 is placed in the open clasp at 404A after the holder device (404A and 404B) is inserted in the sterile bag.
  • Placing the holder device in a sterile bag 418 ensures that there is no direct contact between the needle 420 and the holder device 404.
  • the holder device is placed in the sterile bag 418, and needle 420 is placed in the open clasp 404A.
  • the right hand side 404B may then be rotated toward the left hand side 404A to secure needle 420 in the holder device.
  • Needle 420 may be placed in the right hand side 404B, and the left hand side 404A may be rotated toward the right hand side 404B to secure needle 420.
  • Holder device 404 may be adjusted or exchanged to accommodate a different size of needle 420 or related instruments, such as a probe, a guide wire, a drainage catheter, and the like.
  • Fig. 4D is a diagram showing holder device 404 sealed in sterile bag 418 and in the closed position securing needle 420 in the clasp.
  • One side of the holder device includes display 406 and command button 408.
  • display 406 and command button 408 may reside on the opposite side of the clasp 404B.
  • display 406 and command button 408 may be located on either side, 404A or 404B. Thus, in the closed position a user will have access to the display 406 and command button 408.
  • Fig. 5 is a flowchart of an example process for performing an interventional procedure guided by an imaging device.
  • a trajectory of a handheld apparatus (needle or the like) to be inserted into a patient is planned at 502.
  • an insertion point for the handheld apparatus guided by an imaging device display of real-time images of the patient generated by a camera is marked.
  • the handheld apparatus is placed in a holder device at 506.
  • the holder device is configured to communicate with a radiology imaging system via a docking station.
  • the patient’s images and an orientation and direction of the handheld apparatus relative to the planned trajectory, as measured by a three-dimensional (3D) angle sensor in the holder device, are shown on a display of the imaging device.
  • an imaging device or imaging system has a display in the interventional radiology imaging room, and shown on the display is an augmented clinical image showing an orientation and direction of the handheld apparatus relative to the planned trajectory, as measured by a three-dimensional (3D) angle sensor in the holder device.
  • the handheld apparatus is inserted into the patient according to the marked insertion point, the planned trajectory, and the 3D angle of the holder device.
  • the imaging device display of the orientation and direction of the handheld apparatus relative to the planned trajectory is continuously updated at 512, and a real-time visual reference to guide the handheld apparatus along the planned trajectory to a desired location is provided at 514.
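  • For orientation only, the flow of Fig. 5 (steps 502-514) could be sketched as the sequence below; every step function and the holder object’s methods are hypothetical placeholders standing in for the operations described in the text.

```python
def interventional_procedure(plan_trajectory, mark_insertion_point, dock_holder,
                             show_plan_on_display, insert_apparatus, update_display):
    trajectory = plan_trajectory()             # 502: plan the handheld-apparatus trajectory
    entry_point = mark_insertion_point()       # 504: mark the entry point on the camera display
    holder = dock_holder()                     # 506: place the apparatus in the holder device
    show_plan_on_display(trajectory, holder)   # 508: show orientation/direction versus the plan
    insert_apparatus(entry_point, trajectory)  # 510: insert at the marked point along the plan
    while not holder.at_target():              # 512/514: continuous update and visual reference
        update_display(holder.orientation(), trajectory)
```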
  • Operations like acquiring, determining, obtaining, outputting, providing, storing, calculating, simulating, receiving, warning, and stopping can be implemented as program code means of a computer program and/or as dedicated hardware.
  • a computer program may be stored and/or distributed on a suitable medium, such as an optical storage medium or a solid-state medium, supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Surgery (AREA)
  • Public Health (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Physics & Mathematics (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Optics & Photonics (AREA)
  • Biophysics (AREA)
  • Human Computer Interaction (AREA)
  • Pulmonology (AREA)
  • Theoretical Computer Science (AREA)
  • Robotics (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

A method and apparatus for performing an interventional radiology procedure are disclosed. Once a planned trajectory is determined, an insertion point for a handheld apparatus is marked on an imaging device displaying real-time images of the patient, and the handheld apparatus is placed in a holder device, the holder communicating with a radiology imaging system via a docking station. The patient's images and an orientation and direction of the handheld apparatus are presented on the imaging device display in conjunction with a three-dimensional (3D) angle sensor in the holder device. The handheld apparatus is inserted into the patient at a corresponding insertion point marked on the patient's skin. A display showing the orientation and direction of the handheld apparatus relative to the planned trajectory is continuously updated, providing a real-time visual reference to guide the user while performing the interventional radiology procedure.
PCT/EP2024/052829 2023-02-10 2024-02-06 Solution de guidage pour les procédures médicales interventionnelles Ceased WO2024165521A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363444580P 2023-02-10 2023-02-10
US63/444,580 2023-02-10

Publications (1)

Publication Number Publication Date
WO2024165521A1 true WO2024165521A1 (fr) 2024-08-15

Family

ID=89898206

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2024/052829 Ceased WO2024165521A1 (fr) 2023-02-10 2024-02-06 Solution de guidage pour les procédures médicales interventionnelles

Country Status (1)

Country Link
WO (1) WO2024165521A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1103229A2 (fr) * 1999-11-26 2001-05-30 Marconi Medical Systems Finland Inc. Système et méthode utilisant des dispositifs d'imagerie pour simplifier la planification des procédures chirurgicales
US20050203384A1 (en) * 2002-06-21 2005-09-15 Marwan Sati Computer assisted system and method for minimal invasive hip, uni knee and total knee replacement
US20230045275A1 (en) * 2021-08-05 2023-02-09 GE Precision Healthcare LLC Methods and system for guided device insertion during medical imaging

Similar Documents

Publication Publication Date Title
US20200375663A1 (en) Computed tomography system
US11527002B2 (en) Registration of an image with a tracking system
JP4822634B2 (ja) 対象物の案内のための座標変換を求める方法
US10674891B2 (en) Method for assisting navigation of an endoscopic device
US6996430B1 (en) Method and system for displaying cross-sectional images of a body
RU2634296C2 (ru) Устройство для определения положения
US20090292201A1 (en) Coordinate system registration
US20220323164A1 (en) Method For Stylus And Hand Gesture Based Image Guided Surgery
CN110584782B (zh) 医学图像处理方法、装置、医学系统、计算机及存储介质
EP3175769A2 (fr) Ajout d'un capteur de poursuite à un outil rigide
EP3673854B1 (fr) Correction d'examens médicaux
EP3675039A1 (fr) Intégration d'imagerie médicale et de suivi d'emplacement
WO2024165521A1 (fr) Solution de guidage pour les procédures médicales interventionnelles
US20190142521A1 (en) Calibration of a Rigid ENT Tool
CN114533267A (zh) 一种2d图像手术定位导航系统及方法
US11132830B2 (en) Static virtual camera positioning
KR20000011134A (ko) 정위 수술 방법 및 장치
EP4611640B1 (fr) Système d'imagerie médicale
US20250339969A1 (en) Robotic calibration
WO2025027502A1 (fr) Système et procédé d'enregistrement de patient
CN117379177A (zh) 骨钉连接件、外科手术导航装置、方法及处理设备
CN115317098A (zh) 放射性粒子的植入控制方法、装置、电子设备和存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24704108

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE