
CN119499562A - Method and system for label-guided image positioning - Google Patents


Info

Publication number
CN119499562A
CN119499562A (application CN202311073708.5A)
Authority
CN
China
Prior art keywords
tag; medical device; image; coordinate; space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311073708.5A
Other languages
Chinese (zh)
Inventor
黄伟伦
曾永信
陈韦霖
蔡惠予
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN202311073708.5A
Publication of CN119499562A
Legal status: Pending

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B 6/00 — Apparatus or devices for radiation diagnosis; apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/12 — Arrangements for detecting or locating foreign bodies
    • A61N 5/00 — Radiation therapy
    • A61N 5/10 — X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
    • A61N 5/103 — Treatment planning systems
    • A61N 5/1039 — Treatment planning systems using functional images, e.g. PET or MRI
    • A61N 5/1048 — Monitoring, verifying, controlling systems and methods
    • A61N 5/1049 — Monitoring, verifying, controlling systems and methods for verifying the position of the patient with respect to the radiation beam
    • A61N 2005/1092 — Details
    • A61N 2005/1097 — Means for immobilizing the patient

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Optics & Photonics (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

A label-guided image positioning method comprises: defining a three-dimensional space coordinate system from the tag spatial position information obtained by recognizing a reference image of a patient part fitted with at least one reference tag, together with position and orientation data of a reference point and a pointing direction of a radiation medical device; estimating a target coordinate representing the position of a target point in that coordinate system from a reference coordinate representing the reference tag position and from a three-dimensional medical image of the part, in which the target point and a reference mark corresponding to the reference tag position are marked; and judging whether the distance between the target coordinate and a device coordinate representing the reference point of the medical device, and the pointing direction of the medical device, respectively agree with the distance and the incidence direction specified in the radiation treatment plan for the part, then outputting a positioning result as a basis for adjusting the positioning.

Description

Method and system for label-guided image positioning
Technical Field
The present disclosure relates to image positioning, and more particularly to a method and system for label-guided image positioning.
Background
Because neutrons, unlike protons, carry no charge and therefore cannot be steered by an electric field, the physical mechanism of existing boron neutron capture therapy (Boron Neutron Capture Therapy, hereinafter BNCT) systems makes a rotating-beam design difficult to realize. Moreover, for a patient receiving BNCT, the irradiation angle and irradiation field size of the neutron beam are determined by the evaluation of the treatment planning system. Therefore, before actual BNCT, the patient must undergo simulated positioning to ensure that the irradiation conditions match those required by the evaluation.
However, in existing manual positioning methods, an operator such as a medical physicist typically positions the affected part of the patient, such as a tumor, within the beam by eye. During positioning, the positions of the affected part and of features on the patient's body surface must first be calibrated in a three-dimensional computed tomography (Computerized Tomography, CT) or positron emission tomography (Positron Emission Tomography, PET) image; the affected part is then positioned according to the distances between it and those features in the image, using standard measuring tools and visual judgment of the quantified distances. Because the positioning takes place in three-dimensional space, it can be completed only after repeated judgment and confirmation in different dimensions. The quantitative accuracy of such visual determination varies with the operator's eyesight, and the whole positioning process can take up to an hour, increasing the patient's physical and mental burden.
Therefore, how to develop a fast and precise image positioning method for BNCT or other radiotherapy has become one of the issues to be resolved in the related art.
Disclosure of Invention
An object of an embodiment of the present disclosure is to provide a method and a system for positioning a label guide image, which overcome at least one of the drawbacks of the prior art.
An embodiment of the present disclosure provides a label-guided image positioning method for establishing and/or comparing the spatial relationship between a specific part of a patient and a medical device, executed by a computer system. The label-guided image positioning method includes the following steps:
(1) Receiving at least one reference image obtained by an image capturing device capturing at least one reference tag whose position is related to the specific part;
(2) Identifying the at least one reference tag contained in the at least one reference image using image recognition, and obtaining tag spatial position information of the at least one reference tag relative to the image capturing device from the spatial positional relationship of the identified tag relative to the image capturing device;
(3) Performing position calculation for the medical device and for at least one target point according to the at least one reference tag position, so as to obtain a device coordinate and at least one target coordinate;
(4) Judging whether the at least one target coordinate is adjacent to the device coordinate, so as to obtain a judgment result; and
(4-1) Generating and outputting, according to the judgment result, a positioning result serving as the basis for whether to adjust the specific part or the medical device.
In some embodiments, step (3) comprises:
(3-1) defining a three-dimensional space coordinate system according to the tag spatial position information and to position and orientation data of a medical device reference point and a medical device direction in space, so as to obtain, in the three-dimensional space coordinate system, at least one reference coordinate representing the position of the at least one reference tag, a device coordinate representing the position of the medical device reference point, and a pointing direction representing the medical device direction;
(3-2) estimating at least one target coordinate representing the position of the at least one target point in the three-dimensional space coordinate system, based on the at least one reference coordinate and on the three-dimensional medical image of the specific part, in which a target mark representing the at least one target point and at least one reference mark, representing the position of the at least one reference tag relative to the specific part and distinguishable from the target mark, are marked.
In some embodiments, when the image capturing device is fixed relative to the medical device reference point and only one reference tag is attached to the specific part, the method further comprises, before step (3-1), a step (5) of acquiring the position and orientation data from outside, wherein the position and orientation data comprise displacement data of the medical device reference point relative to the image capturing device in space and direction data of the medical device direction relative to the image capturing device in space.
In some embodiments, when the image capturing device is movable relative to the medical device reference point, at least one reference label is attached to a specific portion, and a medical device label which corresponds in position to the medical device reference point and is formed with a unique identification pattern is further disposed in the space where the patient is located, wherein:
In step (1), at least one reference image received by the computer system is obtained by capturing at least one reference tag and a medical device tag by the image capturing device, and
In the step (2), the computer system further identifies a medical device tag included in the at least one reference image, and further obtains position and orientation data according to a positional relationship of the identified medical device tag in space relative to the image capturing device.
In some embodiments, the position and orientation data includes displacement data of the medical device reference point in space relative to the image capture device and orientation data of the medical device orientation in space relative to the image capture device.
In some embodiments, step (4) includes determining whether the estimated distance between the at least one target coordinate and the device coordinate in the three-dimensional space coordinate system, and the pointing direction, respectively correspond to the predetermined distance and the predetermined medical device direction included in the predetermined treatment plan, so as to obtain the determination result; and step (4-1) includes generating and outputting, according to the determination result, the positioning result as the basis for whether to adjust the positioning of the specific part.
In some embodiments, step (4) further includes, when the determination result indicates that the estimated distance does not coincide with the predetermined distance and/or the pointing direction does not coincide with the predetermined medical device direction, generating, by the computer system, a positioning result including distance difference data between the estimated distance and the predetermined distance and/or angle difference data between the pointing direction and the predetermined medical device direction.
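As a concrete illustration of the distance difference and angle difference data, the following Python/NumPy sketch compares an estimated setup against the predetermined treatment plan. The function name and the pitch/yaw conventions (pitch in the Y-Z plane, yaw in the X-Z plane, matching figs. 12 and 13) are illustrative assumptions, not part of the disclosure.

```python
import numpy as np

def positioning_differences(target, device, pointing, plan_distance, plan_dir):
    """Compare the estimated setup with the predetermined treatment plan.

    target, device: target coordinate and device coordinate (3-vectors).
    pointing, plan_dir: actual and predetermined beam directions (3-vectors).
    plan_distance: predetermined target-to-device distance.
    Returns (distance difference, pitch difference, yaw difference in degrees).
    """
    est_distance = np.linalg.norm(np.asarray(target, float) - np.asarray(device, float))
    dist_diff = est_distance - plan_distance
    # Pitch compared in the Y-Z plane, yaw compared in the X-Z plane.
    pitch_diff = np.degrees(np.arctan2(pointing[1], pointing[2])
                            - np.arctan2(plan_dir[1], plan_dir[2]))
    yaw_diff = np.degrees(np.arctan2(pointing[0], pointing[2])
                          - np.arctan2(plan_dir[0], plan_dir[2]))
    return dist_diff, pitch_diff, yaw_diff
```

When all three differences are within tolerance the positioning matches the plan; otherwise the differences can be displayed as adjustment hints, as the graphical interface described later does.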
In some embodiments, the at least one reference tag is formed with an exposed, uniquely identifiable pattern.
In some embodiments, the image capturing apparatus comprises at least two cameras, and before the step (1) or after the step (4), the method further comprises capturing at least one reference tag or a correction tag located in space by the at least two cameras to correct a position of the at least one reference coordinate or a correction coordinate of the correction tag in a three-dimensional space coordinate system.
Another embodiment of the present disclosure provides a label-guided image positioning system for establishing and/or comparing the spatial relationship between a specific part of a patient and a medical device. The system includes at least one reference tag disposed at a position related to the specific part, an image capturing device disposed in the space where the patient is located and configured to capture the at least one reference tag so as to obtain at least one reference image for positioning processing, a storage module storing a label-guided image positioning application, and a processor executing the application, wherein the application performs steps (1) to (4), or steps (1) to (4-1), described above.
In some embodiments, the image capturing device is fixedly arranged in a space where the patient is located relative to a reference point of the medical device, at least one reference tag is attached to a specific part, and the storage module further stores position and orientation data, wherein the position and orientation data comprises displacement data of the reference point of the medical device relative to the image capturing device in the space and direction data of the medical device relative to the image capturing device in the space.
In some embodiments, the system for positioning the tag-guided image further includes a display module, wherein the display module is configured with a graphical interface, and displays the distance difference data and the angle difference data via the graphical interface.
An advantage of the disclosure is that, by using the reference tags and the medical device tag, a three-dimensional space coordinate system can easily be defined, yielding the device coordinate representing the position of the device reference point, the pointing direction of the beam, and the reference coordinates representing the reference tag positions, which serve as positioning data for the specific part. Using the three-dimensional medical image of the specific part marked with the target mark and the reference marks, at least one target coordinate representing the position of at least one target point in that coordinate system can then be estimated easily and relatively accurately. A positioning result is generated and output according to whether the estimated distance between the at least one target coordinate and the device coordinate, and the pointing direction of the beam, respectively correspond to the predetermined distance and the predetermined beam incidence direction of the predetermined treatment plan for the specific part. This result serves as the basis for deciding whether to adjust the positioning of the specific part, so that positioning can be assisted before or during radiation treatment and quickly matched with the predetermined treatment plan.
Drawings
The various aspects of the disclosure are best understood when read in conjunction with the following detailed description. It should be noted that, in accordance with standard industry practice, the various features may not be drawn to scale; in fact, the dimensions of the various features may be arbitrarily increased or reduced for clarity of discussion. The foregoing and other objects, features, advantages and embodiments of the present disclosure will be apparent from the following description, in which:
fig. 1 is a block diagram showing the components of a tag guidance image positioning system according to an embodiment of the present disclosure.
FIG. 2 is a block diagram showing the components of a processor according to an embodiment of the present disclosure.
Fig. 3 is a schematic diagram of a plurality of reference labels and a medical device label and their respective locations of attachment according to an embodiment of the present disclosure.
Fig. 4 is a schematic diagram of a storage module storing CT images marked with a plurality of reference marks according to an embodiment of the disclosure.
Fig. 5 is a three-dimensional CT image of a target point planned by a predetermined treatment plan for a specific portion of a patient according to an embodiment of the present disclosure.
Fig. 6 is a schematic diagram of predetermined medical device incident directions planned by a predetermined treatment plan according to an embodiment of the present disclosure.
FIG. 7 is a schematic diagram of a simulation environment of an embodiment of the present disclosure.
FIG. 8 is a flow chart illustrating a positioning procedure performed by a processor according to an embodiment of the disclosure.
Fig. 9 is a schematic diagram showing the positional relationship of an identified tag with respect to a lens of the image capturing device in space according to an embodiment of the disclosure.
FIG. 10 is a schematic diagram showing the difference between the distance between the target point and the predetermined treatment point in the Y-axis direction and the Z-axis direction according to an embodiment of the present disclosure.
FIG. 11 is a schematic diagram showing the difference between the distance between the target point and the predetermined treatment point in the X-axis direction and the Z-axis direction according to an embodiment of the present disclosure.
FIG. 12 is a schematic diagram showing pitch angle differences in the Y-Z plane between the medical device direction and the predetermined medical device incidence direction according to an embodiment of the present disclosure.
FIG. 13 is a schematic diagram showing a yaw angle difference between a medical device direction and a predetermined medical device incident direction in an X-Z plane according to an embodiment of the present disclosure.
Wherein reference numerals are as follows:
100: label-guided image positioning system
1: reference tag
2: medical device tag
3: image capturing device
4: storage module
5: processor
51: image recognition module
52: positional relationship obtaining module
53: coordinate obtaining module
54: coordinate system establishing module
55: target coordinate estimation module
56: judgment module
57: positioning result generating module
6: display module
61: graphical interface
200: specific part
300: collimator
400: treatment couch
500: wall
600: mounting frame
700: notebook computer
CP: center point
MDD: medical device direction
MDC: medical device reference point
NV: normal vector
PA: pitch angle
PMDP: predetermined medical device direction
PTP: predetermined treatment point
S81 to S87: steps
TU: tumor
TTP: target point
YA: yaw angle
x, y, z: displacements
X, Y, Z: axes
Δx, Δy, Δz: distance differences
α, β, γ: inclination angles
φ: polar angle
θ: azimuth angle
Detailed Description
For the purposes of making the description of the present disclosure thorough and complete, the following illustrative descriptions of embodiments and examples are provided, but they are not the only forms in which the embodiments may be practiced or utilized. The embodiments disclosed below may be combined with or substituted for one another where advantageous, and further embodiments may be added to an embodiment without further description or illustration. In the following description, numerous specific details are set forth to provide a thorough understanding of the embodiments; however, the embodiments of the disclosure may be practiced without these specific details.
In addition, spatially relative terms, such as "lower," "upper," and the like, may be used for convenience in describing the relative relationship of one element or feature to another element or feature in the drawings. These spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. The device may be otherwise positioned (e.g., rotated 90 degrees or other orientations) and the spatially relative descriptors used herein interpreted accordingly.
In this document, the terms "a" and "an" may refer generally to one or more unless the context clearly dictates otherwise. It will be further understood that the terms "comprises," "comprising," "includes," and/or "having," when used herein, specify the presence of stated features, regions, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, regions, integers, steps, operations, elements, components, and/or groups thereof.
Referring to fig. 1, a label (tag) guided image positioning system 100 is illustratively shown for assisting in the positioning of a specific part 200 of a patient (e.g., the head) in accordance with a predetermined treatment plan for the specific part 200, before or during radiation treatment such as, but not limited to, BNCT. In the present embodiment, the label-guided image positioning system 100 includes, for example but not limited to, four reference tags 1, a medical device tag 2, an image capturing device 3, a storage module 4, a display module 6, and a processor 5.
In this embodiment, the reference labels 1 are, for example, directly attached to four distinct and easily distinguishable locations of the specific part 200 (e.g., the eyebrow, the nasal tip, the chin tip, and an ear tip, as shown in fig. 3). However, in other embodiments, the reference labels 1 may instead be attached to an extension piece (not shown) worn on the specific part 200, such that the attachment positions correspond to the eyebrow, the nasal tip, the chin tip, and an ear tip, respectively.
In the present embodiment, the medical device tag 2 is adapted to be disposed in the space where the patient is located, at a position corresponding to a medical device reference point MDC, for example the center (also referred to as the beam center, not shown) of a beam exit of a BNCT device (not shown). Each of the reference tags 1 and the medical device tag 2 is formed with a unique identification pattern on its exposed surface; for example, each reference tag 1 bears a different identification pattern to facilitate distinguishing them, as shown in fig. 3.
In the present embodiment, the image capturing device 3 is movably disposed, relative to the medical device reference point MDC, in the space where the patient is located; it may include one or more image capturing modules (not shown, such as a CCD), and captures multiple frames of reference images containing the reference tags 1 and the medical device tag 2 in each positioning process.
In this embodiment, the storage module 4 is configured to store a three-dimensional medical image of the specific part, such as a CT image or a magnetic resonance (MRI) image, and the predetermined treatment plan. The predetermined treatment plan includes, for example, a predetermined distance between the medical device reference point MDC and a target point TTP of the specific part (e.g., the center point of the tumor TU shown in fig. 5), and a predetermined medical device direction PMDP at which the neutron beam is incident on the target point TTP (e.g., the predetermined beam incidence direction defined by the polar angle φ and the azimuth angle θ shown in fig. 6). It should be noted that, in the present embodiment, the three-dimensional medical image must be marked with a target mark (marker) representing the target point TTP and a plurality of distinguishable reference marks representing the attachment positions of the reference tags 1; for example, the CT image shown in fig. 4 is marked only with four reference marks corresponding to the reference tags 1 in fig. 3.
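A predetermined incidence direction given as a polar angle φ and an azimuth angle θ can be converted to a unit direction vector for later comparison. The spherical convention below (φ measured from the Z axis, θ from the X axis in the X-Y plane) is an assumption, since the disclosure only names the two angles:

```python
import math

def direction_from_angles(phi_deg, theta_deg):
    """Unit beam-direction vector from polar angle phi and azimuth theta (degrees)."""
    phi = math.radians(phi_deg)
    theta = math.radians(theta_deg)
    return (math.sin(phi) * math.cos(theta),
            math.sin(phi) * math.sin(theta),
            math.cos(phi))
```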
The processor 5 is electrically connected to the image capturing device 3, the storage module 4 and the display module 6, and is configured to execute a positioning procedure according to a plurality of frame reference images received from the image capturing device 3 corresponding to the current positioning process, so as to generate a positioning result corresponding to the current positioning procedure. In the present embodiment, as shown in fig. 2, the processor 5 includes, for example, an image recognition module 51, a position relationship obtaining module 52, a coordinate obtaining module 53, a coordinate system establishing module 54, a target coordinate estimating module 55, a judging module 56, and a positioning result generating module 57, and the respective operations thereof will be described in detail below in conjunction with the following related description of the positioning procedure.
In this embodiment, the display module 6 is configured with a graphical interface 61 and is controlled by the processor 5, so that the display module 6 graphically displays the positioning result from the processor 5 via the graphical interface 61.
In this embodiment, the label-guided image positioning system 100 may be used with a collimator 300, for example in a simulation environment as shown in fig. 7. In the example of fig. 7, a dummy representing the patient is placed on a treatment couch 400 with adjustable position and angle. The collimator 300 is mounted on a wall 500 adjacent to the treatment couch 400, aligned with the beam exit of a BNCT device (not shown) that penetrates the wall 500, the center of the beam exit serving as the medical device reference point MDC. The image capturing device 3 is mounted on a movable mounting frame 600 at a position aligned with the medical device reference point MDC, so as to precisely obtain the medical device direction MDD (or beam direction) of the neutron beam generated by the BNCT device and emitted through the beam exit. Four reference labels 1 are attached respectively to the eyebrow, nasal tip, chin tip and an ear tip of the dummy's head; the medical device tag 2 is attached, for example, to a substrate of the collimator 300; and the storage module 4, the display module 6 and the processor 5 are implemented in a notebook computer 700.
Referring to fig. 1, 2 and 8, it is exemplarily described in detail how the processor 5 performs the positioning procedure based on the reference images obtained in the current positioning process. The positioning procedure comprises the following steps S81-S87.
First, in step S81, the image recognition module 51 of the processor 5 recognizes each reference tag 1 and the medical device tag 2 included in each frame of reference image by using an image recognition technology.
Then, in step S82, the positional relationship obtaining module 52 of the processor 5 obtains the positional relationship of each reference tag 1 and the medical device tag 2 in space with respect to the image capturing device 3, according to the position of (the lens of) the image capturing device 3 and the images of the identified tags. More specifically, referring to fig. 9, the positional relationship of any identified tag (a reference tag 1 or the medical device tag 2) with respect to the image capturing device 3 may be expressed as a vector (x, y, z, α, β, γ), comprising the displacements (x, y, z) of the center point CP of the tag with respect to the lens of the image capturing device 3 along the three axes (X, Y, Z) and the inclinations (α, β, γ) of the normal vector NV of the tag with respect to those three axes. In some embodiments, the tag is rectangular, so as long as the image capturing device 3 can capture any characteristic position on the tag (e.g., center point, upper left corner, upper right corner, lower left corner, lower right corner), the positional relationship between the image capturing device 3 and the tag can be determined.
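A fiducial-tag detector (for example, OpenCV's ArUco module) typically returns a rotation and a translation for each detected tag. The NumPy sketch below, an illustration rather than the patent's own implementation, compresses such a pose into the six-component vector (x, y, z, α, β, γ) described above; taking the tag's normal vector NV as the third column of the rotation matrix is an assumed convention.

```python
import numpy as np

def tag_pose_vector(R, t):
    """Build the (x, y, z, alpha, beta, gamma) vector for a detected tag.

    R: 3x3 rotation matrix of the tag frame expressed in camera coordinates.
    t: displacement of the tag's center point CP from the camera lens.
    alpha, beta, gamma: angles (degrees) between the tag's normal vector NV
    (assumed to be the third column of R) and the camera's X, Y, Z axes.
    """
    normal = R[:, 2]
    # Angle between NV and each camera axis, from the direction cosines.
    tilts = np.degrees(np.arccos(np.clip(normal, -1.0, 1.0)))
    return np.concatenate([np.asarray(t, float), tilts])
```

A tag facing the camera squarely (R equal to the identity) yields tilts of 90°, 90° and 0° with respect to the X, Y and Z axes.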
Next, in step S83, the coordinate acquiring module 53 of the processor 5 acquires the tag space position information of the reference tags 1 and the medical device tag 2 with respect to the image capturing device 3 according to the positional relationship acquired in step S82. In the present embodiment, the tag spatial position information obtained by the coordinate obtaining module 53 includes a plurality of three-dimensional tag coordinates corresponding to the position of the image capturing device 3 and representing the positions of the reference tags 1 and the medical device tag 2 (center point thereof), respectively.
Then, in step S84, the coordinate system establishing module 54 of the processor 5 defines a three-dimensional space coordinate system according to the tag space position information and the position and orientation data related to the medical device reference point MDC and the medical device direction MDD of the BNCT device in space, so as to obtain, in the three-dimensional space coordinate system, a plurality of reference coordinates representing the positions of the reference tags 1, a device coordinate representing the position of the medical device reference point MDC, and a pointing direction representing the medical device direction MDD. More specifically, in this embodiment, the coordinate system establishing module 54 first takes the three-dimensional tag coordinate representing the position of the medical device tag 2 as a reference point, obtains the two vectors from the reference point to any two reference coordinates representing the positions of two corresponding reference tags 1, and calculates the cross product of the two vectors, thereby obtaining the positional relationship of the two corresponding reference tags 1 relative to the medical device tag 2. Since the position of the medical device tag 2 and the medical device reference point MDC have a known spatial correspondence, the coordinate system establishing module 54 can directly use the positions of the medical device tag 2 and the medical device reference point MDC to define the three-dimensional space coordinate system with the position of the medical device reference point MDC as its origin (i.e., the device coordinate), and obtain through this coordinate system the plurality of reference coordinates representing the positions of the reference tags 1 and the pointing direction (e.g., the direction perpendicular to and projecting from the wall 500 of Fig. 7) representing the medical device direction MDD.
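The frame construction described for step S84 can be sketched as follows: two vectors from the medical device tag to two reference tags span a plane, and their cross (outer) product yields the plane's normal, which can serve as the pointing direction. This Python sketch uses hypothetical helper and function names and assumes the tag coordinates are already expressed relative to the camera:

```python
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def norm(a):
    m = sum(x * x for x in a) ** 0.5
    return tuple(x / m for x in a)

def device_frame(device_tag, ref_a, ref_b):
    """Build an orthonormal frame at the medical device tag from two
    reference tags: the cross product of the two tag vectors gives the
    frame's normal (usable as the pointing direction)."""
    va, vb = sub(ref_a, device_tag), sub(ref_b, device_tag)
    z_axis = norm(cross(va, vb))   # normal of the tag plane
    x_axis = norm(va)              # first in-plane axis
    y_axis = cross(z_axis, x_axis) # completes a right-handed frame
    return x_axis, y_axis, z_axis

def to_frame(p, origin, axes):
    """Express a camera-space point p in the device frame."""
    d = sub(p, origin)
    return tuple(dot(d, ax) for ax in axes)

# Device tag at the camera-space origin, reference tags on the X and Y axes:
axes = device_frame((0, 0, 0), (1, 0, 0), (0, 1, 0))
print(to_frame((2, 3, 4), (0, 0, 0), axes))  # -> (2.0, 3.0, 4.0)
```

The offset between the medical device tag and the medical device reference point MDC (known from the position and orientation data) would simply shift the origin passed to `to_frame`.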
It should be noted that, because the image capturing device 3 is movable in the present embodiment, the position of the medical device reference point MDC must be determined from the acquired position of the medical device tag 2 and the known correspondence between the two. In other embodiments, however, particularly where the image capturing device 3 is fixedly disposed and its positional relationship to the medical device reference point MDC is known, the medical device tag 2 may be omitted and the storage module 4 may simply store the position and orientation data in advance; in that case, the position and orientation data includes displacement data of the medical device reference point MDC in space relative to the image capturing device 3 and direction data of the medical device direction MDD in space relative to the image capturing device 3. The image recognition module 51 then need only recognize the reference tags 1 in step S81, the positional relationship obtaining module 52 need only obtain the positional relationships of the reference tags 1 relative to the image capturing device 3 in step S82, and the coordinate acquiring module 53 need only obtain the three-dimensional tag coordinates representing the position of each reference tag 1 in step S83, whereupon the coordinate system establishing module 54 may, in step S84, directly define the three-dimensional space coordinate system, with the medical device reference point MDC as its origin, from those three-dimensional tag coordinates and the position and orientation data stored in the storage module 4.
Next, in step S85, the target coordinate estimation module 55 of the processor 5 estimates a target coordinate representing the position of the target point in the three-dimensional space coordinate system according to the three-dimensional medical image stored in the storage module 4 and the reference coordinates in the three-dimensional space coordinate system. More specifically, the target coordinate estimation module 55 may obtain a mark position relationship between the target mark and each reference mark according to the three-dimensional mark coordinates, in the medical image coordinate system of the three-dimensional medical image, of the target mark representing the target point and of the plurality of reference marks respectively representing the attaching positions of the reference tags 1, and then estimate the target coordinate in the three-dimensional space coordinate system according to the obtained mark position relationships and the reference coordinates in the three-dimensional space coordinate system.
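A minimal sketch of the estimation in step S85, under the simplifying assumption that the medical image coordinate system and the three-dimensional space coordinate system already share the same orientation (a real system would first resolve the rotation, e.g. by rigid registration): the target coordinate is each reference coordinate shifted by the corresponding mark-to-target offset from the medical image, averaged over the reference marks. The function name is hypothetical.

```python
def estimate_target(ref_marks_img, target_mark_img, ref_coords_space):
    """Estimate the target coordinate in the space coordinate system
    from the target mark and reference marks of the three-dimensional
    medical image and the reference coordinates measured in space.
    Assumes both coordinate systems share the same orientation."""
    estimates = []
    for mark, coord in zip(ref_marks_img, ref_coords_space):
        # offset of the target from this reference mark, in image space
        offset = tuple(t - m for t, m in zip(target_mark_img, mark))
        estimates.append(tuple(c + o for c, o in zip(coord, offset)))
    n = len(estimates)
    # average the per-mark estimates to damp measurement noise
    return tuple(sum(e[i] for e in estimates) / n for i in range(3))

# Two reference marks; the space frame is the image frame shifted +10 mm in X:
ref_img = [(0, 0, 0), (10, 0, 0)]
ref_space = [(10, 0, 0), (20, 0, 0)]
print(estimate_target(ref_img, (5, 5, 0), ref_space))  # -> (15.0, 5.0, 0.0)
```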
Then, in step S86, the determining module 56 of the processor 5 determines whether the estimated distance between the target coordinate and the device coordinate in the three-dimensional space coordinate system and the pointing direction respectively match the predetermined distance and the predetermined medical device pointing direction PMDP stored in the storage module 4, so as to generate a determination result. In this embodiment, the determination result may indicate that the estimated distance and the pointing direction respectively match the predetermined distance and the predetermined medical device pointing direction PMDP, or that the estimated distance does not match the predetermined distance and/or the pointing direction does not match the predetermined medical device pointing direction PMDP.
Finally, in step S87, the positioning result generating module 57 of the processor 5 generates a positioning result according to the determination result and outputs it to the display module 6, so that the display module 6 displays the positioning result via the graphical interface 61 as a basis for deciding whether to adjust the positioning of the specific portion. More specifically, when the determination result indicates that the estimated distance does not match the predetermined distance and/or the pointing direction does not match the predetermined medical device pointing direction PMDP, the positioning result generated by the positioning result generating module 57 includes distance difference data between the estimated distance and the predetermined distance and/or angle difference data between the pointing direction and the predetermined medical device pointing direction PMDP. For example, referring to Figs. 10 and 11, the distance difference data may include a distance difference Δy (see Fig. 10) between the target point TTP and a predetermined treatment point PTP in the Y-axis direction (vertical direction), a distance difference Δz (see Figs. 10 and 11) in the Z-axis direction (the medical device exit direction), and a distance difference Δx (see Fig. 11) in the X-axis direction (horizontal direction); and, referring to Figs. 12 and 13, the angle difference data may include a pitch angle difference PA between the medical device direction MDD and the predetermined medical device pointing direction PMDP in the Y-Z plane and a yaw angle difference YA in the X-Z plane.
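The per-axis distance differences and the pitch/yaw angle differences can be sketched as below; the decomposition into a Y-Z-plane pitch and an X-Z-plane yaw follows the description of Figs. 12 and 13, while the function name and the direction-vector convention are assumptions:

```python
import math

def positioning_differences(target, device, mdd, pmdp):
    """Per-axis distance differences between the target and device
    coordinates, plus pitch (Y-Z plane) and yaw (X-Z plane) angle
    differences in degrees between the medical device direction MDD
    and the predetermined direction PMDP."""
    dx, dy, dz = (t - d for t, d in zip(target, device))
    def pitch(v):  # angle measured in the Y-Z plane
        return math.degrees(math.atan2(v[1], v[2]))
    def yaw(v):    # angle measured in the X-Z plane
        return math.degrees(math.atan2(v[0], v[2]))
    return (dx, dy, dz), pitch(mdd) - pitch(pmdp), yaw(mdd) - yaw(pmdp)

# Target 5 mm above the predetermined point; device tilted 10 degrees in pitch:
d, pa, ya = positioning_differences(
    target=(0, 5, 100), device=(0, 0, 100),
    mdd=(0, math.sin(math.radians(10)), math.cos(math.radians(10))),
    pmdp=(0, 0, 1))
print(d, round(pa, 1), round(ya, 1))  # -> (0, 5, 0) 10.0 0.0
```

Positioning matches the plan when all five quantities are (within tolerance) zero, which is exactly the condition the graphical interface presents.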
Following the above example, since the positioning result includes the three-dimensional distance differences (i.e., Δx, Δy, Δz) and the pitch and yaw angle differences, the graphical interface 61 of the display module 6 can be programmed to present the positioning result not only as numerical values but also as a specific pattern paired with a corresponding numerical table, so as to display more clearly how the current positioning differs from the placement of the predetermined treatment plan.
If the three-dimensional distance differences (i.e., Δx, Δy, Δz) and the pitch and yaw angle differences displayed by the graphical interface 61 are all zero, the positioning of the specific portion 200 conforms to the predetermined treatment plan. Otherwise, the relevant personnel may control the treatment couch 400 of Fig. 7 according to the content of the positioning result to adjust the specific portion 200. After the positioning of the specific portion 200 has been adjusted, the processor 5 again receives the multi-frame reference images of the reference tags 1 and the medical device tag 2 captured by the image capturing device 3 and executes the positioning procedure again; by repeating the positioning procedure in this way, a positioning that conforms to the predetermined treatment plan can be achieved relatively accurately and quickly.
In summary, by using the reference tags 1 and the medical device tag 2, the processor 5 can easily define the three-dimensional space coordinate system and obtain, as positioning data of the specific portion in that coordinate system, the device coordinate representing the position of the medical device reference point MDC, the pointing direction representing the medical device direction MDD, and the reference coordinates representing the positions of the reference tags. Using the three-dimensional medical image of the specific portion stored in the storage module 4 and marked with the target mark and the reference marks, the processor 5 can also easily and relatively accurately estimate the target coordinate representing the position of the target point TTP in the three-dimensional space coordinate system. The processor 5 then determines whether the estimated distance between the target coordinate and the device coordinate and the pointing direction respectively match the predetermined distance and the predetermined medical device pointing direction PMDP of the predetermined treatment plan for the specific portion, and generates and outputs a positioning result as the basis for deciding whether to adjust the positioning of the specific portion, thereby helping the specific portion to be quickly positioned in conformity with the predetermined treatment plan before or during radiotherapy. Therefore, the label-guided image positioning system 100 of the present disclosure indeed achieves the objectives of the present disclosure.
In some embodiments, the present disclosure provides a calibration approach, which may be performed before or after the positioning method. The image capturing device comprises at least two cameras, and the at least two cameras capture at least one reference tag 1 or a calibration tag located in the space, so as to calibrate the position of at least one reference coordinate, or of a calibration coordinate of the calibration tag, in the three-dimensional space coordinate system. Specifically, when a plurality of cameras (for example, two cameras) is available, a calibration tag can be used for calibration. The calibration tag may be any of the tags described above (for example, the reference tag 1 or the medical device tag 2); as long as a plurality of cameras jointly capture the same tag, it can serve for calibration.
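One common way for two cameras that both see the same tag to cross-check its position is to intersect the two viewing rays; since the rays rarely intersect exactly, the midpoint of their shortest connecting segment is taken as the corrected coordinate. The sketch below assumes the ray origins and directions have already been derived from each camera's pose, and is only one possible realization of the calibration described here:

```python
def closest_point_two_rays(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two viewing rays
    (origin o, direction d): a simple triangulation that two cameras
    observing the same tag can use to correct its 3D coordinate."""
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    w = sub(o1, o2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    den = a * c - b * b                # zero when the rays are parallel
    s = (b * e - c * d) / den          # parameter along ray 1
    t = (a * e - b * d) / den          # parameter along ray 2
    p1 = tuple(o + s * v for o, v in zip(o1, d1))
    p2 = tuple(o + t * v for o, v in zip(o2, d2))
    return tuple((x + y) / 2 for x, y in zip(p1, p2))

# Two cameras 200 mm apart, both looking at a tag at (0, 0, 500):
print(closest_point_two_rays((-100, 0, 0), (100, 0, 500),
                             (100, 0, 0), (-100, 0, 500)))
# -> (0.0, 0.0, 500.0)
```

The distance between `p1` and `p2` also serves as a consistency metric: a large gap indicates a calibration or detection error for that tag.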
In some embodiments, in addition to the positioning applications for boron neutron capture therapy described above, the present disclosure may be applied to other positioning applications in radiation therapy.
In some embodiments, the present disclosure may also attach the medical device tag 2 to a surgical instrument. The difference from the BNCT case is that the system determines whether the target coordinate on the patient in the three-dimensional space coordinate system is adjacent to the device coordinate of the medical device tag 2; in other words, it is not necessary to determine conformity with the predetermined distance and the predetermined medical device pointing direction PMDP. Specifically, with the medical device tag 2 provided on a scalpel, the scalpel can be positioned relative to the patient's target coordinate (for example, a tumor position). This positioning can be performed continuously and dynamically, so that the relative positions of the surgical instrument and the tumor can be monitored and adjusted at any time.
In some embodiments, the present disclosure may also attach the medical device tag 2 to a biopsy sampling needle cannula. As with the surgical instrument case, the system determines whether the target coordinate on the patient in the three-dimensional space coordinate system is adjacent to the device coordinate of the medical device tag 2, without determining conformity with the predetermined distance and the predetermined medical device pointing direction PMDP. Specifically, with the medical device tag 2 provided on the needle cannula, the needle cannula can be positioned relative to the patient's target coordinate (for example, a tumor position).
In some embodiments, the present disclosure may also attach the medical device tag 2 to a catheter in minimally invasive surgery. As above, the system determines whether the target coordinate on the patient in the three-dimensional space coordinate system is adjacent to the device coordinate of the medical device tag 2, without determining conformity with the predetermined distance and the predetermined medical device pointing direction PMDP. Specifically, with the medical device tag 2 provided on the catheter, the catheter can be positioned relative to the patient's target coordinate (for example, a tumor position).
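For these instrument-tracking variants, the judgment of step (4) reduces to a proximity test between the instrument tag's device coordinate and the target coordinate. A sketch, with a hypothetical tolerance value:

```python
def is_adjacent(target_coord, device_coord, tol_mm=5.0):
    """For instrument-tracking uses (scalpel, biopsy needle, catheter),
    judge whether the instrument tag's device coordinate lies within a
    tolerance of the target coordinate in the space coordinate system."""
    d2 = sum((t - d) ** 2 for t, d in zip(target_coord, device_coord))
    return d2 ** 0.5 <= tol_mm       # Euclidean distance vs. tolerance

# Needle-tip tag 3 mm from a tumor target: adjacent at a 5 mm tolerance.
print(is_adjacent((10, 20, 30), (10, 23, 30)))  # -> True
print(is_adjacent((10, 20, 30), (10, 30, 30)))  # -> False
```

Running this check on every frame gives the continuous, dynamic positioning described above.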
While the present disclosure has been described with reference to the embodiments, it should be understood that the disclosure is not limited thereto and may be variously modified and adapted by those skilled in the art without departing from its spirit and scope; the scope of the present disclosure is therefore defined by the appended claims.

Claims (19)

1. A method for label-guided image positioning, for establishing and/or comparing a relationship between a specific portion of a patient and a medical device in a space, the method being executed by a computer system and comprising the steps of:
(1) Receiving at least one reference image obtained by an image capturing device capturing at least one reference tag positionally related to the specific portion;
(2) Identifying the at least one reference tag contained in the at least one reference image using image recognition technology, and obtaining tag space position information of the at least one reference tag relative to the image capturing device according to the identified positional relationship of the at least one reference tag relative to the image capturing device in the space;
(3) Performing a position calculation of the medical device and a position calculation of at least one target point according to the position of the at least one reference tag, so as to obtain a device coordinate and at least one target coordinate; and
(4) Determining whether the at least one target coordinate is adjacent to the device coordinate, so as to obtain a determination result.
2. The method of claim 1, wherein step (3) comprises:
(3-1) defining a three-dimensional space coordinate system according to the tag space position information and position and orientation data of a medical device reference point of the medical device and of a medical device direction in the space, so as to obtain, in the three-dimensional space coordinate system, at least one reference coordinate representing a position of the at least one reference tag, the device coordinate representing a position of the medical device reference point, and a pointing direction representing the medical device direction; and
(3-2) Estimating the at least one target coordinate, which represents the position of the at least one target point in the three-dimensional space coordinate system, according to a three-dimensional medical image of the specific portion and the at least one reference coordinate in the three-dimensional space coordinate system, the three-dimensional medical image being marked with a target mark representing the at least one target point and at least one reference mark representing a position of the at least one reference tag relative to the specific portion.
3. The method of claim 2, wherein, when the image capturing device is fixed relative to the medical device reference point and only one reference tag is attached to the specific portion, the method further comprises, before step (3-1), a step (5) of obtaining the position and orientation data from outside, the position and orientation data including displacement data of the medical device reference point in the space relative to the image capturing device and direction data of the medical device direction in the space relative to the image capturing device.
4. The method of claim 2, wherein the specific portion is attached with the at least one reference tag when the image capturing device is movable relative to the medical device reference point, and a medical device tag is disposed in the space where the patient is located, the medical device tag corresponding in position to the medical device reference point and forming a unique identification pattern, wherein:
in step (1), the at least one reference image received by the computer system is obtained by capturing the at least one reference tag and the medical device tag by the image capturing device, and
In the step (2), the computer system further identifies the medical equipment tag contained in the at least one reference image, and further obtains the position and orientation data according to the identified positional relationship of the medical equipment tag in the space relative to the image capturing device.
5. The method of claim 4, wherein the position and orientation data includes a displacement data of the medical device reference point in the space relative to the image capturing device and a direction data of the medical device direction in the space relative to the image capturing device.
6. The method of claim 2, wherein step (4) comprises determining whether an estimated distance and the pointing direction of the at least one target coordinate and the device coordinate in the three-dimensional space coordinate system respectively match a predetermined distance and a predetermined medical device pointing direction contained in a predetermined treatment plan, so as to obtain the determination result.
7. The method of claim 6, wherein in the step (4), when the determination result indicates that the estimated distance does not match the predetermined distance and/or the pointing direction does not match the pointing direction of the predetermined medical device, the positioning result generated by the computer system includes distance difference data between the estimated distance and the predetermined distance and/or angle difference data between the pointing direction and the pointing direction of the predetermined medical device.
8. The method of claim 1, wherein the at least one reference tag is formed with an exposed and unique identification pattern.
9. The method of claim 2, wherein the image capturing device comprises at least two cameras, and wherein, before step (1) or after step (4), the method further comprises the step of the at least two cameras capturing the at least one reference tag or a calibration tag located in the space, so as to calibrate the position of the at least one reference coordinate, or of a calibration coordinate of the calibration tag, in the three-dimensional space coordinate system.
10. A system for label-guided image positioning, for establishing and/or comparing a relationship between a specific portion of a patient and a medical device in a space, the system comprising:
At least one reference tag positionally related to the specific portion;
An image capturing device disposed in the space where the patient is located and configured to capture the at least one reference tag so as to obtain at least one reference image for the positioning process;
A storage module storing a guided image positioning application; and
A processor for executing the guided image positioning application, the guided image positioning application comprising the operations of:
(1) Receiving the at least one reference image obtained by the image capturing device capturing the at least one reference tag positionally related to the specific portion;
(2) Identifying the at least one reference tag contained in the at least one reference image using image recognition technology, and obtaining tag space position information of the at least one reference tag relative to the image capturing device according to the identified positional relationship of the at least one reference tag relative to the image capturing device in the space;
(3) Performing a position calculation of the medical device and a position calculation of at least one target point according to the position of the at least one reference tag, so as to obtain a device coordinate and at least one target coordinate; and
(4) Determining whether the at least one target coordinate is adjacent to the device coordinate, so as to obtain a determination result.
11. The system for tag-guided image positioning of claim 10, wherein step (3) comprises:
(3-1) defining a three-dimensional space coordinate system according to the tag space position information and position and orientation data of a medical device reference point of the medical device and of a medical device direction in the space, so as to obtain, in the three-dimensional space coordinate system, at least one reference coordinate representing a position of the at least one reference tag, the device coordinate representing a position of the medical device reference point, and a pointing direction representing the medical device direction; and
(3-2) Estimating the at least one target coordinate, which represents the position of the at least one target point in the three-dimensional space coordinate system, according to a three-dimensional medical image of the specific portion and the at least one reference coordinate in the three-dimensional space coordinate system, the three-dimensional medical image being marked with a target mark representing the at least one target point and at least one reference mark representing a position of the at least one reference tag relative to the specific portion.
12. The system for tag-guided image positioning of claim 11, wherein:
The image capturing device is fixedly disposed, relative to the medical device reference point, in the space where the patient is located;
The at least one reference tag is attached to the specific portion; and
The storage module also stores the position and orientation data, the position and orientation data including displacement data of the medical device reference point in the space relative to the image capturing device and direction data of the medical device direction in the space relative to the image capturing device.
13. The system of claim 11, further comprising a medical device tag disposed in the space in which the patient is located and corresponding in position to the medical device reference point, the medical device tag being formed with a unique identification pattern, wherein:
in step (1), the at least one reference image received by the processor is obtained by the image capturing device capturing the at least one reference tag and the medical device tag, and
In step (2), the processor further identifies the medical device tag contained in the at least one reference image, and further obtains the position and orientation data according to the identified positional relationship of the medical device tag in the space relative to the image capturing device.
14. The system of claim 13, wherein the position and orientation data comprises a displacement data of the medical device reference point in the space relative to the image capturing device and a direction data of the medical device direction in the space relative to the image capturing device.
15. The system of claim 11, wherein step (4) comprises determining whether an estimated distance and the pointing direction of the at least one target coordinate and the device coordinate in the three-dimensional space coordinate system respectively match a predetermined distance and a predetermined medical device pointing direction contained in a predetermined treatment plan, so as to obtain the determination result.
16. The system of claim 15, wherein in the step (4), when the determination result indicates that the estimated distance does not match the predetermined distance and/or the pointing direction does not match the pointing direction of the predetermined medical device, the positioning result generated by the processor includes distance difference data between the estimated distance and the predetermined distance and/or angle difference data between the pointing direction and the pointing direction of the predetermined medical device.
17. The system of claim 16, further comprising a display module configured with a graphical interface, wherein the distance difference data and the angle difference data are displayed via the graphical interface.
18. The system of claim 10, wherein the at least one reference tag is formed with an exposed and unique identification pattern.
19. The system of claim 11, wherein the image capturing device comprises at least two cameras, wherein before step (1) or after step (4), the operation further comprises the at least two cameras capturing the at least one reference tag or a calibration tag located in the space to calibrate the position of the at least one reference coordinate or a calibration coordinate of the calibration tag in the three-dimensional space coordinate system.
CN202311073708.5A 2023-08-24 2023-08-24 Method and system for label-guided image positioning Pending CN119499562A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311073708.5A CN119499562A (en) 2023-08-24 2023-08-24 Method and system for label-guided image positioning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311073708.5A CN119499562A (en) 2023-08-24 2023-08-24 Method and system for label-guided image positioning

Publications (1)

Publication Number Publication Date
CN119499562A true CN119499562A (en) 2025-02-25

Family

ID=94645515

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311073708.5A Pending CN119499562A (en) 2023-08-24 2023-08-24 Method and system for label-guided image positioning

Country Status (1)

Country Link
CN (1) CN119499562A (en)

Similar Documents

Publication Publication Date Title
EP3711700B1 (en) System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
US20190142359A1 (en) Surgical positioning system and positioning method
CN113229938B (en) Surgical robots for positioning surgery
US20250213159A1 (en) System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
US10130430B2 (en) No-touch surgical navigation method and system thereof
EP2953569B1 (en) Tracking apparatus for tracking an object with respect to a body
CN111388087A (en) Surgical navigation system and computer and storage medium for performing surgical navigation method
US9715739B2 (en) Bone fragment tracking
EP3733111A1 (en) Laser target projection apparatus and control method thereof, and laser surgery induction system comprising laser target projection apparatus
US20220361959A1 (en) System and Method for Computation of Coordinate System Transformations
CN113491578B (en) Method of registering medical images to a ring-arc assembly
US12458451B2 (en) System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
CN111214764B (en) A radiotherapy setup verification method and device based on a virtual intelligent medical platform
EP3733112A1 (en) System for robotic trajectory guidance for navigated biopsy needle
EP3886723B1 (en) Compensation of tracking inaccuracies
CN119499562A (en) Method and system for label-guided image positioning
JP7709778B2 Method and system for tag-guided image registration
US20200297451A1 (en) System for robotic trajectory guidance for navigated biopsy needle, and related methods and devices
CN119486659A (en) Device, system and method for accurately positioning human head
CN111588999B (en) Operation guide model and head-wearing wearable equipment-assisted operation navigation system
HK40051864A (en) System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
HK40027812B (en) System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
HK40027812A (en) System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
Li et al. Application and accuracy enhancement of an improved spatial registration method in electromagnetic navigation for tumor ablation surgery

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination