WO2005011501A1 - Medical image diagnosis support apparatus and method - Google Patents
Medical image diagnosis support apparatus and method
- Publication number
- WO2005011501A1 (PCT/JP2004/010835)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- organ
- lesion
- cross
- image
- calculating
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/46—Arrangements for interfacing with the operator or the patient
- A61B6/461—Displaying means of special interest
- A61B6/463—Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/46—Arrangements for interfacing with the operator or the patient
- A61B6/461—Displaying means of special interest
- A61B6/466—Displaying means of special interest adapted to display 3D data
Definitions
- the present invention relates to a medical image diagnosis support apparatus that supports image diagnosis using medical images obtained from a medical imaging apparatus (such as an MRI apparatus, an X-ray CT apparatus, or an ultrasonic diagnostic apparatus). More particularly, the present invention relates to a medical image diagnosis support apparatus that supports diagnosis of an organ part whose shape varies with the condition of the subject.
- the virtual endoscope disclosed in Patent Document 1 provides an image equivalent to an endoscopic image using X-ray CT image data; since no endoscope need be inserted into the tubular organ of the patient, the problem of the burden on the patient is eliminated.
- Patent Document 1 JP-A-2002-238887
- a medical image diagnosis support apparatus of the present invention includes organ part setting means for setting an organ part in a medical image of a subject obtained by the medical imaging apparatus;
- deformation degree calculating means for calculating the degree of deformation of the organ part set by the organ part setting means;
- reference value storage means for storing an index of the degree of deformation of the organ part as a reference value;
- lesion discriminating means for comparing the reference value stored by the reference value storage means with the degree of deformation calculated by the deformation degree calculating means and discriminating the presence of a lesion at the organ part from the comparison result;
- and notification means for notifying the examiner of the presence of the lesion at the organ part determined by the lesion discriminating means through at least one of the visual and auditory senses.
- the efficiency of diagnosis can be improved.
- the reference value storage unit stores a plurality of templates according to the degree of deformation of the organ part.
- comparison against the plurality of templates makes the result explicit, so it becomes easy for the examiner to grasp the progress of the lesion.
- the deformation degree calculating means includes cross-sectional image calculating means for calculating a cross-sectional image orthogonal to the body axis direction of the organ part, and extraction means for extracting the lumen and the outer boundary of the organ part from the calculated cross-sectional image;
- the degree of deformation of the lumen and outer boundary of the organ part extracted by the extraction means is then calculated.
- a prominent lesion such as a tumor or stenosis appearing in the lumen of the organ site can be detected.
- the notifying means notifies the examiner visually by displaying the presence of the lesion determined by the lesion discriminating means in color.
- for the visual notification, a cross-sectional image of the organ part set by the organ part setting means is displayed, and the lesion candidate part determined by the lesion discriminating means is highlighted on the cross-sectional image.
- the notifying means notifies the examiner audibly by outputting the state determined by the lesion discriminating means as a sound or voice.
- since the examiner can identify the state audibly by sound or voice, a lesion can easily be recognized.
- the medical image diagnosis support apparatus of the present invention further includes cross-section extracting means for extracting, based on a feature quantity, a cross section of a luminal organ from a tomographic image obtained by the medical imaging apparatus; physical quantity calculating means for calculating physical quantities, including the radius, circularity, and center of gravity of the luminal organ, in the cross section extracted by the cross-section extracting means; region-of-interest calculating means for calculating a region of interest based on the physical quantities calculated by the physical quantity calculating means; three-dimensional image creating means for creating a three-dimensional image of the luminal organ from the tomographic images containing the luminal organ cross sections extracted by the cross-section extracting means within the regions of interest calculated by the region-of-interest calculating means; and image display means for displaying the three-dimensional image created by the three-dimensional image creating means.
- the organ region is appropriately extracted based on the set threshold value, and a more accurate three-dimensional image can be formed from the extracted organ region.
- the apparatus further comprises skeleton calculating means for calculating a skeleton of the luminal organ based on the centers of gravity of the luminal organ cross sections calculated by the physical quantity calculating means,
- and the display means displays the skeleton calculated by the skeleton calculating means together with the three-dimensional image created by the three-dimensional image creating means.
- the medical image diagnosis support method of the present invention includes an organ part setting step of setting an organ part in a medical image of a subject obtained by a medical imaging apparatus, and a deformation degree calculating step of calculating the degree of deformation of the set organ part;
- a lesion determining step of comparing a reference value with the degree of deformation calculated in the deformation degree calculating step and determining the presence of a lesion in the organ part from the result of the comparison; and a notifying step of notifying the examiner of the presence of the lesion in the organ part determined in the lesion determining step through at least one of the senses of sight and hearing.
- FIG. 1 is a block diagram showing an example of a medical image display device common to each embodiment of the present invention.
- FIG. 4 is an explanatory view of step S32 in FIG. 3.
- FIG. 5 is an explanatory view of step S33 in FIG. 3.
- FIG. 6 is an explanatory view of step S34 in FIG. 3.
- FIG. 7 is an explanatory view of step S35 in FIG. 3.
- FIG. 8 is an explanatory diagram of step S36 in FIG. 3.
- FIG. 10 is an explanatory diagram of step S92 in FIG. 9.
- FIG. 11 is an explanatory view of step S93 in FIG. 9.
- FIG. 12 is an explanatory diagram of step S93 different from FIG. 11.
- FIG. 13 is an explanatory diagram of step S95 in FIG. 9.
- FIG. 14 is a view for explaining the principle of extracting a lesion.
- FIG. 15 is a block diagram showing a detailed example of the controller in FIG. 1 for explaining the third and fourth embodiments.
- FIG. 17 is a flowchart illustrating step S165 in FIG. 16.
- FIG. 18 is a principle diagram illustrating step S165 in FIG. 16.
- FIG. 19 is a flowchart illustrating step S166 in FIG. 16.
- FIG. 20 is a principle diagram illustrating step S166 in FIG. 16.
- FIG. 22 is a diagram for explaining an example of a determination process of a lesion process.
- FIG. 23 is a view for explaining an example of the circularity processing in FIG. 22.
- FIG. 24 is a diagram showing an example of image display.
- FIG. 25 is a diagram showing a display modification of the image 241 in FIG. 24.
- FIG. 26 is a flowchart illustrating an operation example of the fifth embodiment.
- FIG. 27 is a view showing a display example of the fifth embodiment.
- FIG. 28 is a schematic diagram illustrating the operation of the sixth embodiment.
- FIG. 29 is a flowchart illustrating an operation example of the sixth embodiment.
- FIG. 30 is a diagram showing an example of a medical image display method according to the seventh embodiment.
- FIG. 31 is a flowchart illustrating the operation of the seventh embodiment of the present invention.
- FIG. 32 is an explanatory view of step 312 in FIG. 31.
- FIG. 33 is an explanatory view of step 313 in FIG. 31.
- FIG. 34 is an explanatory view of step 313 in FIG. 31.
- FIG. 35 is an explanatory view of step 322 in FIG. 31.
- FIG. 36 is a display example of an extraction result according to the seventh embodiment of the present invention.
- FIG. 37 is a view showing various ROI shapes.
- FIG. 38 is a view for explaining a method of setting a variable ROI setting threshold.
- FIG. 39 is a diagram showing the order in which bronchial branches and branch points are extracted according to the seventh embodiment of the present invention.
- FIG. 40 is a diagram showing the order in which bronchial branches and branch points are extracted according to the eighth embodiment of the present invention.
- FIG. 42 is an explanatory view of an eleventh embodiment of the present invention.
- FIG. 43 is an explanatory view of an eleventh embodiment of the present invention.
- FIG. 1 is a block diagram showing an example of a medical image display device common to each embodiment of the present invention.
- the medical image display apparatus includes a CPU 11, a display memory 12, a CRT 13, a main memory 14, a speaker 15, a magnetic disk 16, a controller 17, a mouse 18, and a keyboard 19, each electrically connected to the CPU 11 via a data transfer bus 1C, as well as a local area network (LAN) 1A and a modality 1B electrically connected to the LAN 1A.
- the CPU 11 controls the display memory 12, the CRT 13, the main memory 14, the speaker 15, the magnetic disk 16, the controller 17, the mouse 18, and the keyboard 19, which are the components electrically connected to the data transfer bus 1C, and also controls data transmission to and reception from the LAN 1A.
- the display memory 12 temporarily stores image data to be displayed and output on the CRT 13.
- the CRT 13 is a display device that reads out image data stored in the display memory 12 and displays the image data.
- although a CRT is exemplified here, various other displays such as plasma and liquid crystal displays are also included in the display device.
- the main memory 14 stores data to be processed by the CPU 11, and stores programs to be executed by the CPU 11.
- the speaker 15 has a storage unit (not shown) for storing audio data for audio output, and can read the audio data stored in the storage unit to generate sound.
- the magnetic disk 16 stores data to be processed by the CPU 11 and stores a program to be executed by the CPU 11, similarly to the main memory 14, and is an external storage device having a larger storage capacity than the main memory 14.
- media such as CDs, MDs, DVDs, and RAM disks may be used in place of, or in addition to, the magnetic disk 16 as external storage devices.
- the controller 17 has a function of measuring the shape of an organ site and determining the presence of a lesion from the measured shape, which will be described later in detail with reference to FIG. 2.
- the mouse 18 is used by the examiner to designate an arbitrary area of the image displayed on the screen of the CRT 13, or to select and input a button displayed on the screen of the CRT 13.
- the keyboard 19 has the same functions as the mouse 18 and mainly inputs character information such as a patient ID.
- the LAN 1A is network equipment installed in a hospital, for example.
- This network facility may be a wide area network through a dedicated line or the Internet.
- Modality 1B is a medical image diagnostic device such as an X-ray device, X-ray CT device, MRI device, and ultrasound device.
- an X-ray CT apparatus is exemplified.
- the data transfer bus 1C is a standardized bus such as PCI (Peripheral Component Interconnect) or USB (Universal Serial Bus).
- FIG. 2 is a block diagram showing a detailed example of the controller of FIG. 1 for explaining the first and second embodiments.
- the tubular organ site is the bronchus.
- the controller 17 includes a bronchus bifurcation detecting unit 20 electrically connected to the mouse 18, the keyboard 19, and the main memory 14; a bifurcation cross-section image creating unit 21 electrically connected to the bronchus bifurcation detecting unit 20 and the main memory 14; a bifurcation cross-section image storage unit 22 electrically connected to the bifurcation cross-section image creating unit 21; a distance value measurement unit 23 electrically connected to the bifurcation cross-section image storage unit 22; a distance information storage unit 24 electrically connected to the distance value measurement unit 23; and a normal/abnormal determination unit 25 electrically connected to the distance information storage unit 24.
- the bronchial bifurcation detecting unit 20 detects the bifurcation of the bronchi using a medical image input from the modality 1B, a bronchial extraction image obtained by extracting the bronchial region from such a medical image, or a medical image and extraction image stored in the main memory 14 or an external storage device such as the magnetic disk 16. The detection may be performed manually by the operator using an input device such as the mouse 18, or branch coordinates determined in advance and stored in the main memory 14 or an external storage device may be used.
- the bifurcation cross-section image creating unit 21 creates a cross-sectional image of a bronchial bifurcation from the medical image or the bronchial region extraction image stored in the main memory 14 or the like.
- the bifurcation cross-section image storage unit 22 stores the bronchial bifurcation cross-section image created by the bifurcation cross-section image creation unit 21.
- the distance value measurement unit 23 measures the distance between the bronchi after branching using the bronchial bifurcation cross-sectional image stored in the bifurcation cross-section image storage unit 22.
- the distance information storage unit 24 stores distance information between the bronchi after branching measured by the distance value measurement unit 23.
- the normal/abnormal judging unit 25 judges whether the shape of the bifurcation is normal or abnormal by comparing the distance information between the two bronchi stored in the distance information storage unit 24 with a reference value (template), and displays on the CRT 13 whether the bifurcation shape is normal or abnormal based on the judgment result.
- the templates consist of a normal template created from distance information of normal cases without lesions, a mild template created from distance information of cases with mild lesions, a moderate template created from distance information of cases with moderate lesions, and a severe template created from distance information of cases with severe lesions.
- a plurality of such templates are stored as references, and the template closest to the measured data gives the judgment result, as sketched below.
- intermediate cases are handled by interpolating the data of the normal, mild, moderate, and severe templates; for example, a case between mild and moderate is classified as mild-to-moderate.
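- a minimal sketch of this nearest-template classification, assuming each template is a distance curve sampled at the same cross-sectional positions as the measurement (the template names follow the text, but the values below are illustrative placeholders, not data from the disclosure):

```python
import numpy as np

# Each template is a distance-vs-cross-section-position curve sampled at
# the same positions as the measurement (placeholder values).
templates = {
    "normal":   np.array([0.0, 1.0, 2.5, 4.0, 6.0]),
    "mild":     np.array([0.0, 1.5, 3.5, 5.5, 7.5]),
    "moderate": np.array([0.0, 2.0, 4.5, 7.0, 9.0]),
    "severe":   np.array([0.0, 3.0, 6.0, 9.0, 12.0]),
}

def classify(measured: np.ndarray) -> str:
    """Return the label of the template closest to the measured curve."""
    return min(templates, key=lambda k: np.linalg.norm(measured - templates[k]))

# An intermediate grade such as "mild-to-moderate" can be reported when the
# two nearest templates are almost equally close.
print(classify(np.array([0.0, 1.7, 3.9, 6.1, 8.2])))  # -> "mild"
```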
- FIG. 3 is a flowchart for explaining an operation example of the first embodiment.
- FIGS. 4 to 8 are explanatory diagrams of steps S32 to S36 in FIG. 3, respectively.
- the examiner operates the mouse 18 and the like, and inputs the medical image data read from the main memory 14 to the controller 17.
- the operator operates the mouse 18 or the like, and inputs information on the bronchial bifurcation read from the main memory 14 to the controller 17.
- the branch part information is the coordinates of the branch part and the direction vector of the branch part.
- the direction vector of the bifurcation is a vector indicating the running direction of the bronchus immediately before bifurcation, as indicated by arrow 41 in FIG.
- based on the input medical image data and bronchial bifurcation information, the controller 17 creates several cross-sectional images orthogonal to the direction vector of the bifurcation, such as the cross-sectional image group 42 in FIG. 4.
- the controller 17 calculates the distance between the bronchial regions in the created cross-sectional image.
- the distance may be the minimum distance between the boundaries of the bronchial regions, or the distance 55 between their centers of gravity shown in FIG. 5.
- the controller 17 creates a graph with respect to the cross-sectional image position of the interbronchial distance as shown in FIG. 6 using the interbronchial distance data.
- the controller 17 determines whether the shape of the bronchial bifurcation is normal or abnormal based on the created graph. If the bifurcation is normal, the graph has a sharp shape like the curves 60 and 61 in FIG. 6; if there is a lesion such as a tumor, the shape spreads out as shown by the curve 62. As shown in FIG. 7, a reference curve 70 (normal template) is set; if the curve 71 of the created graph extends outside the reference curve 70, the bifurcation is determined to be a lesion candidate, and if the curve 71 stays inside the reference curve, it is determined to be normal.
- specifically, the area ratio r between the area of the region surrounded by the curve 71 indicating the obtained interbronchial distance and the area of the region 72 protruding outside the reference curve 70 is calculated, and a threshold T for the normal/abnormal determination of the area ratio r is set. If the calculated area ratio r exceeds the threshold T, the shape is determined to be abnormal; if it does not exceed the threshold T, the shape is determined to be normal.
- the reference curve and the threshold T are quantities calculated statistically from a large number of clinical data.
- the threshold value T may be arbitrarily set with the input device of the mouse 18 or the keyboard 19.
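- a minimal sketch of this area-ratio decision, assuming the measured curve 71 and the reference curve 70 are sampled at the same cross-sectional positions (function and variable names are illustrative):

```python
import numpy as np

def is_abnormal(measured, reference, threshold_T):
    """Area ratio r = (area protruding outside the reference curve) /
    (area under the measured curve); abnormal if r exceeds T."""
    measured = np.asarray(measured, dtype=float)
    reference = np.asarray(reference, dtype=float)
    protruding = np.clip(measured - reference, 0.0, None).sum()  # region 72
    total = measured.sum()                                       # area under curve 71
    r = protruding / total
    return r > threshold_T

print(is_abnormal([1, 3, 6, 3, 1], [1, 2, 4, 2, 1], threshold_T=0.15))  # -> True
```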
- when it is determined that the bifurcation is deformed due to a lesion, the controller 17 highlights the bifurcation of the bronchus 81 with a circle 82 on the cross-sectional image of interest 80, as shown in FIG. 8, to call the examiner's attention. As a mode of highlighting, only the branch portion may be colored and displayed instead of using the circle 82.
- a branch portion having a lesion candidate may be displayed in a different color in advance.
- a sound or sound may be output to notify the examiner when the viewpoint of the virtual endoscope passes near the bifurcation.
- both coloring and sounding may be performed.
- FIG. 10 is an explanatory diagram of step S92 in FIG. 9, FIGS. 11 and 12 are explanatory diagrams of step S93 in FIG. 9, and FIG. 13 is an explanatory diagram of step S95 in FIG. 9.
- the operator operates the mouse 18 or the like, and inputs the medical image data read from the main memory 14 to the controller 17.
- the operator operates the mouse 18 or the like, and inputs the information on the bronchial bifurcation read from the main memory 14 to the controller 17.
- the branch part information is the coordinates of the branch part and the direction vector of the branch part.
- the controller 17 creates a cross-sectional image 101 including a direction vector 100 of the bifurcation as shown in FIG. 10 based on the input medical image data and bronchus bifurcation information.
- the controller 17 calculates a distance 112 from the reference point 111 to each point of the bronchial bifurcation in the created cross-sectional image 110.
- the distance calculated here may be the distance 122 measured perpendicular to the reference line 121 as shown in FIG.
- the controller 17 uses the measured distance data to create a curve plotting the distance from the reference for each point of the bronchial bifurcation, as shown in FIG. 13.
- the controller 17 determines whether the shape of the bronchial bifurcation is normal or a lesion candidate based on the created curve.
- a reference curve 130 is set as shown in FIG. 13, and if the curve 131 of the created graph extends outside the reference curve 130, it is determined to be a lesion candidate, and if the curve 131 is inside the reference curve, it is determined to be normal.
- the area ratio r between the area of the region surrounded by the curve 131 indicating the obtained distance and the area of the region 132 protruding outside the reference curve 130 is calculated.
- a threshold T for discriminating between normal and lesion candidate based on the area ratio r is set.
- the reference curve and the threshold are quantities calculated statistically.
- the threshold value T may be arbitrarily set by the input device of the mouse 18 or the keyboard 19.
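- a brief sketch of the radial distance measurement in step S93, assuming the boundary points of the bifurcation have already been extracted from the cross-sectional image (names are illustrative; the perpendicular-to-reference-line variant of FIG. 12 would substitute the appropriate projection):

```python
import numpy as np

def radial_profile(boundary_points, reference_point):
    """Distances 112 from the reference point 111 to each boundary point of
    the bronchial bifurcation, in boundary order, ready for plotting."""
    pts = np.asarray(boundary_points, dtype=float)
    ref = np.asarray(reference_point, dtype=float)
    return np.linalg.norm(pts - ref, axis=1)

profile = radial_profile([(10, 2), (11, 4), (12, 7)], reference_point=(10, 0))
print(profile)  # compare this curve against the reference curve 130
```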
- the controller 17 highlights the bronchial bifurcation with a circle on the cross-sectional image of interest to call the examiner's attention. As the mode of highlighting, only the branch portion may be colored instead of using the circle.
- a branch portion having an abnormality may be displayed in a different color in advance.
- a sound or a sound may be output to notify the examiner when the viewpoint of the virtual endoscope passes near the bifurcation.
- both coloring and sound output may be performed.
- a lesion candidate formed in a bronchial bifurcation can be found.
- FIG. 15 is a block diagram showing a detailed example of the controller of FIG. 1 for explaining the third and fourth embodiments.
- the controller 17 includes a luminal organ extracting unit 150 electrically connected to the mouse 18, the keyboard 19, and the main memory 14; an attention area setting unit 151 electrically connected to the mouse 18, the keyboard 19, and the luminal organ extracting unit 150; a luminal organ cross-section image creating unit 152 electrically connected to the attention area setting unit 151; a lesion candidate detection unit 153 electrically connected to the luminal organ cross-section image creating unit 152; and a lesion candidate detection result storage unit 154 electrically connected to the lesion candidate detection unit 153.
- the luminal organ extracting unit 150 extracts the luminal organ of interest from the medical image input from the modality 1B via the main memory 14 or the like.
- the attention area setting unit 151 sets a range in which the lesion candidate detection processing is performed on the luminal organ extraction result. This setting process may be performed arbitrarily by the examiner operating the mouse 18 or the like while looking at the CRT 13, or the entire extracted region may be set as the region of interest.
- the luminal organ cross-section image creating unit 152 creates, at each point in the extracted luminal region, a cross-sectional image orthogonal to the running direction of the luminal organ, such as a blood vessel or the intestinal tract.
- the lesion candidate detection unit 153 performs a lesion candidate detection process on the luminal organ cross-sectional image created by the luminal organ cross-sectional image creation unit 152.
- the lesion candidate detection processing will be described later with reference to FIGS.
- the lesion candidate detection result storage unit 154 stores the coordinates of the lesion candidate detected by the above-described lesion candidate detection processing, and displays the coordinates on the CRT 13.
- FIG. 16 is a flowchart for explaining an operation example of the third embodiment; FIGS. 17 and 18 are a flowchart and a principle diagram, respectively, for explaining step S165 in FIG. 16; and FIGS. 19 and 20 are a flowchart and a principle diagram, respectively, for explaining step S166 in FIG. 16.
- the operator operates the mouse 18 or the like, and inputs the medical image data read from the main memory 14 to the controller 17.
- the controller 17 extracts the trachea and bronchial regions from the input medical image data.
- if data obtained by extracting the trachea/bronchus region is stored in the main memory 14 or the like in advance, the extraction process may be omitted and the stored data may be input to the controller 17.
- the operator operates the mouse 18 or the like, and sets the range in which the lesion candidate detection processing is to be performed within the extracted trachea/bronchus region.
- the set range indicates a part or the whole of the extracted trachea/bronchus region.
- the controller 17 causes the luminal organ cross-section image creating unit 152 to create cross-sectional images orthogonal to the running direction of the trachea/bronchus within the set attention range.
- the controller 17 binarizes the trachea/bronchus lumen region in the cross-sectional image created by the luminal organ cross-section image creating unit 152.
- a pixel value of 1 is assigned to pixels in the trachea/bronchus lumen region, and a pixel value of 0 is assigned to the other pixels.
- the outline of the trachea/bronchus region is an ellipse close to a circle if it is normal, whereas a portion protrudes into the lumen if there is a prominent lesion such as a tumor or stenosis; this difference can be used for the determination.
- the controller 17 performs a lesion candidate detection process (I) on the trachea/bronchus cross-sectional image that has been subjected to the binarization process.
- the operator sets points at regular intervals on the periphery of the trachea/bronchus region.
- the controller 17 connects two of the points set on the periphery of the trachea/bronchus region with a line segment. Here, two mutually adjacent points are not connected with a line segment.
- the controller 17 calculates the sum of the pixel values of each pixel on the line segment (step S173).
- it is then determined whether the processing of the sum of the pixel values on the line segment has been completed for all combinations of the above two points. If not completed, the process returns to step S171; if completed, the area surrounded by the coordinates stored in step S174 is detected as a lesion candidate area, as sketched below.
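- a hedged sketch of detection process (I): in a convex (normal) lumen, every chord between boundary points stays inside the binarized region, whereas a chord crossing 0-valued pixels indicates a part protruding into the lumen; the exact rule for storing coordinates is an assumption here:

```python
import numpy as np

def chord_hits_protrusion(binary_img, p0, p1, n_samples=100):
    """True if the straight segment p0-p1 leaves the lumen region
    (some sampled pixels along the chord have value 0)."""
    ys = np.linspace(p0[0], p1[0], n_samples)
    xs = np.linspace(p0[1], p1[1], n_samples)
    values = binary_img[ys.round().astype(int), xs.round().astype(int)]
    return values.sum() < n_samples

def detect_candidates(binary_img, boundary_points):
    """Chord test for every non-adjacent pair of boundary points."""
    flagged = []
    n = len(boundary_points)
    for i in range(n):
        for j in range(i + 2, n):            # skip adjacent point pairs
            if (i, j) != (0, n - 1) and chord_hits_protrusion(
                    binary_img, boundary_points[i], boundary_points[j]):
                flagged.append((boundary_points[i], boundary_points[j]))
    return flagged
```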
- after the completion of the lesion candidate detection process (I), the controller 17 performs a lesion candidate detection process (II).
- the operator sets points at regular intervals on the periphery of the trachea/bronchus region.
- the controller 17 calculates, at each point set at the predetermined interval, a normal vector perpendicular to the tangent to the trachea/bronchus margin.
- the direction of each normal vector is directed toward the inside of the trachea/bronchus.
- the operator sets a reference direction vector in the trachea/bronchus cross section.
- the reference direction vector may be taken in any direction in the trachea/bronchus cross section.
- the controller 17 calculates an angle between the reference direction vector and each of the normal vectors.
- in the angle calculation, for example, as shown in FIG. 20, the angle 203 between the reference direction vector 201, arbitrarily determined with respect to the bronchial cross section 200, and the normal vector 202 of interest is measured counterclockwise from the reference direction vector 201.
- the angle may of course be calculated clockwise instead, but all normal vectors are measured uniformly in either the counterclockwise or the clockwise direction (step S194).
- the controller 17 calculates the change in the measured angle from the reference direction vector between adjacent normal vectors.
- when the two normal vectors of interest both lie on a normal part, that is, on a boundary facing into the trachea/bronchus lumen, and the normal vectors are examined in the counterclockwise direction, the magnitude relationship of the measured angles is (angle 206) < (angle 207), and the angle change shows an increasing tendency.
- on a protruding part, as shown by the normal vectors 208 and 209, the magnitude relationship of the measured angles is (angle 20A) > (angle 20B), and the angle change shows a decreasing tendency; a lesion candidate can be detected from this difference in the angle change.
- when the angle change shows a decreasing tendency, the process proceeds to step S196; when the angle change shows an increasing tendency, the process proceeds to step S197.
- the controller 17 stores the coordinates of a portion where the angle change shows a decreasing tendency as a lesion candidate. If the measurement of the angle change has not been completed for all adjacent normal vectors, the controller 17 returns to step S194; if it has been completed, the controller 17 detects the area surrounded by the coordinates stored in step S195 as a lesion candidate area.
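- a hedged sketch of detection process (II), assuming the inward normal vectors have already been computed at the boundary points; the wrap-around of angles at 2π on a closed contour is noted but not fully handled:

```python
import numpy as np

def ccw_angles(normals, reference=(1.0, 0.0)):
    """Counterclockwise angle 203 of each inward normal (x, y) from the
    reference direction vector 201, in radians within [0, 2*pi)."""
    ref = np.asarray(reference, dtype=float)
    ang = []
    for n in np.asarray(normals, dtype=float):
        a = np.arctan2(n[1], n[0]) - np.arctan2(ref[1], ref[0])
        ang.append(a % (2.0 * np.pi))
    return np.array(ang)

def decreasing_angle_points(points, normals):
    """Boundary points where the angle change shows a decreasing tendency,
    i.e. the (angle 20A) > (angle 20B) pattern of a protruding part."""
    angles = ccw_angles(normals)
    flagged = []
    for i in range(1, len(angles)):
        # note: the jump from ~2*pi back to 0 on a closed contour must be
        # unwrapped in practice to avoid a false decrease
        if angles[i] < angles[i - 1]:
            flagged.append(points[i])
    return flagged
```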
- the controller 17 calculates the logical product of the result obtained by the lesion candidate detection process (I) and the result obtained by the lesion candidate detection process (II). By this calculation, only the region detected as a lesion candidate in both the lesion candidate detection process (I) and the lesion candidate detection process (II) remains, and the remaining region is detected as a lesion candidate region.
- the controller 17 highlights the detected lesion candidate area and displays it on the CRT 13.
- the lesion candidate area of the trachea/bronchus cross-sectional image may be displayed with a circle.
- marking may be performed with an arrow or the like instead of the circle.
- only the lesion candidate area may be colored and displayed.
- the controller 17 determines whether or not the lesion candidate detection processing has been completed for all the points in the attention area set in step S162. If not, the process returns to step S163.
- lesions formed inside a luminal organ such as the trachea, bronchus, blood vessel, or intestine can thus be detected from a medical image such as a CT or MR image without performing observation using an endoscope or a virtual endoscope.
- FIG. 21 is a flowchart illustrating the operation of the fourth embodiment.
- steps S160 to S166 and steps S210 to S216 are common to the third embodiment, so the description of those common parts will be omitted and only the differences will be described.
- the controller 17 calculates the logical sum of the detection result of the lesion candidate detection process (I) and the detection result of the lesion candidate detection process (II). By this OR operation, all the lesion candidates detected by either the lesion candidate detection process (I) or the lesion candidate detection process (II) are obtained as detection results (step S218).
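- a minimal sketch contrasting the two combinations on boolean candidate masks (the mask contents are placeholders):

```python
import numpy as np

mask_I  = np.array([[0, 1, 1], [0, 0, 1]], dtype=bool)   # detection process (I)
mask_II = np.array([[0, 1, 0], [0, 1, 1]], dtype=bool)   # detection process (II)

high_accuracy = mask_I & mask_II   # logical product (third embodiment)
low_oversight = mask_I | mask_II   # logical sum (fourth embodiment)
```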
- the controller 17 highlights and displays on the CRT 13 the lesion candidate regions detected by the lesion candidate detection process (I) and the lesion candidate detection process (II).
- the lesion candidate area portion of the cross-sectional image of the trachea/bronchus may be circled.
- marking may be performed using arrows or the like.
- only the lesion candidate area may be displayed in color.
- after detecting a prominent lesion in a luminal organ, the controller 17 may further determine, for the detected lesion candidate, the type of the lesion, such as polyp or stenosis. The process of determining the type of lesion will be described with reference to FIGS. 22 and 23.
- the controller 17 calculates an outline of a lesion site with respect to a luminal organ cross-sectional image 220 having a lesion such as a polyp or a stenosis.
- the outline of the lesion site may be, for example, a line segment 223 connecting the lesion region end points 221 and 222.
- the lesion area is 224.
- the contour line 22B and the lesion region 22C of the lesion site may be obtained by performing interpolation processing such as spline interpolation using points 225, 226, 227, 228, 229, and 22A on the luminal organ contour near the lesion region.
- in FIG. 22, the method of performing spline interpolation using six points has been described.
- the number of points to be used may be set arbitrarily.
- the lesion area 224 or 22C is extracted.
- the controller 17 calculates how close the extracted lesion area is to a circle using an amount called circularity.
- the circularity C is given by, for example, Expression (1) using the area S of the lesion area and the length R of the outer circumference; in its standard form, C = 4πS/R².
- the circularity C is 1 for a perfect circle, and the value becomes smaller as the shape becomes more complicated. Therefore, a threshold T is set; if C > T, the lesion is determined to be a nearly circular lesion such as a polyp, and if C ≤ T, it is determined to be a lesion such as a stenosis.
- the circularity described above may use the following amount in addition to the circularity.
- the controller 17 calculates the major axis 232 (length LL) for the lesion area 231 in FIG.
- the longest line segment orthogonal to the long axis 232 and connecting the edges of the lesion area 231 is defined as the short axis 233 (length Ls).
- the controller 17 obtains the ratio LL/Ls of the calculated major-axis length to the minor-axis length; if LL/Ls ≤ T for a threshold T, a lesion close to a circular shape, such as a polyp, is determined, and if LL/Ls > T, a lesion such as a stenosis is determined.
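- a minimal sketch of both type decisions, taking C = 4πS/R² for Expression (1) (an assumption, since the expression itself is not reproduced here) and using illustrative threshold values:

```python
import math

def classify_by_circularity(area_S, perimeter_R, threshold_T=0.8):
    C = 4.0 * math.pi * area_S / (perimeter_R ** 2)  # 1 for a perfect circle
    return "polyp-like" if C > threshold_T else "stenosis-like"

def classify_by_axes(major_LL, minor_Ls, threshold_T=1.5):
    # near-circular (polyp) when LL/Ls <= T, elongated (stenosis) when > T
    return "polyp-like" if major_LL / minor_Ls <= threshold_T else "stenosis-like"

print(classify_by_circularity(area_S=78.5, perimeter_R=31.4))  # ~circle -> polyp-like
print(classify_by_axes(major_LL=10.0, minor_Ls=3.0))           # elongated -> stenosis-like
```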
- the controller 17 may perform an emphasis process according to the type of the lesion on the determined lesion site and display the same on the CRT 13. For example, the controller 17 gives a red circle for a polyp, and gives a blue circle for a stenosis, and displays it on the CRT 13. The controller 17 may, of course, indicate the lesion site with a color-coded arrow instead of the circle, or may directly color the lesion region and display it on the CRT 13.
- in the third embodiment, the logical product of the results of the lesion candidate detection processing (I) and the lesion candidate detection processing (II) is calculated, which makes it possible to detect lesion candidates with high accuracy.
- in the fourth embodiment, the logical sum of the results of the lesion candidate detection processing (I) and the lesion candidate detection processing (II) is calculated, which makes it possible to reduce oversight of lesion candidates.
- the lesion candidate detection process (I) and the lesion candidate detection process (II) may be used in combination by a logical product, a logical sum, or the like as shown in the third and fourth embodiments, or each may of course be used alone. Different methods may be used according to the purpose, such as whether the accuracy of lesion candidate detection is to be increased or oversight is to be reduced.
- the operator can freely select the logical product or logical sum of the lesion candidate detection process (I) and the lesion candidate detection process (II), or either one alone, using the display options 240 in FIG. 24.
- the controller 17 displays the result images 241 to 244 according to the lesion candidate detection processing method selected by the operator.
- the resulting image 241 is a three-dimensional image of the extracted trachea/bronchus region.
- regions detected as lesion candidates by the above-described lesion candidate detection processing are displayed as colored lines 245-247.
- the three-dimensional image 241 can be rotated and displayed at any angle.
- the operator operates the mouse 18 or the like to select one of the lesion candidates 245-247 displayed on the three-dimensional image 241.
- the display images 242 and 243 display a virtual endoscopic image and a cross-sectional image of the selected lesion candidate site.
- the cross-sectional image 243 has been subjected to the image enhancement processing 248 described in step S168 of the third embodiment and step S218 of the fourth embodiment.
- the display / non-display of the image enhancement processing 248 can be switched.
- the operator can enlarge and display the vicinity of the candidate lesion in the cross-sectional image 243.
- the enlarged image may be displayed as it is at the position of the cross-sectional image 243, or may be displayed in the image display area 244.
- in addition to the currently displayed virtual endoscope image 242 (referred to as the forward virtual endoscope image), it is possible to display a reverse virtual endoscope image whose viewpoint direction is rotated by 180 degrees.
- the reverse virtual endoscope image may be displayed at the position of the forward virtual endoscope image 242, or may be displayed in the image display area 244.
- the lesion candidate detection results may be displayed in order by operating the scroll bar 24C.
- the numerical value displayed in the numerical display area at the top of the scroll bar indicates the number of the currently displayed lesion candidate. The number corresponding to this value may be displayed in advance at each lesion candidate position on the three-dimensional image 241, or the lesion candidate position 245 corresponding to the currently displayed value may be drawn with a line of a different color from the other lesion candidates 246 and 247.
- the operator operates the scroll bar 24D and the scroll bar 24E by operating the mouse 18 and the like, and can set the window level and the window width of the sectional image 243.
- the set window level and window width values are displayed in the numerical display area above each scroll bar.
- cross sections of the lesion candidates 245, 246, and 247 may be displayed as shown in FIG. 25.
- lesions formed inside a luminal organ such as the trachea, bronchus, blood vessel, or intestine can be detected from a medical image such as a CT or MR image without performing observation using an endoscope or a virtual endoscope. This has the effect of reducing the burden on the physician and, at the same time, reducing oversight of lesions.
- when using a virtual endoscope to check the position and size of a lesion candidate detected by any of the first to fourth embodiments, the examiner may be notified of the presence of the lesion.
- the controller 17 colors the lesion area on the displayed virtual endoscope image.
- immediately before the viewpoint passes the lesion portion, the viewpoint is turned toward the lesion, and after passing the lesion, the change in the viewpoint direction is undone.
- FIG. 26 is a flowchart showing the operation of the present embodiment. Each step will be described below.
- a lesion area is detected by using any one of the lesion detection methods described in the first to fourth embodiments, and information on their positions is stored.
- the viewpoint is advanced while creating a virtual endoscopic image by virtual endoscopy using the stored position information.
- the operator initializes the viewpoint position of the virtual endoscope.
- the initialization is the setting of the start position when performing observation with a virtual endoscope.
- the start position is set, for example, by the operator using an input device such as a mouse 18 and clicking one point in a luminal organ to be observed on a tomographic image such as a CT or MR.
- the CPU 11 creates a virtual endoscope image.
- the CPU 11 determines whether or not there is a lesion area in the created virtual endoscope image, using the information on the lesions detected in advance by the above-described lesion detection method. If a lesion area exists, the process proceeds to step S263; if not, the process proceeds to step S268.
- the CPU 11 colors the lesion area on the virtual endoscopic image created in step S261. For example, coloring is performed as shown in the region 270 in FIG. 27.
- the CPU 11 measures a distance L between the viewpoint position and the lesion position.
- the CPU 11 compares the measured distance value L with a preliminarily set threshold value T. If L ≤ T, that is, if the viewpoint is at a position close to the lesion, the process proceeds to step S265; if L > T, that is, if the viewpoint is at a position far from the lesion, the process proceeds to step S268.
- the CPU 11 changes the direction of the viewpoint to the direction in which the lesion exists.
- the CPU 11 creates a virtual endoscope image in the changed viewpoint direction.
- the CPU 11 displays the created virtual endoscope image on the CRT 13 using the display memory 12.
- the operator updates the viewpoint of the virtual endoscope (step S269). For example, the viewpoint of the virtual endoscope moves forward while the left mouse button is held down and backward while the right button is held down, and moving the mouse changes the viewpoint direction. This makes it possible to update the viewpoint.
- after updating the viewpoint, the CPU 11 returns to step S261, creates a virtual endoscopic image with notification of the presence of a lesion, and displays the created virtual endoscopic image on the CRT 13.
- the processing for coloring the lesion area and turning the viewpoint toward the lesion when passing the lesion has been described. If the threshold T is made smaller, the time during which the viewpoint is turned toward the lesion becomes shorter, and the operator can notice the lesion without overlooking it owing to the quick change in the viewpoint direction.
- alternatively, the entire screen may be flashed for a moment, or the viewpoint update may be purposely slowed down or made discontinuous when passing through the lesion area.
- the operator may be notified of the presence of a lesion using a change in the virtual endoscope display image.
- instead of using the distance between the viewpoint and the lesion to determine whether to give a visual notification, the viewpoint direction may be changed when the lesion occupies n% of the field of view of the virtual endoscope,
- and visual notification such as flashing the screen, delaying the viewpoint update, or making the viewpoint update discontinuous may be performed, as sketched below.
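- a minimal sketch of the distance-based reorientation decision in steps S263 to S266 (all names and the threshold value are illustrative assumptions):

```python
import math

def viewing_direction(viewpoint, lesion, threshold_T, default_direction):
    """Return the viewpoint direction: toward the lesion when close enough."""
    L = math.dist(viewpoint, lesion)            # step S264: distance L
    if L <= threshold_T:                        # step S265: L <= T -> reorient
        dx = [l - v for v, l in zip(viewpoint, lesion)]
        norm = math.sqrt(sum(c * c for c in dx))
        return tuple(c / norm for c in dx)      # unit vector toward the lesion
    return default_direction                    # far away: keep flying direction

print(viewing_direction((0, 0, 0), (3, 4, 0), threshold_T=10,
                        default_direction=(0, 0, 1)))  # -> (0.6, 0.8, 0.0)
```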
- the operator may also be notified of the presence of a lesion by the following method.
- in FIG. 28, reference numerals 280 to 285 denote observation of the interior of the luminal organ while the viewpoint position is updated by virtual endoscopy.
- the viewpoint position is updated in order from 280 to 285.
- 286 indicates a lesion formed in a luminal organ.
- the virtual endoscopic images 280 to 282 gradually approach the lesion, pass the lesion between 282 and 283, and gradually move away from 283 to 285.
- a sound is generated when the viewpoint position of the virtual endoscopic image approaches the lesion.
- the volume gradually increases as the viewpoint approaches the lesion, and gradually decreases as the viewpoint moves away from it. Also, when passing the lesion, the sound frequency is changed, so that the operator is notified, in the manner of the Doppler effect, that the viewpoint has passed the lesion.
- a lesion area is detected by using any one of the lesion detection methods described in the first to fourth embodiments, and their positional information is stored.
- the viewpoint is advanced while creating a virtual endoscopic image by virtual endoscopy using the stored position information.
- the operator initializes the viewpoint position of the virtual endoscope.
- the initialization is the setting of the start position when performing observation with a virtual endoscope.
- the start position is set, for example, by the operator using an input device such as a mouse 18 and clicking one point in a luminal organ to be observed on a tomographic image such as a CT or MR.
- the CPU 11 creates a virtual endoscope image and displays it on the CRT 13 using the display memory 12.
- the CPU 11 acquires the current viewpoint coordinates (x, y, z) and stores them in the main memory 14.
- the CPU 11 acquires the distance value Li between the viewpoint position and the lesion position and stores it in the main memory 14. (Step S294)
- the CPU 11 compares the distance value Li obtained above with a preset threshold value T. If Li ≤ T, that is, if the distance between the viewpoint and the lesion is short, the process proceeds to step S295; if the distance between the viewpoint and the lesion is large, the process proceeds to step S29B.
- the CPU 11 compares the distance value Li obtained above with the distance Li−1 between the immediately preceding viewpoint and the lesion. If Li ≤ Li−1, that is, if the viewpoint is approaching the lesion, the process proceeds to step S296; if Li > Li−1, that is, if the viewpoint is moving away from the lesion, the process proceeds to step S298.
- the CPU 11 sets the pitch: if the viewpoint is approaching the lesion, a high pitch is set. A low pitch is given when the viewpoint moves away from the lesion, so that the switch between high and low pitch when the viewpoint passes the lesion informs the examiner, in the manner of the Doppler effect, that the viewpoint is passing near the lesion.
- the CPU 11 sets the volume. If the viewpoint is close to the lesion, increase the volume. This is to notify the examiner that the viewpoint is approaching the lesion due to the increase in volume.
- the CPU 11 sets the pitch to low. In the manner of the Doppler effect, this indicates that the distance between the viewpoint and the lesion is increasing.
- the CPU 11 sets the volume.
- the volume is reduced because the viewpoint is moving away from the lesion.
- the CPU 11 generates, through the speaker 15, a sound of the pitch and volume set in steps S296 and S297 or steps S298 and S299.
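- a hedged sketch of this pitch/volume rule (the numeric mappings are illustrative assumptions; the disclosure specifies only high/low pitch and louder/softer volume):

```python
def notification_sound(Li, Li_prev, threshold_T, base_freq=440.0):
    """Return (frequency_hz, volume_0_to_1) for the current viewpoint, or
    None when the lesion is farther away than T (no sound is generated)."""
    if Li > threshold_T:                        # step S294: too far, no sound
        return None
    approaching = Li <= Li_prev                 # step S295: Li <= Li-1
    freq = base_freq * (1.5 if approaching else 0.75)    # high vs low pitch
    volume = max(0.0, min(1.0, 1.0 - Li / threshold_T))  # louder when closer
    return freq, volume

print(notification_sound(Li=2.0, Li_prev=3.0, threshold_T=10.0))  # approaching
print(notification_sound(Li=3.0, Li_prev=2.0, threshold_T=10.0))  # receding
```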
- the operator updates the viewpoint of the virtual endoscope. For example, the viewpoint of the virtual endoscope moves forward when the left button of the mouse is kept pressed, and the viewpoint moves backward when the right button is kept pressed. Move the mouse to change the direction of the viewpoint. This makes it possible to update the viewpoint.
- the process returns to step S291 to display a virtual endoscopic image and generate a sound for notifying the presence of a lesion.
- the method of audibly notifying the lesion area has been described.
- each of these notification methods may be used alone or in combination.
- by using a variety of notification methods, the problem of overlooking lesions formed inside luminal organs such as the trachea, bronchi, blood vessels, and intestinal tract can be alleviated even if the examiner's attention is reduced due to fatigue or the like.
- the part to be observed may be stored, and when approaching the area during virtual endoscopic observation, the operator may be notified by a visual or auditory method as described in the fifth and sixth embodiments. .
- the site to be observed may be stored by clicking the mouse 18 on a tomographic image such as a CT or MR image.
- alternatively, a 3D image of the organ or tissue to be observed may be created from a tomographic image such as a CT or MR image, and the position may be acquired and stored by clicking the mouse 18 on the created 3D image.
- FIG. 30 shows a display example of the present invention. This is a user interface common to the seventh to eleventh embodiments described below.
- the medical image diagnosis support device includes a CPU 11 that performs the region extraction calculation; a magnetic disk 16 that receives and stores medical tomographic images captured by the medical imaging device 1B via a network such as the LAN 1A; a main memory 14 for storing the medical tomographic image data and intermediate results during the region extraction calculation; a mouse 18 and keyboard 19 connected to the controller 17 for the operator to input the parameters necessary for region extraction; and a display memory 12 and display device (CRT) 13 used to display the region extraction results.
- FIG. 31 shows the processing flow of the seventh embodiment.
- the entire bronchus is extracted by extracting cross sections orthogonal to the direction of travel of the bronchi and collecting them.
- the operator operates an input device such as the mouse 18 or the keyboard 19 to set up the user interface.
- a medical tomographic image captured by the modality 1B is input to the main memory 14 via the LAN 1A or from the magnetic disk 16.
- the operator operates an input device such as a mouse 18 or a keyboard 19 to move a slice number setting scroll bar 201, a window level setting scroll bar 202, and a window width setting scroll bar 203 so that a desired slice is displayed in the image display area 204. Display the image.
- in addition to setting the display slice with the scroll bars, a numerical value may be directly input to the slice number display area 205, the window level display area 206, and the window width display area 207.
- the operator operates input devices such as the mouse 18 and the keyboard 19 to set the values of the parameters used for bronchial extraction.
- the parameters used for bronchial extraction include, for example, the threshold used for binarization, the ROI setting threshold, the area threshold, the circularity threshold, the area ratio threshold used for determining whether a region has been correctly extracted, the constriction threshold, and the edge ratio threshold. Details of each parameter used for bronchial extraction will be described later.
- each parameter value may be displayed on the user interface, or the display of the parameter values may be omitted.
- the operator operates an input device such as the mouse 18 or the keyboard 19 to select a starting point of region extraction on an arbitrary image among the inputted medical cross-sectional images.
- the start point of region extraction is a point on the bronchus to be extracted on the input medical tomographic image, and is set, for example, as the point 216 in the image display area in FIG. 30. This is the starting point for the region growing used in the next step.
- the CPU 11 performs region growing on the selected medical tomographic image based on the set start point to extract a bronchial cross section on the tomographic image.
- the bronchial cross section extracted here is called the parent region. A pixel value of 1 is assigned to the parent region and a pixel value of 0 to the other regions.
- the CPU 11 calculates, for the parent region, physical quantities such as the area value Sp, the circularity Cp, the barycentric coordinates Gp, and the radius rp when the region is approximated by a circle (or the length of the long axis of the rectangle surrounding the parent region).
- the circularity C is, for example, the quantity given by Expression (2), an index indicating how close the target region is to the shape of a circle; in its standard form, C = 4πS/L²,
- where S is the area value of the region of interest
- and L is the length of the outer periphery of the region of interest.
- the CPU 11 determines whether or not each of the obtained physical quantities, such as the area value and the circularity of the parent region, is appropriate for a bronchial cross section.
- since a cross-sectional image orthogonal to the running direction of the bronchus is considered to be close to a circle or an ellipse, the measured circularity takes a value close to 1. The determination of whether the region is appropriate as a bronchial cross section is therefore made by comparing each obtained physical quantity with the corresponding set threshold and checking whether each physical quantity satisfies the condition set by its threshold. If any one of the physical quantities of the parent region does not satisfy its condition, the process proceeds to step 306; if all the physical quantities satisfy the conditions, the process proceeds to step 307.
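- an illustrative computation of these quantities for a binary cross-section image (the perimeter is approximated by the boundary-pixel count, which is an assumption of this sketch):

```python
import math
import numpy as np

def cross_section_quantities(binary_img):
    """Area Sp, circularity Cp = 4*pi*S/L**2, centroid Gp, and an
    equal-area-circle radius rp for a binary parent region."""
    ys, xs = np.nonzero(binary_img)
    Sp = len(ys)                                   # area in pixels
    Gp = (ys.mean(), xs.mean())                    # center of gravity
    rp = math.sqrt(Sp / math.pi)                   # radius of equal-area circle
    # perimeter L approximated as the count of region pixels having at
    # least one 4-neighbour outside the region
    padded = np.pad(binary_img.astype(bool), 1)
    inner = padded[1:-1, 1:-1]
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1]
                & padded[1:-1, :-2] & padded[1:-1, 2:])
    L = int((inner & ~interior).sum())
    Cp = 4.0 * math.pi * Sp / (L ** 2) if L else 0.0
    return Sp, Cp, Gp, rp

img = np.zeros((32, 32), dtype=np.uint8)
img[8:24, 8:24] = 1                                # a square test region
print(cross_section_quantities(img))
```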
- the CPU 11 determines or updates the direction vector for extracting the next lumen organ cross section.
- the point moved by a unit length from the center of gravity Gp of the parent region in the updated direction is set as the temporary center O of the region to be extracted next.
- the CPU 11 determines a radius R of a region of interest (hereinafter referred to as ROI) when performing region extraction.
- the radius R is a constant multiple of the calculated radius of the parent region (or the length of the long axis of the rectangle surrounding the parent region) r.
- the ROI radius is varied within a certain range until an appropriate ROI is found.
- what constitutes an appropriate ROI will be explained later.
- the range is, for example, a range represented by a·r ≤ R ≤ b·r using the parent region radius r.
- the CPU 11 sets an ROI of radius R orthogonal to the direction vector about the set temporary center O, and creates an image within the ROI from the input medical tomographic images. Note that the CPU 11 performs an interpolation operation, such as linear interpolation, between the slices of the input medical tomographic images in order to assign a pixel value to each pixel in the ROI.
- the CPU 11 binarizes the obtained in-ROI image using the set binarization threshold.
- the CPU 11 assigns a pixel value of 1 to a bronchial region candidate satisfying the threshold condition, and assigns a pixel value of 0 to other regions.
- the CPU 11 leaves only the area connected to the parent area among the areas in the obtained binary image, and assigns a pixel value of 0 to the other areas.
- the CPU 11 determines whether the set ROI is appropriate. As shown in FIG. 32, consider the cases where bronchial candidate regions 402 and 403 are obtained by binarizing the ROIs 400 and 401.
- let LROI be the length of the outer circumference of the ROI, and let L402 and L403 be the lengths of the parts of the outer circumference of the ROI not in contact with the bronchial candidate regions 402 and 403, respectively.
- the ratio RL of these lengths to the circumference LROI of the ROI is compared with the ROI setting threshold TROI, which is one of the set region extraction parameters, to judge whether the ROI is appropriate, as sketched below.
- the CPU 11 calculates, for the bronchial candidate region obtained by binarization within the ROI determined to be appropriate (hereinafter referred to as a child region candidate), the area Sc, the circularity Cc, the area ratio RS with the parent region, the degree of deformation (degree of constriction) W, the edge ratio E, and the barycentric coordinates Gc.
- the CPU 11 compares the calculated physical quantities, such as the area Sc of the child region candidate, the circularity Cc, the area ratio RS with the parent region, the degree of constriction W, and the edge ratio E, with the set region extraction parameters, and determines whether the candidate is appropriate as a bronchial region.
- the area extraction parameters used here are area threshold TS, circularity threshold TC, area ratio threshold TR1 (lower), TR2 (upper), constriction threshold TW, and edge ratio threshold TE.
- the area threshold TS is used to exclude regions of a size inappropriate for a bronchial cross section.
- the area ratio thresholds TR1 and TR2 are parameters for ensuring continuity with the parent area.
- the constriction threshold TW is used to reject cases where, for example, part of the lung field region is extracted as a child region candidate, as in region 503 of the figure.
- the edge ratio E is used in the same way as the degree of constriction. If all of these physical quantities satisfy their threshold conditions, the candidate is adopted as a bronchial region cross section.
- the adopted child region candidate is called a child region.
- a pixel value of 0 is assigned to child region candidates that are not adopted.
- all of the physical quantities may be used to determine whether a child region candidate is appropriate as a bronchial region cross section, or only a subset of them may be used.
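The child-candidate test can be summarized in a sketch like the one below; the threshold names follow the text, but every default value and every inequality direction is an assumption of ours:

```python
from dataclasses import dataclass

@dataclass
class ExtractionParams:
    ts: float = 4.0    # area threshold TS
    tc: float = 0.6    # circularity threshold TC
    tr1: float = 0.5   # area-ratio lower threshold TR1
    tr2: float = 2.0   # area-ratio upper threshold TR2
    tw: float = 0.5    # constriction threshold TW
    te: float = 0.5    # edge-ratio threshold TE

def accept_child_candidate(sc, cc, rs, w, e, p: ExtractionParams) -> bool:
    """Adopt the candidate as a bronchial cross section only if every
    physical quantity satisfies its threshold condition (a subset of the
    tests may be used instead, as noted above)."""
    return (sc >= p.ts and cc >= p.tc and
            p.tr1 <= rs <= p.tr2 and w <= p.tw and e <= p.te)
```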
- the CPU 11 counts the number of child regions determined to be appropriate as bronchial regions by the child region candidate determination process in step 314.
- the CPU 11 determines whether the obtained number of child regions is 0. If it is not 0, the process proceeds to step 317; if it is 0, the process proceeds to step 321.
- the CPU 11 determines whether the obtained number of child regions is one, or two or more. If the number of child regions is two or more, the process proceeds to step 318; if it is one, the process proceeds to step 320.
- when the number of child regions is two or more, the CPU 11 determines that the parent region is at a bronchial bifurcation. At this time, variables such as the barycentric coordinates Gp, area Sp, circularity Cp, the coordinates of each pixel in the parent region, the direction vector of the parent region, and the ROI radius R are stored in the branch information storage array in the main memory 14.
- the CPU 11 increases the total number of bronchial branches Nt by (the number of child regions ⁇ 1).
- the total number of bronchial branches Nt is the total number of bronchial branches found by branching in the extraction process, and is used to terminate the entire bronchial extraction process when the extraction process is completed for all branches.
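The bookkeeping around bifurcations might look like this sketch (the BranchInfo fields mirror the variables listed above; the module-level variables are purely illustrative):

```python
from dataclasses import dataclass
from typing import List, Tuple
import numpy as np

@dataclass
class BranchInfo:
    gp: Tuple[float, float, float]  # parent barycentric coordinates Gp
    sp: float                       # parent area Sp
    cp: float                       # parent circularity Cp
    pixels: np.ndarray              # coordinates of each parent-region pixel
    direction: np.ndarray           # direction vector of the parent region
    roi_radius: float               # ROI radius R

branch_store: List[BranchInfo] = []  # the branch information storage array
nt = 0                               # total number of bronchial branches Nt

def on_bifurcation(info: BranchInfo, n_children: int) -> None:
    """Store the branch info and grow Nt by (number of child regions - 1)."""
    global nt
    branch_store.append(info)
    nt += n_children - 1
```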
- the CPU 11 stores the coordinates of each pixel in the extracted child region in the extracted region coordinate storage array.
- the CPU 11 determines whether ROIs have been created and child region candidate extraction has been performed for all ROI radii. If the extraction processing has been completed for all ROI radii, the process proceeds to step 322; if not, the process returns to step 308 to update the ROI radius and repeat the extraction.
- the CPU 11 determines whether the extraction processing has been performed for all the direction vectors over all angles. If it has been completed for all of them, the extraction process is terminated and the process proceeds to step 323; if not, the process returns to step 307 to update the direction vector.
- a vector 705 defined by angles 703 and 704 is taken relative to a vector 702 that passes through the center of gravity 701 of a parent region 700 and is orthogonal to the parent region.
- the set of vectors 705 obtained in this way constitutes the direction vectors.
- the angles 703 and 704 may be taken from 0° to 360° in increments of 10°.
- angle 703 may instead be taken in 5° steps and angle 704 in 10° steps.
- angle 703 may range from 0° to 90° and angle 704 from 0° to 360°.
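A sketch of this angle sweep, assuming angle 703 tilts the vector away from the reference direction and angle 704 rotates it around the reference (the step sizes default to the 10° example above):

```python
import numpy as np

def candidate_direction_vectors(reference, step1_deg=10, step2_deg=10,
                                range1=(0, 90), range2=(0, 360)):
    """Enumerate direction vectors 705 by tilting the reference vector 702
    by angle 703 (range1) and rotating by angle 704 (range2)."""
    ref = np.asarray(reference, float)
    ref /= np.linalg.norm(ref)
    # Orthonormal pair (u, v) perpendicular to the reference vector.
    helper = np.array([1.0, 0.0, 0.0])
    if abs(ref @ helper) > 0.9:
        helper = np.array([0.0, 1.0, 0.0])
    u = np.cross(ref, helper); u /= np.linalg.norm(u)
    v = np.cross(ref, u)
    vectors = []
    for a1 in np.deg2rad(np.arange(range1[0], range1[1] + 1, step1_deg)):
        for a2 in np.deg2rad(np.arange(range2[0], range2[1], step2_deg)):
            vectors.append(np.cos(a1) * ref
                           + np.sin(a1) * (np.cos(a2) * u + np.sin(a2) * v))
    return np.array(vectors)
```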
- the CPU 11 increases the value of the extracted branch number Nf by one.
- the number of extracted branches Nf refers to the number of bronchial branches that have already been extracted.
- the CPU 11 sets the barycentric coordinates Gc, area Sc, and circularity Cc of the child region extracted in step 314 as the new parent-region barycentric coordinates Gp, area Sp, and circularity Cp, and returns to step 307 to repeat the extraction process.
- if no child region is extracted, the parent region information is retrieved from the branch information storage array, and extraction processing is started for a branch different from the one already processed.
- the CPU 11 creates a three-dimensional image of the extracted bronchi based on the information stored in the extraction region information storage array, and displays it on the display via the display memory.
- the extracted three-dimensional bronchial image is displayed in the image display area 800 on the user interface, as shown for example in FIG. 36.
- in the above description a circular ROI was used, but ROIs of other shapes may also be used.
- a straight line 902 is drawn outward from the center of gravity 901 of the parent region 900.
- a point 905 is placed on the line at a fixed distance 904 from the point 903 where the line intersects the outer periphery of the parent region. A point cloud is obtained by rotating the straight line 902 through 360° about the center of gravity 901 of the parent region.
- the line 906 connecting this point cloud may be used as the outer circumference of the ROI.
- the distance 904 is set to an arbitrary size in step 321.
- alternatively, a rectangular region 908 surrounding the parent region 907 is obtained, along with its long-side length 909 and short-side length 910.
- a rectangular region 913, whose long side 911 and short side 912 are obtained by multiplying the long-side length 909 and the short-side length 910 of the rectangular region 908 by a constant, may be used as the ROI.
- similarly, an ellipse 916 having a long axis 914 and a short axis 915 may be obtained and used as the ROI.
- in the above, the ROI setting threshold TROI used for determining whether the ROI is appropriate was fixed to a constant value, but it may instead be made variable.
- a direction vector 702 that passes through the center of gravity 701 of the parent region 700 in FIG. 35 and is orthogonal to the parent region is used as a reference direction vector, and an ROI is created while varying an angle 703 from the reference direction vector 702.
- the ROI setting threshold may be changed for each angle 703. The variable ROI setting threshold is described below with reference to the figure.
- the operator gives the lower limit TROI1 and the upper limit TROI2 of the ROI setting threshold. These values may be entered by the operator directly on the user interface, or internally preset values may be retained.
- FIG. 39 shows the order of extracting bronchial branches according to the seventh embodiment.
- the bronchi have a tree structure as shown in Fig. 39, and extraction is performed from the top to the bottom of the tree structure.
- branch information is stored upon reaching the branch, and the extraction is further advanced toward the bottom.
- the branch on the left side is preferentially extracted at the branch part.
- when the branch being extracted reaches the bottom, the process returns to the earliest stored branch point in the branch information and proceeds to the branch on the opposite side from the one previously extracted. In other words, extraction does not stop on reaching a bifurcation; one branch is extracted all the way to the bottom before moving on to the next branch.
- in Fig. 39, the order in which each branch is extracted up to its bifurcation point is shown as a numeral.
- the order in which each branch point is extracted is represented by a circled number.
- with the angle 1001 denoted θmax, the ROI setting threshold TROI(θ) for the direction vector 1003 inclined at an angle θ from the reference direction vector may be given by equation (3).
- TROI(θ) = TROI1 + (TROI2 - TROI1) × sin θ / sin θmax (3)
- alternatively, with the angle 1001 set to θmax, the ROI setting threshold TROI(θ) of the direction vector 1003 may be given as in equation (4).
- TROI(θ) = TROI1 + (TROI2 - TROI1) × θ / θmax (4)
- the ROI setting threshold may also be determined in proportion to the parent region area Sp.
- here too, the operator gives the lower limit TROI1 and the upper limit TROI2 of the ROI setting threshold. These values may be entered directly on the user interface, or internally preset values may be retained.
- with the ROI setting threshold equal to TROI1 at a parent-region area S1 and to TROI2 at a parent-region area S2, the ROI setting threshold TROI(Sp) may be given as in equation (5).
- TROI(Sp) = TROI1 + (TROI2 - TROI1) × (Sp - S1) / (S2 - S1) (5)
- S1 and S2 may be given by the operator, or internally preset values may be retained.
- the ROI setting threshold TROI given by the above formulas (4) and (5) can be computed at high speed, and the computation can be accelerated further by preparing a lookup table. Alternatively, an empirically determined ROI setting threshold TROI, not based on formulas (4) and (5), may be stored as a reference table.
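Equations (3)–(5) and the lookup-table speed-up could be realized as in the following sketch (the TROI1, TROI2, θmax values and the one-degree table granularity are placeholders of ours):

```python
import math

def troi_by_angle(theta, troi1, troi2, theta_max, use_sin=True):
    """Equation (3) (sine form) or equation (4) (linear form)."""
    if use_sin:
        return troi1 + (troi2 - troi1) * math.sin(theta) / math.sin(theta_max)
    return troi1 + (troi2 - troi1) * theta / theta_max

def troi_by_area(sp, troi1, troi2, s1, s2):
    """Equation (5): threshold interpolated by the parent-region area Sp."""
    return troi1 + (troi2 - troi1) * (sp - s1) / (s2 - s1)

# Precomputed lookup table, one entry per degree, for the speed-up above.
THETA_MAX = math.radians(90)
TROI_TABLE = [troi_by_angle(math.radians(a), 0.7, 0.95, THETA_MAX)
              for a in range(0, 91)]

def troi_lookup(theta_deg: int) -> float:
    return TROI_TABLE[theta_deg]
```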
- the eighth embodiment differs from the seventh embodiment in the order in which the bronchial branches are extracted.
- the symmetry between the left and right sides during extraction is taken into account.
- the order of extracting the branches of the bronchi in the eighth embodiment will be described.
- in the eighth embodiment, once a bifurcation is reached, extraction in that direction is stopped and the branch information is stored. The process then returns to the previous branch point and extracts the branch in a different direction. Because extraction returns to the branch point in the next higher hierarchy each time a bifurcation is reached, the left and right sides can be extracted symmetrically.
- the order in which each branch is extracted in the eighth embodiment is represented by numerical values in FIG.
- the order in which each branch point is extracted is indicated by a circled number.
- each bronchial branch can be extracted with good symmetry for each hierarchy.
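The two extraction orders amount to depth-first (seventh embodiment) versus breadth-first (eighth embodiment) traversal of the bronchial tree. A sketch follows; treating the seventh embodiment's stored-branch return as LIFO is our assumption:

```python
from collections import deque

def extract_depth_first(root, children):
    """Seventh embodiment: follow one branch to the bottom of the tree,
    then resume from a stored branch point, left branch first."""
    order, stack = [], [root]
    while stack:
        node = stack.pop()
        order.append(node)
        stack.extend(reversed(children(node)))  # push right first, pop left
    return order

def extract_breadth_first(root, children):
    """Eighth embodiment: stop at each bifurcation, store it, and resume
    from the branch point in the next higher hierarchy, so each level is
    extracted with left-right symmetry."""
    order, queue = [], deque([root])
    while queue:
        node = queue.popleft()
        order.append(node)
        queue.extend(children(node))
    return order

# Toy bronchial tree with hypothetical labels:
tree = {"trachea": ["L", "R"], "L": ["L1", "L2"],
        "R": ["R1", "R2"], "L1": [], "L2": [], "R1": [], "R2": []}
print(extract_depth_first("trachea", tree.__getitem__))
# ['trachea', 'L', 'L1', 'L2', 'R', 'R1', 'R2']
print(extract_breadth_first("trachea", tree.__getitem__))
# ['trachea', 'L', 'R', 'L1', 'L2', 'R1', 'R2']
```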
- in the above, the extraction result is displayed after the extraction processing of all regions is completed.
- alternatively, the extraction progress may be displayed three-dimensionally during extraction.
- when the operator presses the extraction end button while observing the extraction progress, the extraction may be ended at that point.
- the extraction time can be shortened by terminating the extraction halfway.
- in the above, extraction is continued along each peripheral branch of the bronchi until no appropriate region remains.
- alternatively, an area termination condition threshold TSfin may be set; when the area Sc of the child region being extracted satisfies Sc ≤ TSfin, the extraction of that branch is terminated, and the process returns to a branch point to continue extracting the remaining branches.
- TSfin may be defined internally in advance, or the operator may directly input a numerical value into the area termination condition threshold display area on the user interface.
- the bronchial core line can be obtained from the bronchial region extracted in the seventh and eighth embodiments.
- the extraction was performed while obtaining the cross sections 1301–1311 perpendicular to the bronchial running direction,
- and the barycentric coordinates 1312–1322 of each cross section were obtained.
- the line 1323 connecting the acquired barycentric coordinates 1312–1322 can be regarded as the core line.
- adjacent barycenters may be connected by straight lines, or by interpolation such as spline interpolation.
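A sketch of the core-line construction from the barycentres; scipy's `CubicSpline` stands in for the "spline interpolation" alternative (straight-line joining would simply connect consecutive points):

```python
import numpy as np
from scipy.interpolate import CubicSpline

def core_line(barycenters, samples_per_segment=10):
    """Connect the cross-section barycentres (an (N, 3) array) into a
    smooth core line by cubic-spline interpolation."""
    pts = np.asarray(barycenters, float)
    t = np.arange(len(pts))
    spline = CubicSpline(t, pts, axis=0)
    t_fine = np.linspace(0, len(pts) - 1,
                         (len(pts) - 1) * samples_per_segment + 1)
    return spline(t_fine)
```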
- the CPU 11 extracts a blood vessel from a captured X-ray image 1400 of the head. After extracting the initial parent region as in the first embodiment, the next region is extracted using the parent region information.
- an image 1401 is an enlarged view of an area 1402 in the image 1400; the blood vessel 1403 within it is to be extracted.
- the CPU 11 extracts the next region using the blood vessel portion on the line segment 1404 as a parent region.
- the density distribution on line segment 1404 is given as shown at 1405.
- the CPU 11 obtains a circle having a radius 1407 from the midpoint 1406 of the line segment 1404.
- the CPU 11 obtains a density distribution 1409 at a tangent 1408 at an arbitrary point on the circumference.
- the CPU 11 obtains the tangent lines 1410 and 1411 at each point on the circumference and obtains the corresponding density distributions 1412 and 1413.
- the CPU 11 computes the correlation between the density distribution 1405 of the parent region and each of the density distributions 1409, 1412, and 1413 on the tangents, and adopts as the child region the one having the largest correlation with the parent region's density distribution 1405.
- here, the region on the tangent line 1410 having the density distribution 1412 is adopted as the child region. Part or all of the head blood vessels can be extracted by repeatedly applying the above processing with the adopted child region as the new parent region.
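The profile-correlation selection could be sketched as below (assuming all density profiles are sampled with the same number of points; normalized correlation via `np.corrcoef` is our choice of correlation measure):

```python
import numpy as np

def pick_child_by_correlation(parent_profile, tangent_profiles):
    """Adopt as the child region the tangent whose density profile
    correlates best with the parent's profile (1409/1412/1413 vs. 1405
    in the text).  Returns the index of the winning tangent."""
    p = np.asarray(parent_profile, float)
    best_idx, best_corr = -1, -np.inf
    for i, profile in enumerate(tangent_profiles):
        corr = np.corrcoef(p, np.asarray(profile, float))[0, 1]
        if corr > best_corr:
            best_idx, best_corr = i, corr
    return best_idx, best_corr
```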
- without a safeguard, an already extracted region may be extracted again.
- suppose the extraction direction is 1500 and the parent region lies on tangent 1501;
- the density distributions on tangents 1501 and 1502 are then as shown at 1503 and 1504, respectively. Since the correlation between these density distributions is high, the region on tangent 1502, which has already been extracted, could be adopted again as the child region. Therefore, the directions in which tangents are drawn on the circumference are limited, for example, to directions at least θ° to the left or right of the line connecting the center of the parent region and the center of the region extracted immediately before it.
- in the above embodiments, the bronchi were described as an example, and in the eleventh embodiment blood vessels were described as an example.
- the present invention is applicable to all luminal organs, such as the bronchi, blood vessels, and intestinal tract.
- likewise, the modalities capturing the images are not limited to X-ray CT, MRI, and ultrasound devices. For example, even when the density value changes gradually over the entire image, as in an image taken by an MRI tomography apparatus, the present invention can be applied, although the threshold values need to be adjusted accordingly.
- according to the above embodiments, a luminal organ can be extracted in a single extraction process; without performing thinning processing after extraction to determine the center line separately, both the luminal organ and its core line can be extracted, and the direction vector at each point on the core line can be obtained.
- the present invention makes it possible to selectively diagnose only those portions of an organ whose shape has changed due to disease, and to notify the operator of the shape change of the diagnosed portion visually, such as by image display, and/or audibly, such as by sound or voice. As a result, diagnostic throughput can be improved.
- furthermore, an organ region is appropriately extracted based on the set threshold values, and a more accurate three-dimensional image can be formed from the extracted organ region.
Landscapes
- Life Sciences & Earth Sciences (AREA)
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Biomedical Technology (AREA)
- Surgery (AREA)
- High Energy & Nuclear Physics (AREA)
- Physics & Mathematics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Optics & Photonics (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Veterinary Medicine (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Biophysics (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Human Computer Interaction (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
- Magnetic Resonance Imaging Apparatus (AREA)
- Ultra Sonic Daignosis Equipment (AREA)
- Image Processing (AREA)
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US10/566,666 US7894646B2 (en) | 2003-08-01 | 2004-07-29 | Medical image diagnosis support device and method for calculating degree of deformation from normal shapes of organ regions |
| JP2005512511A JP4416736B2 (ja) | 2003-08-01 | 2004-07-29 | 医用画像診断支援装置及びプログラム |
Applications Claiming Priority (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2003-284919 | 2003-08-01 | ||
| JP2003284919 | 2003-08-01 | ||
| JP2003-313424 | 2003-09-05 | ||
| JP2003313424 | 2003-09-05 | ||
| JP2004117734 | 2004-04-13 | ||
| JP2004-117734 | 2004-04-13 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2005011501A1 true WO2005011501A1 (ja) | 2005-02-10 |
Family
ID=34119572
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2004/010835 Ceased WO2005011501A1 (ja) | 2003-08-01 | 2004-07-29 | 医用画像診断支援装置及び方法 |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US7894646B2 (ja) |
| JP (1) | JP4416736B2 (ja) |
| WO (1) | WO2005011501A1 (ja) |
Cited By (40)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2005224460A (ja) * | 2004-02-16 | 2005-08-25 | Hitachi Medical Corp | 医用画像診断装置 |
| JP2006314790A (ja) * | 2005-05-13 | 2006-11-24 | Tomtec Imaging Syst Gmbh | 2次元断面画像を再構築するための方法及び装置 |
| JP2007020731A (ja) * | 2005-07-13 | 2007-02-01 | Matsushita Electric Ind Co Ltd | 超音波診断装置 |
| JP2007044121A (ja) * | 2005-08-08 | 2007-02-22 | Hitachi Medical Corp | 医用画像表示方法及び装置 |
| JP2007117108A (ja) * | 2005-10-24 | 2007-05-17 | Med Solution Kk | 器官の状態変化を評価する装置およびプログラム |
| JP2007282945A (ja) * | 2006-04-19 | 2007-11-01 | Toshiba Corp | 画像処理装置 |
| WO2007122896A1 (ja) | 2006-03-29 | 2007-11-01 | Hitachi Medical Corporation | 医用画像表示システム及び医用画像表示プログラム |
| JP2008000270A (ja) * | 2006-06-21 | 2008-01-10 | Aze Ltd | 生体組織の識別画像作成方法、装置およびプログラム |
| JP2008036284A (ja) * | 2006-08-09 | 2008-02-21 | Toshiba Corp | 医用画像合成方法及びその装置 |
| WO2008152938A1 (ja) * | 2007-06-11 | 2008-12-18 | Hitachi Medical Corporation | 医用画像表示装置、医用画像表示方法及び医用画像表示プログラム |
| JP2009153677A (ja) * | 2007-12-26 | 2009-07-16 | Konica Minolta Medical & Graphic Inc | 動態画像処理システム |
| JP2009165615A (ja) * | 2008-01-16 | 2009-07-30 | Fujifilm Corp | 腫瘍領域サイズ測定方法および装置ならびにプログラム |
| JP2009183508A (ja) * | 2008-02-07 | 2009-08-20 | Hitachi Medical Corp | 画像診断支援装置 |
| JP2009195585A (ja) * | 2008-02-25 | 2009-09-03 | Toshiba Corp | 超音波診断装置、及び超音波診断装置の制御プログラム |
| JP2010504794A (ja) * | 2006-09-29 | 2010-02-18 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | 突起検出方法、システム及びコンピュータプログラム |
| JP2010075549A (ja) * | 2008-09-26 | 2010-04-08 | Toshiba Corp | 画像処理装置 |
| JP2010082374A (ja) * | 2008-10-02 | 2010-04-15 | Toshiba Corp | 画像表示装置及び画像表示方法 |
| JP2010158452A (ja) * | 2009-01-09 | 2010-07-22 | Fujifilm Corp | 画像処理装置および方法並びにプログラム |
| JP2010167188A (ja) * | 2009-01-26 | 2010-08-05 | Toshiba Corp | 医用画像診断装置、画像データ出力装置及び画像データ出力用制御プログラム |
| US7853304B2 (en) | 2005-05-13 | 2010-12-14 | Tomtec Imaging Systems Gmbh | Method and device for reconstructing two-dimensional sectional images |
| JP2011135937A (ja) * | 2009-12-25 | 2011-07-14 | Toshiba Corp | 医用画像処理装置、医用画像処理プログラム及び医用画像診断装置 |
| JP2011139767A (ja) * | 2010-01-06 | 2011-07-21 | Toshiba Corp | 医用画像の表示装置及び表示方法 |
| JP2011139821A (ja) * | 2010-01-08 | 2011-07-21 | Toshiba Corp | 医用画像診断装置 |
| JP2011530363A (ja) * | 2008-08-11 | 2011-12-22 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | 細長い要素の変形の識別 |
| JP2012040207A (ja) * | 2010-08-19 | 2012-03-01 | Toshiba Corp | 超音波診断装置、超音波診断装置の制御プログラム、及び画像処理装置 |
| JP2012509133A (ja) * | 2009-09-10 | 2012-04-19 | インフィニット ヘルスケア カンパニー リミテッド | 仮想内視鏡装置とその駆動方法及び検診装置 |
| JP2012110549A (ja) * | 2010-11-26 | 2012-06-14 | Fujifilm Corp | 医用画像処理装置および方法、並びにプログラム |
| JP2013517914A (ja) * | 2010-01-28 | 2013-05-20 | ラドロジックス, インコーポレイテッド | 医療画像を分析、優先順位付与、視覚化、および報告するための方法およびシステム |
| WO2015037510A1 (ja) * | 2013-09-10 | 2015-03-19 | 日立アロカメディカル株式会社 | 超音波診断装置 |
| JP2017170172A (ja) * | 2010-10-19 | 2017-09-28 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | 医用画像システム |
| JP2018042627A (ja) * | 2016-09-12 | 2018-03-22 | キヤノンメディカルシステムズ株式会社 | 医用画像診断装置及び医用画像処理装置 |
| JP2018183589A (ja) * | 2017-04-25 | 2018-11-22 | バイオセンス・ウエブスター・(イスラエル)・リミテッドBiosense Webster (Israel), Ltd. | 狭い通路における侵襲的手順の内視鏡画像 |
| JP2019005034A (ja) * | 2017-06-22 | 2019-01-17 | 株式会社根本杏林堂 | 医用画像処理装置、医用画像処理システムおよび医用画像処理方法 |
| JP2020096773A (ja) * | 2018-12-19 | 2020-06-25 | 富士フイルム株式会社 | 医用画像処理装置、方法およびプログラム |
| JP2020146200A (ja) * | 2019-03-13 | 2020-09-17 | キヤノン株式会社 | 画像処理装置、画像閲覧装置および画像処理システム |
| JPWO2019078102A1 (ja) * | 2017-10-20 | 2020-10-22 | 富士フイルム株式会社 | 医療画像処理装置 |
| JPWO2021029292A1 (ja) * | 2019-08-13 | 2021-02-18 | ||
| JP2022046808A (ja) * | 2017-06-22 | 2022-03-23 | 株式会社根本杏林堂 | 医用画像処理装置、医用画像処理システムおよび医用画像処理方法 |
| US11475568B2 (en) | 2018-05-16 | 2022-10-18 | Panasonic Holdings Corporation | Method for controlling display of abnormality in chest x-ray image, storage medium, abnormality display control apparatus, and server apparatus |
| JP2024072310A (ja) * | 2022-11-16 | 2024-05-28 | コニカミノルタ株式会社 | プログラム、表示装置、表示システム及び表示方法 |
Families Citing this family (32)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7616801B2 (en) | 2002-11-27 | 2009-11-10 | Hologic, Inc. | Image handling and display in x-ray mammography and tomosynthesis |
| WO2006058160A2 (en) | 2004-11-26 | 2006-06-01 | Hologic, Inc. | Integrated multi-mode mammography/tomosynthesis x-ray system and method |
| US8565372B2 (en) | 2003-11-26 | 2013-10-22 | Hologic, Inc | System and method for low dose tomosynthesis |
| US7123684B2 (en) | 2002-11-27 | 2006-10-17 | Hologic, Inc. | Full field mammography with tissue exposure control, tomosynthesis, and dynamic field of view processing |
| US10638994B2 (en) | 2002-11-27 | 2020-05-05 | Hologic, Inc. | X-ray mammography with tomosynthesis |
| US7577282B2 (en) | 2002-11-27 | 2009-08-18 | Hologic, Inc. | Image handling and display in X-ray mammography and tomosynthesis |
| JP3932303B2 (ja) * | 2005-05-13 | 2007-06-20 | 独立行政法人放射線医学総合研究所 | 臓器動態の定量化方法、装置、臓器位置の予測方法、装置、放射線照射方法、装置及び臓器異常検出装置 |
| JP4388104B2 (ja) * | 2007-06-29 | 2009-12-24 | ザイオソフト株式会社 | 画像処理方法、画像処理プログラム及び画像処理装置 |
| JP4541434B2 (ja) * | 2008-07-14 | 2010-09-08 | ザイオソフト株式会社 | 画像処理装置及び画像処理プログラム |
| US20100063842A1 (en) * | 2008-09-08 | 2010-03-11 | General Electric Company | System and methods for indicating an image location in an image stack |
| WO2010055817A1 (ja) * | 2008-11-13 | 2010-05-20 | 株式会社 日立メディコ | 画像処理装置及び画像処理方法 |
| CN102576471B (zh) * | 2008-12-18 | 2015-09-23 | 皇家飞利浦电子股份有限公司 | 生成医学图像的视图 |
| GB2475722B (en) * | 2009-11-30 | 2011-11-02 | Mirada Medical | Measurement system for medical images |
| US9378331B2 (en) | 2010-11-19 | 2016-06-28 | D.R. Systems, Inc. | Annotation and assessment of images |
| JP5971682B2 (ja) * | 2011-03-02 | 2016-08-17 | 東芝メディカルシステムズ株式会社 | 磁気共鳴イメージング装置 |
| JP6039903B2 (ja) * | 2012-01-27 | 2016-12-07 | キヤノン株式会社 | 画像処理装置、及びその作動方法 |
| JP6426144B2 (ja) * | 2013-03-19 | 2018-11-21 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | 医療システムに対する聴覚に関する機能強化 |
| US10127662B1 (en) | 2014-08-11 | 2018-11-13 | D.R. Systems, Inc. | Systems and user interfaces for automated generation of matching 2D series of medical images and efficient annotation of matching 2D medical images |
| US11076820B2 (en) | 2016-04-22 | 2021-08-03 | Hologic, Inc. | Tomosynthesis with shifting focal spot x-ray system using an addressable array |
| JP6955303B2 (ja) * | 2017-04-12 | 2021-10-27 | 富士フイルム株式会社 | 医用画像処理装置および方法並びにプログラム |
| CN110868907B (zh) * | 2017-04-28 | 2022-05-17 | 奥林巴斯株式会社 | 内窥镜诊断辅助系统、存储介质和内窥镜诊断辅助方法 |
| US10552978B2 (en) * | 2017-06-27 | 2020-02-04 | International Business Machines Corporation | Dynamic image and image marker tracking |
| JP6738305B2 (ja) * | 2017-06-30 | 2020-08-12 | 富士フイルム株式会社 | 学習データ生成支援装置および学習データ生成支援装置の作動方法並びに学習データ生成支援プログラム |
| EP3668404B1 (en) | 2017-08-16 | 2022-07-06 | Hologic, Inc. | Techniques for breast imaging patient motion artifact compensation |
| EP3449835B1 (en) | 2017-08-22 | 2023-01-11 | Hologic, Inc. | Computed tomography system and method for imaging multiple anatomical targets |
| US11090017B2 (en) | 2018-09-13 | 2021-08-17 | Hologic, Inc. | Generating synthesized projection images for 3D breast tomosynthesis or multi-mode x-ray breast imaging |
| DE112019004880T5 (de) * | 2018-09-27 | 2021-07-01 | Hoya Corporation | Elektronisches endoskopsystem |
| EP3832689A3 (en) | 2019-12-05 | 2021-08-11 | Hologic, Inc. | Systems and methods for improved x-ray tube life |
| US11471118B2 (en) | 2020-03-27 | 2022-10-18 | Hologic, Inc. | System and method for tracking x-ray tube focal spot position |
| US11786191B2 (en) | 2021-05-17 | 2023-10-17 | Hologic, Inc. | Contrast-enhanced tomosynthesis with a copper filter |
| CN116580071A (zh) * | 2022-01-29 | 2023-08-11 | 佳能医疗系统株式会社 | 医用图像配准方法、医用图像处理装置、存储介质及程序产品 |
| US12414217B2 (en) | 2022-02-07 | 2025-09-09 | Hologic, Inc. | Systems and methods for adaptively controlling filament current in an X-ray tube |
Family Cites Families (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5603318A (en) * | 1992-04-21 | 1997-02-18 | University Of Utah Research Foundation | Apparatus and method for photogrammetric surgical localization |
| JP3679512B2 (ja) * | 1996-07-05 | 2005-08-03 | キヤノン株式会社 | 画像抽出装置および方法 |
| JP2984652B2 (ja) * | 1997-08-22 | 1999-11-29 | 富士通株式会社 | 領域抽出装置及び領域抽出方法並びにコンピュータで実現可能なプログラムが記憶された記録媒体 |
| US6301498B1 (en) * | 1998-04-17 | 2001-10-09 | Cornell Research Foundation, Inc. | Method of determining carotid artery stenosis using X-ray imagery |
| US6901156B2 (en) * | 2000-02-04 | 2005-05-31 | Arch Development Corporation | Method, system and computer readable medium for an intelligent search workstation for computer assisted interpretation of medical images |
| US6643533B2 (en) * | 2000-11-28 | 2003-11-04 | Ge Medical Systems Global Technology Company, Llc | Method and apparatus for displaying images of tubular structures |
| JP2004041694A (ja) * | 2002-05-13 | 2004-02-12 | Fuji Photo Film Co Ltd | 画像生成装置およびプログラム、画像選択装置、画像出力装置、画像提供サービスシステム |
2004
- 2004-07-29 US US10/566,666 patent/US7894646B2/en active Active
- 2004-07-29 JP JP2005512511A patent/JP4416736B2/ja not_active Expired - Lifetime
- 2004-07-29 WO PCT/JP2004/010835 patent/WO2005011501A1/ja not_active Ceased
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH08336524A (ja) * | 1995-06-13 | 1996-12-24 | Shimadzu Corp | X線画像処理装置 |
| JP2002207992A (ja) * | 2001-01-12 | 2002-07-26 | Hitachi Ltd | 画像処理方法及び画像処理装置 |
| JP2003070781A (ja) * | 2001-09-04 | 2003-03-11 | Hitachi Medical Corp | 医用画像診断支援装置 |
Cited By (56)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2005224460A (ja) * | 2004-02-16 | 2005-08-25 | Hitachi Medical Corp | 医用画像診断装置 |
| JP2006314790A (ja) * | 2005-05-13 | 2006-11-24 | Tomtec Imaging Syst Gmbh | 2次元断面画像を再構築するための方法及び装置 |
| US7853304B2 (en) | 2005-05-13 | 2010-12-14 | Tomtec Imaging Systems Gmbh | Method and device for reconstructing two-dimensional sectional images |
| JP2007020731A (ja) * | 2005-07-13 | 2007-02-01 | Matsushita Electric Ind Co Ltd | 超音波診断装置 |
| JP2007044121A (ja) * | 2005-08-08 | 2007-02-22 | Hitachi Medical Corp | 医用画像表示方法及び装置 |
| JP2007117108A (ja) * | 2005-10-24 | 2007-05-17 | Med Solution Kk | 器官の状態変化を評価する装置およびプログラム |
| JPWO2007122896A1 (ja) * | 2006-03-29 | 2009-09-03 | 株式会社日立メディコ | 医用画像表示システム及び医用画像表示プログラム |
| WO2007122896A1 (ja) | 2006-03-29 | 2007-11-01 | Hitachi Medical Corporation | 医用画像表示システム及び医用画像表示プログラム |
| US8107701B2 (en) | 2006-03-29 | 2012-01-31 | Hitachi Medical Corporation | Medical image display system and medical image display program |
| JP2007282945A (ja) * | 2006-04-19 | 2007-11-01 | Toshiba Corp | 画像処理装置 |
| JP2008000270A (ja) * | 2006-06-21 | 2008-01-10 | Aze Ltd | 生体組織の識別画像作成方法、装置およびプログラム |
| JP2008036284A (ja) * | 2006-08-09 | 2008-02-21 | Toshiba Corp | 医用画像合成方法及びその装置 |
| JP2010504794A (ja) * | 2006-09-29 | 2010-02-18 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | 突起検出方法、システム及びコンピュータプログラム |
| WO2008152938A1 (ja) * | 2007-06-11 | 2008-12-18 | Hitachi Medical Corporation | 医用画像表示装置、医用画像表示方法及び医用画像表示プログラム |
| JP2009153677A (ja) * | 2007-12-26 | 2009-07-16 | Konica Minolta Medical & Graphic Inc | 動態画像処理システム |
| JP2009165615A (ja) * | 2008-01-16 | 2009-07-30 | Fujifilm Corp | 腫瘍領域サイズ測定方法および装置ならびにプログラム |
| JP2009183508A (ja) * | 2008-02-07 | 2009-08-20 | Hitachi Medical Corp | 画像診断支援装置 |
| JP2009195585A (ja) * | 2008-02-25 | 2009-09-03 | Toshiba Corp | 超音波診断装置、及び超音波診断装置の制御プログラム |
| JP2011530363A (ja) * | 2008-08-11 | 2011-12-22 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | 細長い要素の変形の識別 |
| JP2010075549A (ja) * | 2008-09-26 | 2010-04-08 | Toshiba Corp | 画像処理装置 |
| JP2010082374A (ja) * | 2008-10-02 | 2010-04-15 | Toshiba Corp | 画像表示装置及び画像表示方法 |
| US9214139B2 (en) | 2008-10-02 | 2015-12-15 | Kabushiki Kaisha Toshiba | Image display apparatus and image display method |
| JP2010158452A (ja) * | 2009-01-09 | 2010-07-22 | Fujifilm Corp | 画像処理装置および方法並びにプログラム |
| JP2010167188A (ja) * | 2009-01-26 | 2010-08-05 | Toshiba Corp | 医用画像診断装置、画像データ出力装置及び画像データ出力用制御プログラム |
| JP2012509133A (ja) * | 2009-09-10 | 2012-04-19 | インフィニット ヘルスケア カンパニー リミテッド | 仮想内視鏡装置とその駆動方法及び検診装置 |
| JP2011135937A (ja) * | 2009-12-25 | 2011-07-14 | Toshiba Corp | 医用画像処理装置、医用画像処理プログラム及び医用画像診断装置 |
| JP2011139767A (ja) * | 2010-01-06 | 2011-07-21 | Toshiba Corp | 医用画像の表示装置及び表示方法 |
| JP2011139821A (ja) * | 2010-01-08 | 2011-07-21 | Toshiba Corp | 医用画像診断装置 |
| JP2013517914A (ja) * | 2010-01-28 | 2013-05-20 | ラドロジックス, インコーポレイテッド | 医療画像を分析、優先順位付与、視覚化、および報告するための方法およびシステム |
| JP2018163680A (ja) * | 2010-01-28 | 2018-10-18 | ラドロジックス, インコーポレイテッド | 医療画像を分析、優先順位付与、視覚化、および報告するための方法およびシステム |
| JP2012040207A (ja) * | 2010-08-19 | 2012-03-01 | Toshiba Corp | 超音波診断装置、超音波診断装置の制御プログラム、及び画像処理装置 |
| JP2017170172A (ja) * | 2010-10-19 | 2017-09-28 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | 医用画像システム |
| JP2012110549A (ja) * | 2010-11-26 | 2012-06-14 | Fujifilm Corp | 医用画像処理装置および方法、並びにプログラム |
| US9024941B2 (en) | 2010-11-26 | 2015-05-05 | Fujifilm Corporation | Sequentially displaying virtual endoscopic images by setting an observation path to compensate for a curved region of the tubular structure |
| JP2015053957A (ja) * | 2013-09-10 | 2015-03-23 | 日立アロカメディカル株式会社 | 超音波診断装置 |
| WO2015037510A1 (ja) * | 2013-09-10 | 2015-03-19 | 日立アロカメディカル株式会社 | 超音波診断装置 |
| JP2018042627A (ja) * | 2016-09-12 | 2018-03-22 | キヤノンメディカルシステムズ株式会社 | 医用画像診断装置及び医用画像処理装置 |
| JP2018183589A (ja) * | 2017-04-25 | 2018-11-22 | バイオセンス・ウエブスター・(イスラエル)・リミテッドBiosense Webster (Israel), Ltd. | 狭い通路における侵襲的手順の内視鏡画像 |
| JP7505081B2 (ja) | 2017-04-25 | 2024-06-24 | バイオセンス・ウエブスター・(イスラエル)・リミテッド | 狭い通路における侵襲的手順の内視鏡画像 |
| JP2023080220A (ja) * | 2017-04-25 | 2023-06-08 | バイオセンス・ウエブスター・(イスラエル)・リミテッド | 狭い通路における侵襲的手順の内視鏡画像 |
| JP2019005034A (ja) * | 2017-06-22 | 2019-01-17 | 株式会社根本杏林堂 | 医用画像処理装置、医用画像処理システムおよび医用画像処理方法 |
| JP2023110069A (ja) * | 2017-06-22 | 2023-08-08 | 株式会社根本杏林堂 | 医用画像処理装置、医用画像処理システムおよび医用画像処理方法 |
| JP2022046808A (ja) * | 2017-06-22 | 2022-03-23 | 株式会社根本杏林堂 | 医用画像処理装置、医用画像処理システムおよび医用画像処理方法 |
| JP7298949B2 (ja) | 2017-06-22 | 2023-06-27 | 株式会社根本杏林堂 | 医用画像処理装置、医用画像処理システムおよび医用画像処理方法 |
| JP7017220B2 (ja) | 2017-06-22 | 2022-02-08 | 株式会社根本杏林堂 | 医用画像処理装置、医用画像処理システムおよび医用画像処理方法 |
| JPWO2019078102A1 (ja) * | 2017-10-20 | 2020-10-22 | 富士フイルム株式会社 | 医療画像処理装置 |
| JP7059297B2 (ja) | 2017-10-20 | 2022-04-25 | 富士フイルム株式会社 | 医療画像処理装置 |
| US11379977B2 (en) | 2017-10-20 | 2022-07-05 | Fujifilm Corporation | Medical image processing device |
| US11475568B2 (en) | 2018-05-16 | 2022-10-18 | Panasonic Holdings Corporation | Method for controlling display of abnormality in chest x-ray image, storage medium, abnormality display control apparatus, and server apparatus |
| JP2020096773A (ja) * | 2018-12-19 | 2020-06-25 | 富士フイルム株式会社 | 医用画像処理装置、方法およびプログラム |
| JP7317528B2 (ja) | 2019-03-13 | 2023-07-31 | キヤノン株式会社 | 画像処理装置、画像処理システムおよび制御方法 |
| JP2020146200A (ja) * | 2019-03-13 | 2020-09-17 | キヤノン株式会社 | 画像処理装置、画像閲覧装置および画像処理システム |
| WO2021029292A1 (ja) * | 2019-08-13 | 2021-02-18 | 富士フイルム株式会社 | 画像診断支援装置、内視鏡システム、画像診断支援方法、及び画像診断支援プログラム |
| JP7290729B2 (ja) | 2019-08-13 | 2023-06-13 | 富士フイルム株式会社 | 画像診断支援装置、内視鏡システム、画像診断支援装置の作動方法、及び画像診断支援プログラム |
| JPWO2021029292A1 (ja) * | 2019-08-13 | 2021-02-18 | ||
| JP2024072310A (ja) * | 2022-11-16 | 2024-05-28 | コニカミノルタ株式会社 | プログラム、表示装置、表示システム及び表示方法 |
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2005011501A1 (ja) | 2007-09-27 |
| US7894646B2 (en) | 2011-02-22 |
| US20060280347A1 (en) | 2006-12-14 |
| JP4416736B2 (ja) | 2010-02-17 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP4416736B2 (ja) | 医用画像診断支援装置及びプログラム | |
| US6944330B2 (en) | Interactive computer-aided diagnosis method and system for assisting diagnosis of lung nodules in digital volumetric medical images | |
| CN101036165B (zh) | 用于树模型显像以检测肺栓塞的系统和方法 | |
| JP5031968B2 (ja) | デジタル腸サブトラクションおよびポリープ検出システムならびに関連技術 | |
| CN102007515B (zh) | 用于对肺动脉进行分割的方法和系统 | |
| EP2244633A2 (en) | Medical image reporting system and method | |
| JP2008510499A (ja) | 解剖学的可視化/測定システム | |
| JP2008521461A (ja) | 知識構造のマッピングを使用した管状器官を測定する方法 | |
| US12051156B2 (en) | System and method for linking a segmentation graph to volumetric data | |
| US20140257114A1 (en) | Image processing apparatus, image processing method, and computer-readable recording device | |
| JP2008520317A (ja) | 医療画像データ内の腫瘍境界を自動的に検出及び区分するシステム及び方法 | |
| JP2006246941A (ja) | 画像処理装置及び管走行トラッキング方法 | |
| CN113470060A (zh) | 基于ct影像的冠状动脉多角度曲面重建可视化方法 | |
| JP2007061622A (ja) | 気道内腔の直径、気道壁の厚さおよび気管支動脈比を使用して多断面コンピュータ断層撮影(msct)イメージデータの自動的な気道評価を行うためのシステムおよび方法 | |
| JP4169967B2 (ja) | 画像診断支援装置 | |
| JP2010075549A (ja) | 画像処理装置 | |
| Gibbs et al. | 3D path planning and extension for endoscopic guidance | |
| US12462389B2 (en) | Image processing method, image processing program, and image processing device for partitioning a 3D object at narrow parts of shape | |
| JP4336083B2 (ja) | 画像診断支援装置、画像診断支援方法 | |
| US20110285695A1 (en) | Pictorial Representation in Virtual Endoscopy | |
| WO2022176873A1 (ja) | 医療画像処理装置、医療画像処理方法およびプログラム | |
| JP4738236B2 (ja) | 画像表示装置 | |
| JP5001248B2 (ja) | 画像診断支援装置 | |
| Kiraly | 3D image analysis and visualization of tubular structures | |
| JP7783007B2 (ja) | 医用画像処理装置、医用画像処理方法及び医用画像処理プログラム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AK | Designated states | Kind code of ref document: A1. Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
| | AL | Designated countries for regional patents | Kind code of ref document: A1. Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| | WWE | Wipo information: entry into national phase | Ref document number: 2005512511. Country of ref document: JP |
| | WWE | Wipo information: entry into national phase | Ref document numbers: 2006280347, 10566666. Country of ref document: US |
| | 122 | Ep: pct application non-entry in european phase | |
| | WWP | Wipo information: published in national office | Ref document number: 10566666. Country of ref document: US |