HK1121843A1 - Methods for interactive liver disease diagnosis - Google Patents
- Publication number
- HK1121843A1 (application HK08109809.4A)
- Authority
- HK
- Hong Kong
- Prior art keywords
- vessel
- edge
- root
- graphical representation
- vessel segment
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/12—Edge-based segmentation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20092—Interactive image processing based on input by user
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30056—Liver; Hepatic
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/41—Medical
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- Software Systems (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Quality & Reliability (AREA)
- Computer Graphics (AREA)
- Apparatus For Radiation Diagnosis (AREA)
Abstract
A method and system are provided for interactive liver data processing. A vessel system with a plurality of vessel branches is obtained. Centerlines of the vessel branches are extracted and used to construct a graph representation of the vessel system. Each vessel branch in the vessel system can then be labeled based on the graph representation.
Description
The present application is a continuation-in-part of U.S. patent application Ser. No. 11/105,961, entitled "Liver Disease Diagnosis System, Method and Graphical User Interface," filed Apr. 14, 2005, and claims priority under 35 U.S.C. § 119(e) to U.S. provisional patent application No. 60/693,871, entitled "Interactive Liver Disease Diagnosis Method," filed Jun. 24, 2005, both of which are incorporated herein by reference in their entirety.
Technical Field
The present invention generally relates to a method and a graphical user interface for medical diagnosis. In particular, the present invention relates to a method and graphical user interface for computer-assisted medical diagnosis and a system incorporating the invention.
Background
Early detection of liver cancer has recently become possible due to rapid technological advances in diagnostic imaging systems. Detection and diagnosis of liver cancer typically involves multiple image acquisitions from a variety of image data acquisition devices. For example, Computed Tomography (CT) is the most common modality for early liver cancer detection and diagnosis. When CT imaging is used, up to four phases of images may be acquired for diagnostic purposes. The four phases include plain (non-contrast) CT images, arterial phase images, portal venous phase images, and delayed phase images. Images from other acquisition devices may also be used when the CT images are insufficient to reach a diagnosis; examples include Magnetic Resonance Imaging (MRI) and Positron Emission Tomography (PET). When large amounts of data are available, tools are needed to efficiently utilize the data and assist physicians or other medical personnel in increasing throughput.
Drawings
What is claimed and/or described herein is further described by way of exemplary embodiments. These exemplary embodiments are described in detail with reference to the accompanying drawings. These embodiments are non-limiting exemplary embodiments in which like reference numerals represent like structures throughout the several views of the drawings and wherein:
fig. 1 shows an exemplary configuration for a computer-assisted liver disease diagnosis system according to an embodiment of the present invention;
FIG. 2(a) shows an exemplary flow chart of a forced marking method;
FIG. 2(b) illustrates an alternative exemplary flow chart of the forced marking method;
FIG. 3(a) shows an exemplary flow chart for constructing a lesion index map;
FIG. 3(b) shows an exemplary lesion index map;
FIG. 3(c) shows a lesion information table associated with a lesion in the lesion index map;
FIG. 4(a) illustrates an exemplary flow chart of lesion segmentation, aortic segmentation, and lesion information display;
FIG. 4(b) illustrates an exemplary manner of presenting lesion segmentation results and related information;
fig. 4(c) shows an exemplary display of lesion segmentation results in 3D;
FIG. 4(d) illustrates exemplary controls for the display and extraction of lesion diagnostic information;
FIG. 5 illustrates an exemplary treatment planning/pre-operative assessment mechanism;
FIG. 6 illustrates an exemplary flow diagram of interactive liver segmentation;
FIG. 7(a) illustrates an exemplary method of local interactive liver segmentation adjustment based on a liver region manually delineated by a user;
FIG. 7(b) illustrates an exemplary method of local interactive liver segmentation adjustment based on a liver boundary manually traced by a user;
FIG. 8 shows an exemplary flow chart for interactive vessel analysis;
FIG. 9 illustrates an exemplary flow chart of vessel segmentation;
FIG. 10 shows an exemplary flow chart for separating portal veins from hepatic veins;
fig. 11 shows an exemplary vessel branch VOI for vessel separation;
FIG. 12 illustrates exemplary entry points and end points of a trace path;
FIG. 13 illustrates an exemplary path classified as a portal vein branch and a hepatic vein branch;
FIG. 14 illustrates exemplary assignment of connected portal and hepatic vein branch segments other than those on the tracked path;
FIG. 15 shows an example of a connected portal vein and hepatic vein separated by a separation plane;
FIG. 16 shows an example of separation adjustment;
FIG. 17 illustrates an exemplary flow chart of interactive vessel labeling;
FIG. 18 shows an exemplary graph representation of a vascular system;
fig. 19 shows an example of a labeled portal vein system.
Detailed Description
The present invention relates to a method and graphical user interface for liver disease diagnosis. Methods and graphical user interfaces are disclosed that facilitate coordinated retrieval of visual and non-visual data related to a patient and liver disease, processing of visual/non-visual data to extract diagnostic information, generation of a hierarchical representation of visual and non-visual diagnostic information, interactive exploration of a hierarchical structure of diagnostic information, and interactive diagnostic processes. Methods and graphical user interfaces for efficiently displaying data in different dimensions are also disclosed.
FIG. 1 illustrates an exemplary system 100 according to an embodiment of the invention. In this exemplary configuration, the system 100 includes a plurality of filters (filter 1 108, filter 2 112, and filter 3 110), a visual data processing mechanism 130, a liver disease diagnosis mechanism 140, and a diagnostic report generation mechanism 128. The system 100 may also include a search engine 104 that retrieves information related to the patient ID 102 from a patient database 106. The search engine 104 may access information stored in the patient database based on the received patient ID 102. The patient database 106 may be a local data store or a remote data store. The patient database 106 may be a single database or multiple databases, which may reside at a single location or be distributed over a network across multiple locations. The visual data processing mechanism 130 may further include a data visualization/processing mechanism 114, an automatic liver lesion detection mechanism 116, an interactive liver lesion detection mechanism 118, and a visual diagnostic information extraction mechanism 120. The liver disease diagnosis mechanism 140 may further include a hierarchical representation construction mechanism 122, a diagnosis-by-exploration and real-time interactive diagnosis controller 124, and a treatment planning/pre-operative assessment mechanism 126.
One function supported by the data visualization/processing mechanism 114 is that the user can mark the outline of a detected lesion for further analysis. However, when multi-phase volumetric data, such as CT or MRI, is used for diagnosis, a lesion may be marked multiple times. First, the lesion may span multiple image slices, and the user may mark it again on a different slice. Second, the lesion may be visible in multiple phases, and the user may mark it again in a different phase. To avoid any confusion caused by such duplicate marking, a forced marking method is used. FIG. 2(a) illustrates an exemplary flow chart of a forced marking method 200-a. The user places a mark in one phase, as shown in step 202, and its distance to all existing marks (the distance between the peripheral edge of the newly marked contour and the peripheral edge of each existing marked contour) is measured in steps 204 and 206, whether the existing marks were originally made in the same phase or mapped to it from other phases. When an existing mark is in a different phase from the new mark, the two marks being compared can be mapped into one phase volume before the distance is calculated.
In step 208, the calculated distance between the newly marked contour and an existing contour is compared to a preset threshold. If there is more than one existing contour in the database, the system compares the newly marked contour to all existing contours, as shown in step 210. If the closest distance between the newly marked contour and the existing contours is greater than the preset threshold, the newly marked contour is saved as a new mark in the database at step 212. The system then calculates and records the corresponding mark positions in the other phases at step 214.
If the distance is within the preset threshold, it is likely that the lesion to be marked has been previously marked. In that case, at step 216, a warning message may pop up to alert the user. If the user chooses to keep the mark, it may be added as a new mark at steps 212 and 214. Otherwise, it may be merged with the corresponding existing mark at step 220. If the marked lesion is considered a new lesion, its corresponding positions in the other phases may be identified using a mapping/registration technique and recorded as the same lesion. An exemplary method for mapping the location of a lesion across different image phases is the spatio-temporal registration technique disclosed in U.S. patent application serial No. 11/105,961. A complete set of lesion marks, regardless of the phase in which they were marked, may thus be available for each phase and displayed as needed. FIG. 2(b) shows an alternative exemplary flow chart of the forced marking method. In this alternative embodiment, a complete set of lesion marks is not required for each phase, and the mark to be added and an existing mark may be mapped/registered to the same phase prior to calculating the distance (as shown in steps 222 and 224 of FIG. 2(b)).
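As an illustration of the duplicate check in steps 204-212, the following is a minimal sketch, assuming that contours are stored as arrays of 3D boundary points already mapped into a common phase and that a simple distance threshold is used; the function names and the threshold value are illustrative assumptions rather than the patent's implementation.

```python
# Minimal sketch of the forced-marking duplicate check (steps 204-212).
# Contours are assumed to be (N, 3) arrays of boundary coordinates already
# mapped into one common phase; the threshold is an illustrative value.
import numpy as np
from scipy.spatial import cKDTree


def closest_contour_distance(new_contour, existing_contour):
    """Smallest distance between the peripheral edges of two contours."""
    tree = cKDTree(existing_contour)
    dists, _ = tree.query(new_contour)        # nearest existing point per new point
    return dists.min()


def classify_new_mark(new_contour, existing_contours, threshold=5.0):
    """Return ('new', None) or ('possible_duplicate', index_of_closest_mark)."""
    if not existing_contours:
        return "new", None
    distances = [closest_contour_distance(new_contour, c) for c in existing_contours]
    best = int(np.argmin(distances))
    if distances[best] > threshold:
        return "new", None                    # save as a new mark (step 212)
    return "possible_duplicate", best         # warn the user (step 216)
```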
The data visualization/processing mechanism 114 may also be used for navigation of the marked lesions. FIG. 3(a) shows an exemplary flow chart for constructing a lesion index map according to one embodiment of the present invention. At step 304, the lesions may be rendered in 3D. Similarly, at step 306, vascular structures, including but not limited to the hepatic and portal veins, may be rendered in 3D. The liver parenchyma and liver lobes may also be rendered in 3D at step 308. These 3D structures may be displayed in different colors to distinguish them from one another. Since lesions may be marked in different phases, the marked lesions may be mapped to one phase at step 310. A lesion index map may then be generated at step 312 by overlaying the different structures on a display. At step 314, the user may manipulate and interact with the lesion index map display. For example, at step 316, the 3D rendered view may be rotated or zoomed. Clicking with the mouse on a lesion presented in the lesion index map may refresh the display of information associated with the clicked lesion. The information may include, but is not limited to, lesion diagnostic information in a lesion information table and slice images of different phases, e.g., corresponding axial, sagittal, and coronal slice images. FIG. 3(b) shows an exemplary lesion index map 301. FIG. 3(c) shows an exemplary lesion information table associated with a lesion.
After marking or selecting a lesion, more detailed analysis may be performed by the visual diagnostic information extraction mechanism 120. FIG. 4(a) illustrates an exemplary embodiment of diagnostic information extraction. At step 440, the user may segment the boundary of the lesion using an automatic segmentation or manual drawing method. At step 442, the lesion boundary may be overlaid on the original image slice. Corresponding lesion boundaries in other phases may be automatically computed by mapping the segmented boundary at step 444. At step 446, the mapped boundaries may be overlaid on the corresponding phase images. After lesion segmentation, the aorta may be segmented automatically or manually at step 448. At step 450, diagnostic information may then be extracted from the segmented lesion and aorta, and the information from the multiple phases may be fused according to their respective segmentation results. The diagnostic information may include, but is not limited to, the mean and standard deviation of the absolute lesion gray values in each phase, the gray-level difference between the lesion and the liver parenchyma in each phase, the absolute aortic gray value in each phase, the gray-level difference between the lesion and the aorta in each phase, and the enhancement appearance of the lesion across the different phases. At step 452, the information may be displayed in different ways, including, but not limited to, graphics, charts, tables, and text. In step 454, the lesion segmentation and aorta segmentation results may be adjusted. After each adjustment, the relevant lesion information may be updated accordingly at step 452. At step 456, the segmented lesion may be rendered in 3D space to show its spatial relationship with respect to, for example, the liver and vascular structures. The user may interact with the display by zooming or rotating it.
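The statistics gathered at step 450 can be sketched as follows, assuming registered 3D volumes per phase and binary masks for the lesion, parenchyma, and aorta on the same grid; the exact set of statistics and the function name are illustrative assumptions.

```python
# Minimal sketch of per-phase gray-level statistics used as diagnostic
# information (step 450), under the assumptions stated above.
import numpy as np


def lesion_phase_statistics(phase_images, lesion_mask, liver_mask, aorta_mask):
    """phase_images: dict mapping phase name -> registered 3D numpy array."""
    stats = {}
    for phase, img in phase_images.items():
        lesion_vals = img[lesion_mask]
        stats[phase] = {
            "lesion_mean": float(lesion_vals.mean()),
            "lesion_std": float(lesion_vals.std()),
            # gray-level difference between lesion and surrounding parenchyma
            "lesion_minus_liver": float(lesion_vals.mean() - img[liver_mask].mean()),
            # gray-level difference between lesion and aorta
            "lesion_minus_aorta": float(lesion_vals.mean() - img[aorta_mask].mean()),
            "aorta_mean": float(img[aorta_mask].mean()),
        }
    return stats
```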
FIG. 4(b) shows an example of lesion segmentation and aorta segmentation. The overlay 401 represents the originally segmented lesion boundary, and the boundary 402 represents the mapped boundary in another phase. The aorta boundary is displayed in two phases, as shown at 403. FIG. 4(c) shows a 3D display of the lesion. Display 411 is a local view of a 3D display, and display 412 is a global view obtained by zooming the local view. FIG. 4(d) shows an exemplary information display performed at step 452. Display 421 is a graph showing the enhancement changes of the liver 430, aorta 426, and lesion 428 across the phases. Display 422 is a control for manual adjustment of the lesion size. A scroll bar may pop up (not shown), allowing the user to adjust the size of the segmented lesion. Display 423 is an adjustment control for the lesion position in the mapped phases. The user can drag the calculated lesion boundary in the image and move it to an appropriate position. The position resulting from the drag operation can be used to correct mapping errors. The control 424 is used to adjust the aortic segmentation. If the "adjust aorta information" function 424 is activated, the shape and location of the extracted aorta region can be edited. The adjustment results, as well as the corresponding extracted diagnostic information, may be updated accordingly each time an adjustment is made.
A treatment planning/pre-operative assessment mechanism 126 is provided to perform treatment planning and pre-operative assessment for each detected lesion. FIG. 5 shows an exemplary embodiment providing interactive liver extraction and correction 501, interactive vessel analysis 502, efficient display 503 of desired information, and virtual surgery functionality 504 for accurate assessment. A treatment plan may be made based on information such as the type of lesion, the attachment of the lesion to major vessels, and the liver lobe in which the lesion is located. A treatment plan may be designed to determine whether the remaining portion of the liver would function properly if a portion of the liver were resected.
The interactive liver extraction and correction mechanism 501 facilitates manual correction and guidance of automatic liver segmentation. Liver segmentation is used to extract the liver parenchyma from the image data. FIG. 6 shows an exemplary embodiment. At step 602, the liver parenchyma may be segmented by employing an edge-based coherent segmentation method. The coherent segmentation method compares the gray-level statistics of the growing region with the gray-level statistics of the pixels at the leading edge of the growing region and expands the region by admitting neighboring regions with similar pixel gray-level statistics. At step 604, the user may change a global threshold to adjust the size of the segmented region. The threshold may correspond to the liver HU value (Hounsfield units, the units of CT values). The segmented liver parenchyma contour after each adjustment may be overlaid on the original image to provide a visual reference to the user for possible further adjustment. At step 606, the user may also make a local adjustment of the segmentation by manually defining a local liver volume to which adjusted parameters apply. The automatic calculation of the local adjustment parameters may be performed according to different methods, including, but not limited to, shape consistency and user-guided partial segmentation. Shape consistency can be measured by calculating the degree of match between the local segmentation and the global segmentation within the overlapping volume; the best match within the overlapping volume determines the optimal parameters. In the user-guided approach, the user provides a partial segmentation, which may be a user-defined liver region or a user-defined liver edge. From the user-drawn region or liver edge, the optimal segmentation parameters can then be calculated. At step 608, the segmented liver may be interactively edited using tools such as cut, fill, and patch. From the liver segmentation, at step 610, different volume measurements may be computed, for example with or without lesions, with or without vessels, or any combination thereof. The volume of the liver region obtained by editing may be used to calculate the resected portion or the remnant portion.
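A minimal sketch of the coherence-style growing described for step 602 is given below, assuming a 3D image array and a boolean seed mask; the similarity test (admitting leading-edge voxels within k standard deviations of the region mean) and the stopping rule are assumptions made for illustration.

```python
# Minimal sketch of coherence-based region growing: the statistics of the
# current region are compared with voxels on its leading edge, and similar
# voxels are admitted until the region stops changing.
import numpy as np
from scipy import ndimage as ndi


def coherent_region_grow(image, seed_mask, k=2.5, max_iter=200):
    region = seed_mask.copy()
    for _ in range(max_iter):
        mean, std = image[region].mean(), image[region].std() + 1e-6
        frontier = ndi.binary_dilation(region) & ~region      # leading edge
        candidates = frontier & (np.abs(image - mean) < k * std)
        if not candidates.any():
            break                                             # region has converged
        region |= candidates
    return region
```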
FIG. 7(a) illustrates an exemplary embodiment of interactive local adjustment 606 for region segmentation. In step 701, a user first draws a region containing a portion of a liver to be segmented. At step 702, gray level statistics for the drawn region are calculated. In step 703, the optimal parameters for coherence segmentation are calculated. At step 704, the optimal parameters are used for the coherency-based segmentation. At step 705, the user may adjust the coherence segmentation parameters and repeat the above segmentation process as needed.
FIG. 7(b) illustrates another exemplary embodiment of the interactive local adjustment 606 for region segmentation. At step 711, the user first draws a partial liver boundary. At step 712, gray-level statistics are computed for the regions on both sides of the user-drawn boundary. At step 713, a determination may be made as to which side of the drawn boundary contains the liver by comparing the known liver intensity with the intensities of the two regions (the respective sides of the drawn boundary). In a subsequent step 714, the optimal parameters for coherent liver segmentation may be determined. Using the obtained optimal parameters, the local liver parenchyma may be segmented at step 704. At step 705, the user may adjust the coherence segmentation parameters and repeat the segmentation process as needed.
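The side decision of step 713 can be sketched as follows, assuming the drawn boundary splits a local VOI into two connected components and that an expected liver HU value is available; the component handling and the HU constant are illustrative assumptions.

```python
# Minimal sketch of deciding which side of a user-drawn boundary is liver
# (step 713) and taking that side's statistics as coherence parameters.
import numpy as np
from scipy import ndimage as ndi


def liver_side_statistics(voi_image, boundary_mask, expected_liver_hu=60.0):
    sides, n = ndi.label(~boundary_mask)            # regions on either side of the stroke
    if n < 2:
        raise ValueError("boundary does not split the VOI into two regions")
    sizes = np.bincount(sides.ravel())[1:]          # component sizes, labels 1..n
    largest = np.argsort(sizes)[-2:] + 1            # keep the two largest components
    means = [voi_image[sides == lab].mean() for lab in largest]
    liver_lab = largest[int(np.argmin(np.abs(np.array(means) - expected_liver_hu)))]
    liver_vals = voi_image[sides == liver_lab]
    # mean/std of the liver side can seed the coherence-segmentation parameters
    return float(liver_vals.mean()), float(liver_vals.std())
```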
FIG. 8 shows an exemplary flow of the vessel analysis 502. Vessel analysis may be performed by a mechanism with three parts: an interactive vessel segmentation mechanism 801, an interactive vessel separation mechanism 802, and an interactive vessel branch labeling mechanism 803.
In some embodiments, if CT images are used, interactive liver vessel segmentation may be performed on images obtained in the portal venous phase. FIG. 9 shows an exemplary flow chart of the procedure. At step 901, vessel segmentation may begin with an automatically or interactively selected point on a main vessel branch, such as the main portal vein. In step 902, a gray-level-based adaptive region growing approach may be applied, followed by 3D line filtering in step 903. At step 904, the results from steps 902 and 903 may be combined to produce a final segmentation result. The process of vessel segmentation is described in more detail below.
The gray-level-based adaptive region growing segmentation mechanism at step 902: starting from the selected vessel seed points (determined automatically or manually), region growing may be performed according to voxel gray level. The gray threshold for region growing may be adaptively decreased until, for example, the ratio of segmented vessel volume to liver volume exceeds a certain bound. The segmentation obtained just before this ratio is exceeded is taken as the vessel result.
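A minimal sketch of this adaptive growing follows, assuming a single seed voxel, a fixed threshold decrement, and a vessel-to-liver volume ratio bound; these values and the flood-fill style growing are assumptions, not parameters from the patent.

```python
# Minimal sketch of adaptive region growing (step 902): the gray threshold
# is lowered stepwise, and the last result before the vessel/liver volume
# ratio is exceeded is kept.
import numpy as np
from scipy import ndimage as ndi


def adaptive_vessel_grow(image, liver_mask, seed, t_start, t_step=5.0, max_ratio=0.12):
    """seed is a (z, y, x) index on the main vessel branch."""
    liver_volume = float(liver_mask.sum())
    previous = np.zeros_like(liver_mask)
    threshold = t_start
    while threshold > image.min():
        candidate = (image >= threshold) & liver_mask
        labels, _ = ndi.label(candidate)
        if labels[seed] == 0:                   # seed not yet captured, lower further
            threshold -= t_step
            continue
        grown = labels == labels[seed]          # component connected to the seed
        if grown.sum() / liver_volume > max_ratio:
            break                               # ratio exceeded: keep previous result
        previous = grown
        threshold -= t_step
    return previous
```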
The line filtering mechanism of step 903: this step can be used to segment small vessels that are not connected to the main vessel branches obtained by region growing. The line filter is a 3D shape filter for enhancing 3D tubular structures. Where a 3D tubular structure, e.g., a blood vessel, is present, the line filter produces a high response. Another round of region growing may be applied to the output of the 3D line filtering of the volume image so that small vessel branches may be detected.
The combining mechanism of step 904: the two algorithms described above may be applied sequentially so that vessels of different sizes may be segmented. Adaptive region growing may be applied to segment major vessel branches, while 3D line filtering may be used to extract small vessel branches (e.g., vessels with a radius less than a predetermined threshold). In some embodiments, the overall segmentation algorithm may take some user input, e.g., a few seed points to initialize adaptive region growing. In other embodiments, no user interaction is required for the line filtering segmentation.
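A minimal sketch of this combination step is shown below, using scikit-image's Frangi vesselness filter as a stand-in for the 3D line filter and simple size filtering of its response; the choice of filter and the threshold values are assumptions.

```python
# Minimal sketch of combining the grown main branches with a line-filter
# response (step 904); Frangi vesselness stands in for the 3D line filter.
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import frangi


def combine_vessel_masks(image, liver_mask, main_branch_mask,
                         vesselness_thresh=0.05, min_voxels=30):
    # enhance bright tubular structures (black_ridges=False for bright vessels)
    response = frangi(image.astype(float), black_ridges=False)
    small_vessels = (response > vesselness_thresh) & liver_mask
    # drop tiny, likely spurious components
    labels, _ = ndi.label(small_vessels)
    sizes = np.bincount(labels.ravel())
    small_vessels = (sizes[labels] >= min_voxels) & (labels > 0)
    return main_branch_mask | small_vessels
```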
The real-time interactive vessel separation mechanism 802 is used to separate the portal vein from the hepatic vein. Due to partial volume effects, the segmentation results of the two venous systems may be connected to each other. It is necessary to separate these two systems for further analysis, such as vessel branch identification and lobe segmentation. An exemplary flow diagram of the interactive vessel separation mechanism 802 is shown in FIG. 10.
In step 1002, starting from the seed point selected in step 901, automatic tracking finds the voxels at the roots of the portal and hepatic venous systems, i.e., the voxels with the respective maximum vessel thickness in the middle and lower parts of the liver region. The vessel thickness of a voxel is defined as its shortest distance to the vessel boundary. At step 1004, the identified root points may be overlaid on the imagery so that the user may interactively adjust their positions at step 1008. In step 1006, the first-generation portal and hepatic vein branches may be automatically identified by growing from the portal and hepatic vein root points until the growth encounters a branch point. The first-generation branch of a vascular system is defined as the segment from its root to its first branch point. At step 1010, the first-generation segments may be overlaid on the imagery so that the user may interactively adjust their positions at step 1014. At step 1012, portal vein branches not connected to the first-generation hepatic vein branch may be automatically marked by tracing the sub-tree down each branch. For portal vein branches touching the hepatic vein due to partial volume effects, at step 1018 the connection path may be automatically identified, and the break point may be identified as the voxel with the maximum value of a weighted sum of curvature and relative gray level along the path. The relative gray level may be defined as the change in gray level from the root to the voxel under consideration. At step 1022, the user may interactively adjust the position of the break point. At step 1020, the local VOI around the break point may be analyzed to automatically disconnect the vessels based on closeness measurements of segment angle and segment position. At step 1026, the user may interactively adjust the disconnection result. These steps may be repeated until no further connection paths can be identified. The vessel separation operation ends at step 1028.
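The break-point search of step 1018 can be sketched as below, assuming the traced path is available as an ordered list of voxel coordinates with their gray values; the curvature estimate (second differences) and the normalization of the two terms are illustrative assumptions.

```python
# Minimal sketch of break-point detection along the traced path (step 1018):
# the voxel maximizing a weighted sum of local curvature and relative
# gray-level change is taken as the break point.
import numpy as np


def find_break_point(path_points, path_gray, w_curv=1.0, w_gray=1.0):
    """path_points: (M, 3) ordered voxel coordinates; path_gray: (M,) gray values."""
    p = np.asarray(path_points, dtype=float)
    g = np.asarray(path_gray, dtype=float)
    # discrete curvature from second differences of the path coordinates
    curvature = np.zeros(len(p))
    curvature[1:-1] = np.linalg.norm(p[:-2] - 2 * p[1:-1] + p[2:], axis=1)
    # relative gray level: change with respect to the root end of the path
    rel_gray = np.abs(g - g[0])
    # normalise both terms so the weights are comparable
    score = (w_curv * curvature / (curvature.max() + 1e-6)
             + w_gray * rel_gray / (rel_gray.max() + 1e-6))
    return int(np.argmax(score))        # index of the break point along the path
```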
An exemplary embodiment of the disconnection operation of step 1020 is illustrated by FIGS. 11 through 16. FIG. 11 shows an exemplary VOI around a break point. Dashed line 1101 is the vessel boundary and solid line 1102 is the centerline of the vessel. Tracing from the root of the hepatic vein to the root of the portal vein creates a curved path within the vessel, shown as a dashed curve in FIG. 12. An entry point 1201, on the side of the portal vein root, and an exit point 1202, leading toward the hepatic vein root, may be identified as the intersections of the path with the VOI. As shown in FIG. 13, the traced path 1301 may be compared to the centerline. The closeness of centerline segments 1302 and 1303 to path 1301 may be used to determine that segment 1302 belongs to the portal vein and segment 1303 belongs to the hepatic vein. As shown in FIG. 14, to determine the assignment of segment 1403, the angle and end-point distance between segments 1403 and 1401 and between segments 1403 and 1402 may be calculated. A closeness measure may be obtained as a weighted sum of the angle and the distance. The minimum of the closeness values determines whether segment 1403 belongs with 1401 or 1402. In FIG. 14, segment 1403 yields the minimum closeness measure with the portal vein segment and is therefore assigned to the portal vein. Similarly, segment 1404 may be determined to belong to the hepatic vein. The next step is to find a cut plane dividing all voxels in the VOI into portal vein and hepatic vein voxels. Such a plane 1503 may be obtained by fitting a plane that maximizes the total distance from the two segments 1501 and 1502 to the plane, as shown in FIG. 15.
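A minimal sketch of the closeness-based assignment illustrated in FIG. 14 follows, assuming each centerline segment is an ordered array of 3D points and that the angle and end-point distance are combined with fixed weights; the weights and the direction estimate are assumptions.

```python
# Minimal sketch of the closeness measure (weighted sum of inter-segment
# angle and end-point distance) used to assign a candidate segment to the
# portal or hepatic vein, as in FIG. 14.
import numpy as np


def closeness(seg_a, seg_b, w_angle=1.0, w_dist=1.0):
    """Each segment is an (N, 3) array of ordered centerline points."""
    a_dir = seg_a[-1] - seg_a[0]
    b_dir = seg_b[-1] - seg_b[0]
    cosang = np.dot(a_dir, b_dir) / (np.linalg.norm(a_dir) * np.linalg.norm(b_dir) + 1e-9)
    angle = np.arccos(np.clip(cosang, -1.0, 1.0))
    # smallest distance between any pair of end points of the two segments
    ends_a, ends_b = seg_a[[0, -1]], seg_b[[0, -1]]
    dist = min(np.linalg.norm(ea - eb) for ea in ends_a for eb in ends_b)
    return w_angle * angle + w_dist * dist


def assign_segment(candidate, portal_segment, hepatic_segment):
    """Return 'portal' or 'hepatic' for the candidate centerline segment."""
    return ("portal"
            if closeness(candidate, portal_segment) < closeness(candidate, hepatic_segment)
            else "hepatic")
```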
The user correction of the separation results at step 1026 can be illustrated by FIG. 16. In a preferred embodiment, the user may click, with a computer mouse, on a segment with a separation error; in FIG. 16, assume that the user clicks on segment 1601. After the click, a new fitted plane is calculated based on the new label of the clicked segment, and the new cut plane 1602 may be used to re-separate the voxels in the VOI.
Once the connected vessels are separated, the portal and hepatic vein systems may be labeled by the interactive vessel labeling mechanism 803. An exemplary flow chart is shown in FIG. 17. At step 1702, the centerline of a separated portal or hepatic venous system may be extracted. From the extracted centerlines, a graph of the portal or hepatic vein system may be constructed at step 1704. The graph may include edges and vertices: each vessel segment may be represented by an edge, and each branch point is a vertex. FIG. 18 shows an exemplary graph. Different segments of the vessel may be marked with different colors. At step 1706, the user may interactively label the vessel branches. In an exemplary embodiment, the user may click on a point adjacent to the root of a branch to initiate or activate automatic labeling. Other interactive operations may also be provided; for example, an existing label may be removed by a defined mouse click on the correspondingly labeled segment. The vessel branch labeling algorithm may be applied to both the portal and hepatic vein systems.
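The graph construction at step 1704 can be sketched with a general-purpose graph library such as networkx, assuming the centerline has already been split into segments at branch points; rounding endpoints to merge coincident branch points is an assumption.

```python
# Minimal sketch of building the vessel graph (step 1704): every centerline
# segment becomes an edge, and the branch points at its ends become vertices.
import networkx as nx
import numpy as np


def build_vessel_graph(centerline_segments):
    """centerline_segments: list of (N_i, 3) ordered point arrays."""
    graph = nx.Graph()
    for seg_id, seg in enumerate(centerline_segments):
        start = tuple(np.round(seg[0]).astype(int))
        end = tuple(np.round(seg[-1]).astype(int))
        # vertices are branch points; the edge carries the segment id and points
        graph.add_edge(start, end, segment_id=seg_id, points=seg)
    return graph
```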
To perform interactive labeling of the vessels after vessel separation, the user may click on the root of a branch. Automatic tracking may then follow the paths from the click point to all the leaves of the tree. The tracked voxels may be assigned the same label. FIG. 19 shows an example of a labeled portal vein system. The user may click on the root segment 1901 to label the branch. The subsequent vessel segments starting at the click point can be automatically identified by tracking in the vessel graph. The labeling may thus be performed by: visualizing the vascular system on a display; selecting a first vessel segment of a vessel branch to be labeled; identifying, from the graphical representation of the vessel system, any vessel segments belonging to the vessel branch other than the first vessel segment; and labeling the first vessel segment and the identified vessel segments with a marker.
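Building on the graph sketch above, the click-to-label behaviour can be approximated as follows, assuming the click has been mapped to 3D coordinates and that the separated venous system forms a single connected component, so a breadth-first walk from the clicked root reaches exactly the branch to be labeled.

```python
# Minimal sketch of labeling a branch from a click on its root: builds on
# build_vessel_graph above.  For labeling only a sub-branch, the graph
# would first be oriented (rooted) at the system root.
import networkx as nx
import numpy as np


def label_branch(graph, click_point, label):
    # pick the graph vertex nearest to the clicked 3D position
    root = min(graph.nodes,
               key=lambda v: np.linalg.norm(np.array(v) - np.array(click_point)))
    # walk from that vertex toward all leaves and tag every traversed segment
    for u, v in nx.bfs_edges(graph, root):
        graph.edges[u, v]["label"] = label
    return root
```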
Although the present invention has been described with reference to certain illustrated embodiments, the words that have been used herein are words of description rather than words of limitation. Modifications may be made within the scope of the appended claims without departing from the scope and spirit of the invention. Although the invention has been described herein with reference to particular structure, acts, and materials, the invention is not to be limited to the particulars disclosed, but rather is to be construed in a wide variety of forms, some of which may be quite different from those of the disclosed embodiments, and extends to all equivalent structures, acts, and materials.
Claims (33)
1. A method of labeling a vessel branch comprising the steps of:
obtaining a vascular system having a plurality of vascular branches;
extracting a central line of each blood vessel branch;
constructing a graphical representation of the vessel system from the extracted centerlines;
separating the vascular system into a portal vein sub-vascular system and a hepatic vein sub-vascular system according to the graphical representation, by automatically identifying a root point for each sub-vascular system and automatically determining at least one break point between the root points on the graph based on one or more characteristics associated with each centerline; and
labeling each of the portal vein and hepatic vein sub-vascular systems according to the graphical representation.
2. The method of claim 1, wherein the vascular system is one of the portal venous vascular system and the hepatic venous vascular system.
3. The method of claim 1, wherein the graphical representation comprises at least one edge and at least one vertex, wherein each edge in the graphical representation represents a vessel segment and each vertex represents a location where two vessel segments are sequentially connected.
4. The method of claim 3, wherein the vessel segment is a portion of a vessel branch.
5. The method of claim 1, wherein the labeling step comprises:
presenting each of the sub-vascular systems on a display;
selecting a first vessel segment of a vessel branch to be labeled;
identifying any vessel segments belonging to the vessel branch other than the first vessel segment, from the graphical representation of the vessel system;
labeling the first vessel segment and any identified vessel segment with a marker.
6. The method of claim 5, wherein the selecting step is performed by a mouse.
7. The method of claim 6, wherein the first vessel segment is selected by clicking a mouse at a location on the display where the first vessel segment is rendered.
8. The method of claim 5, wherein the first vessel segment corresponds to one of a plurality of vessel segments along the vessel branch, and the size of the first vessel segment is not smaller than the size of any other vessel segment of the vessel branch.
9. The method of claim 5, wherein the step of labeling with a marker comprises reproducing the vessel branch in a display scheme determined from the marker.
10. The method of claim 1, wherein the step of obtaining a vascular system comprises:
segmenting blood vessels in image data related to the blood vessels.
11. The method of claim 10, wherein the step of segmenting comprises the steps of:
selecting an initial starting point in the image data;
performing region growing from the initial starting point to obtain a first set of candidate vessels;
performing line filtering on the image data to obtain a second set of candidate vessels; and
merging the first set of candidate vessels and the second set of candidate vessels to produce a segmented vessel.
12. The method of claim 1, wherein the separating step comprises:
determining a first root point and a second root point according to the graphical representation, the first root point and the second root point corresponding to a first vessel segment and a second vessel segment, respectively;
tracing from the first root point to the second root point along the first vessel segment and the second vessel segment to obtain a traced path;
determining a break point on the traced path having a curvature measure satisfying a first condition and/or a grayscale measure satisfying a second condition, wherein the break point divides the traced path into a first portion and a second portion;
identifying a first root edge and a second root edge in the graphical representation, corresponding to the first portion and the second portion of the traced path, respectively;
selecting one of the first and second root edges as a connecting edge and taking the unselected root edge as a non-connecting edge;
sequentially connecting the connecting edge with an edge in the graphical representation according to a connection criterion to generate an updated connecting edge; and
repeating the connecting step with the updated connecting edge to form the vascular system.
13. The method of claim 12, wherein the first condition is that the curvature measurement at the break point exceeds a threshold and/or is not less than the curvature measurement at any other point along the traced path.
14. The method of claim 12, wherein the second condition is that the change in the grayscale measurement at the break point exceeds a threshold.
15. The method of claim 12, further comprising interactively adjusting the separation result.
16. The method of claim 12, wherein
The first root edge is the edge closest to the first portion in the graphical representation and the distance to the first portion is less than a first threshold; and
the second root edge is the edge closest to the second portion in the graphical representation and the distance to the second portion is less than a second threshold.
17. The method of claim 12, wherein the connection criterion is that a closest distance between an unconnected edge and the connecting edge is less than a closest distance between the unconnected edge and the non-connecting edge.
18. The method of claim 17, wherein the repeating step continues until a connection criterion cannot be met.
19. A method of obtaining a vascular system comprising the steps of:
segmenting blood vessels in image data associated with the blood vessels; and
separating the segmented vessels into a plurality of groups, each group corresponding to a vessel system having a plurality of vessel branches, wherein the separating step comprises:
determining a first root point and a second root point according to the graphical representation, the first root point and the second root point corresponding to a first vessel segment and a second vessel segment, respectively;
tracing from the first root point to the second root point along the first vessel segment and the second vessel segment to obtain a traced path;
determining a break point on the traced path having a curvature measure satisfying a first condition and/or a grayscale measure satisfying a second condition, wherein the break point divides the traced path into a first portion and a second portion;
identifying a first root edge and a second root edge in the graphical representation, corresponding to the first portion and the second portion of the traced path, respectively;
selecting one of the first and second root edges as a connecting edge and using the unselected root edge as a non-connecting edge;
sequentially connecting the connecting edge with an edge in the graphical representation according to a connection criterion to generate an updated connecting edge; and
repeating the connecting step with the updated connecting edge to form the vascular system.
20. The method of claim 19, wherein the step of segmenting comprises the steps of:
selecting an initial starting point in the image data;
performing region growing according to the initial starting point to identify a first set of candidate vessels;
performing line filtering in the image data to identify a second set of candidate vessels; and
merging the first set of candidate vessels with the second set of candidate vessels to produce a segmented vessel.
21. The method of claim 19, wherein the first condition is that the curvature measurement at the break point exceeds a threshold and/or is not less than the curvature measurement at any other point along the traced path.
22. The method of claim 19, wherein the second condition is that the change in the grayscale measurement at the break point exceeds a threshold.
23. The method of claim 19, further comprising the step of interactively adjusting the separation result.
24. The method of claim 19, wherein
The first root edge is the edge of the graphical representation that is closest to the first portion and the distance to the first portion is less than a first threshold; and
the second root edge is the edge of the graphical representation that is closest to the second portion and the distance to the second portion is less than a second threshold.
25. The method of claim 19, wherein the connection criterion is that a closest distance between an unconnected edge and the connecting edge is less than a closest distance between the unconnected edge and the non-connecting edge.
26. The method of claim 19, wherein the repeating step continues until the connection criteria cannot be met.
27. The method of claim 19, further comprising the step of marking vessel branches in the vessel system based on the graphical representation constructed for the vessel system.
28. The method of claim 27, wherein the graphical representation is constructed by a process comprising the steps of:
extracting a central line of each vessel branch in the vascular system;
a graphical representation is constructed from the extracted centerlines.
29. The method of claim 27, wherein the marking step comprises:
visualizing the vascular system on a display;
selecting a first vessel segment of a vessel branch to be labeled;
identifying any vessel segment belonging to the vessel branch other than the first vessel segment, based on the graphical representation of the vessel system;
labeling the first vessel segment and any identified vessel segment with a marker.
30. The method of claim 29, wherein the step of selecting is performed by a mouse.
31. The method of claim 30, wherein the first vessel segment is selected by clicking a mouse at a location on the display where the first vessel segment is rendered.
32. The method of claim 29, wherein the first vessel segment corresponds to one of a plurality of vessel segments along the vessel branch, and the first vessel segment has a size that is not smaller than a size of any other vessel segment of the vessel branch.
33. The method of claim 29, wherein the step of labeling with the marker includes visualizing the vessel branch according to a display scheme associated with the marker.
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US69387105P | 2005-06-24 | 2005-06-24 | |
| US60/693,871 | 2005-06-24 | ||
| PCT/US2006/024753 WO2007002562A2 (en) | 2005-06-24 | 2006-06-26 | Methods for interactive liver disease diagnosis |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| HK1121843A1 (en) | 2009-04-30 |
| HK1121843B (en) | 2014-07-18 |
Also Published As
| Publication number | Publication date |
|---|---|
| EP1894164A2 (en) | 2008-03-05 |
| WO2007002562A3 (en) | 2007-09-20 |
| CN101203890A (en) | 2008-06-18 |
| WO2007002562A2 (en) | 2007-01-04 |
| CN101203890B (en) | 2014-04-09 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12051497B2 (en) | Systems and methods for validating and correcting automated medical image annotations | |
| CN101203890B (en) | Interactive Liver Disease Diagnosis Method | |
| EP2157905B1 (en) | A method for tracking 3d anatomical and pathological changes in tubular-shaped anatomical structures | |
| CN105912874B (en) | Liver three-dimensional database system constructed based on DICOM medical image | |
| JP6877868B2 (en) | Image processing equipment, image processing method and image processing program | |
| CN102781333B (en) | Image diagnosis support device and method | |
| EP2104453B1 (en) | Visualizing a vascular structure | |
| US20030053697A1 (en) | Systems and methods for tubular object processing | |
| US20070276214A1 (en) | Systems and Methods for Automated Segmentation, Visualization and Analysis of Medical Images | |
| US11132801B2 (en) | Segmentation of three-dimensional images containing anatomic structures | |
| US7783091B2 (en) | Method for segmenting anatomical structures from 3D image data by using topological information | |
| EP2878278B1 (en) | Surgical operation support device, method and program | |
| US8285017B2 (en) | Methods for interactive labeling of tubular structures in medical imaging | |
| EP2116977A2 (en) | Method for editing 3D image segmentation maps | |
| EP3855392A1 (en) | System and method for linking a segmentation graph to volumetric data | |
| US9984456B2 (en) | Method and system for labeling hepatic vascular structure in interactive liver disease diagnosis | |
| US20060182341A1 (en) | Method for automatically determining the position and orientation of the left ventricle in 3D image data records of the heart | |
| HK1121843B (en) | Methods for interactive liver disease diagnosis | |
| Li et al. | Segmentation and tracking of coronary artery using graph-cut in CT angiographic | |
| Peitgen et al. | Image Processing and Imaging for Operation Planning in Liver Surgery |