
WO2013012492A2 - Air sinus and nasal cavity analysis system - Google Patents


Info

Publication number
WO2013012492A2
WO2013012492A2 (application PCT/US2012/042005)
Authority
WO
WIPO (PCT)
Prior art keywords
airways
displaying
paranasal sinus
sinus region
volume image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2012/042005
Other languages
English (en)
Other versions
WO2013012492A3 (fr)
Inventor
Jay S. SCHILDKRAUT
Lawrence A. Ray
Krishnamoorthy Subramanyan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Carestream Health Inc
Original Assignee
Carestream Health Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Carestream Health Inc
Priority to US14/131,694 (US9974503B2)
Publication of WO2013012492A2
Publication of WO2013012492A3


Classifications

    • A61B 6/463: Displaying means characterised by displaying multiple images or images and diagnostic data on one display
    • A61B 6/032: Transmission computed tomography [CT]
    • A61B 6/4085: Cone-beams
    • A61B 6/466: Displaying means adapted to display 3D data
    • A61B 6/469: Special input means for selecting a region of interest [ROI]
    • A61B 6/501: Apparatus specially adapted for diagnosis of the head, e.g. neuroimaging or craniography
    • A61B 6/5217: Extracting a diagnostic or physiological parameter from medical diagnostic data
    • A61B 6/5223: Generating planar views from image data, e.g. extracting a coronal view from a 3D image
    • G06T 19/20: Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T 7/11: Region-based segmentation
    • G16H 50/30: ICT for calculating health indices; for individual health risk assessment
    • A61B 6/506: Apparatus specially adapted for diagnosis of nerves
    • G06T 2200/04: Indexing scheme involving 3D image data
    • G06T 2207/10081: Computed x-ray tomography [CT]
    • G06T 2207/20036: Morphological image processing
    • G06T 2207/20044: Skeletonization; medial axis transform
    • G06T 2207/30172: Centreline of tubular or elongated structure
    • G06T 2210/41: Medical
    • G06T 2219/2012: Colour editing, changing, or manipulating; use of colour codes

Definitions

  • the invention relates generally to the field of medical imaging systems, and in particular to systems that process and display images of the paranasal sinus region, including the nasal cavities.
  • the paranasal sinuses are air-filled regions within bones of the skull.
  • the frontal, ethmoid, maxilla, and sphenoid bones contain sinuses that are named for the bone structure that immediately surrounds them.
  • the nasal region contains smaller bones including the inferior turbinate, lacrimal, palatine, and nasal bones. The spaces between these articulated bones are sometimes air filled.
  • the bones of the sinus and nasal cavities are covered by a mucosa lining that produces mucus, which drains into the nasal cavity and throat.
  • a sinus cell is connected to the nasal cavity by a narrow opening called the ostium.
  • the arrangement of sinus cells and bones forms paths for drainage. When drainage is prevented by constriction or blockage of the ostia and/or paths, sinusitis, infection or inflammation in the paranasal sinus region, can result.
  • the condition of the paranasal sinuses and nasal cavity can be assessed using an X-ray computerized tomographic (CT) image of a patient's head, at the level of the sinuses.
  • Image acquisition for this purpose is generally performed with a fan-beam computed tomography (FBCT) system or a cone-beam computed tomography (CBCT) system.
  • the anatomy of the paranasal sinuses is very complicated, making it difficult to interpret images of the paranasal region and to detect problem conditions.
  • the anatomy of the sinuses and bones of the nasal region can vary significantly from patient to patient.
  • pathological conditions can greatly alter the anatomy of the sinuses and nasal cavity. For these reasons, it can be difficult even for a skilled practitioner to determine the condition and drainage paths of each sinus by examination of the axial, coronal, sagittal, and other views of the CT image.
  • a related object of the present invention is to assess and display drainage paths through the sinuses and to indicate areas of potential blockage, relatively high curvature, or other problem.
  • a method for displaying a paranasal sinus region of a patient, executed at least in part on a computer, comprising: acquiring volume image data of the paranasal sinus region of the patient; identifying one or more airways within the paranasal sinus region from the volume image data; displaying the one or more airways; and highlighting one or more portions of the displayed airways that are constricted below a predetermined value.
  • FIG. 1 is a logic flow diagram that shows a process for detection and display of drainage in paranasal and nasal cavity analysis.
  • FIG. 2 is a view of a paranasal and nasal cavity volume showing different structures.
  • FIG. 3 is a logic flow diagram that shows adaptive segmentation processing according to an embodiment of the present invention.
  • FIG. 4 is a logic flow diagram that shows optional user modification for drainage analysis processing.
  • FIG. 5 is a logic flow diagram showing alternate skeletonization processing for airway detection and display.
  • FIG. 6 is a logic flow diagram for specifying and displaying an airway path according to an embodiment of the present invention.
  • FIG. 7 shows a segmented airway with a highlighted path.
  • FIG. 8 shows a user interface window for showing cross-sectional path information.
  • FIG. 9 is a logic flow diagram showing how the cross-sectional area of the airway path can be computed according to an embodiment of the present invention.
  • FIG. 10 is a logic flow diagram showing how atlas registration is used for labeling of airway portions.
  • FIG. 11 shows anatomy of interest displayed to the user as a graphical representation of anatomical structures.
  • FIG. 12 is a logic flow diagram that shows how a table of sinus cells is developed and used according to an embodiment of the present invention.
  • FIG. 13 is a schematic block diagram that shows a processor apparatus for performing the processing and display functions of the present invention.
  • FIGS. 14A and 14B show examples of a user interface provided for entry of user instructions.
  • FIG. 14C shows examples of a user interface provided for auto locate and zoom of an anatomical feature that is indicated by the user.
  • FIG. 15 shows a view of a virtual endoscopic interface according to an embodiment of the present invention.
  • the paranasal sinus region includes the nasal cavity, paranasal sinuses, and all or part of associated bones including the frontal, maxilla, ethmoid, sphenoid, inferior turbinate, middle turbinate, superior turbinate, lacrimal, nasal, and palatine bones.
  • the term "airways" includes the mouth, throat, nasal cavity, maxillary, ethmoid, sphenoid, frontal paranasal sinuses, and other air-filled regions that are internal to the head.
  • image refers to multi-dimensional image data that is composed of discrete image elements.
  • the discrete image elements are picture elements, or pixels.
  • the discrete image elements are volume image elements, or voxels.
  • code value refers to the value that is associated with each volume image data element or voxel in the reconstructed 3-D volume image.
  • code values for CT images are often, but not always, expressed in Hounsfield units.
  • highlighting for a displayed feature has its conventional meaning as is understood to those skilled in the information and image display arts. In general, highlighting uses some form of localized display enhancement to attract the attention of the viewer. Highlighting a portion of an image, such as an individual organ, bone, or structure, or a path from one chamber to the next, for example, can be achieved in any of a number of ways, including, but not limited to, annotating, displaying a nearby or overlaying symbol, outlining or tracing, display in a different color or at a markedly different intensity or gray scale value than other image or information content, blinking or animation of a portion of a display, or display at higher sharpness or contrast.
  • a CT image of the paranasal sinus region is acquired.
  • This image data may be acquired with a fan-beam or cone beam CT scanner.
  • the CT scan is acquired with a cone beam CT (CBCT) scanner.
  • a 3D image of the sinus region is reconstructed with isotropic voxels.
  • the airways in the CT image are segmented. In the segmentation process, consistent with one embodiment of the present invention, each voxel in the 3D tomographic image data is assigned to a class.
  • the assignment of a class to a voxel is, at least in part, dependent on the code value of one or more neighboring or adjacent voxels in the volume image data. In an alternate embodiment, the assignment of a class to a voxel is, at least in part, dependent on the class assignment of one or more neighboring or adjacent voxels in the volume image data.
  • possible classes include air, water, bone, muscle, adipose, and other types of tissue or implant material. In airways segmentation, voxels in the 3D image that correspond to air that is internal to the head have the air class assigned to them.
  • airway segmentation can be performed using any of a number of suitable image segmentation methods known to those skilled in the image processing arts.
  • suitable image segmentation methods include, but are not limited to, K-means clustering, adaptive K-means clustering, region growing, fast-marching, level set, thresholding, graph partitioning, model-based, compression-based, edge-based, classifier, and watershed segmentation methods, or a combination of these methods.
  • the user specifies criteria for drainage paths between regions of the airways. For example, the user may specify that the minimum effective path diameter between regions must be greater than 1.0 mm. The user may alternately specify that as long as there is any connection between airway regions, no matter how small, a drainage path exists. There may also be default criteria for displaying areas of constriction used in step 103, obviating the need for user-entered criteria prior to display of the sinus regions.
  • in an analysis step 104, the segmented airways are analyzed to determine regions for which there are drainage paths that satisfy the criteria of step 103.
  • a region of the airways for which drainage is impeded based on the criteria of step 103 is displayed to the user, such as in a highlight color different from other parts of segmented airways. If the display device does not support color, or if desired for other reasons, different shades of gray or texture patterns can alternately be used to highlight or distinguish a region with impeded drainage. One or more symbols or outlining can alternately be used to indicate constriction.
  • the system may then return to step 103 in which the drainage criteria are modified, which may result in a change in the color or other presentation of regions that are displayed to the user.
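  • The drainage analysis of steps 103 and 104 can be sketched as follows. This is an illustrative reconstruction, not the patent's stated algorithm: a hypothetical `find_impeded_regions` helper (name and SciPy-based approach are assumptions) erodes the binary airway map by the minimum-diameter criterion so that narrow passages break apart, labels the resulting connected components, and flags airway voxels no longer connected to the largest (main) component.

```python
import numpy as np
from scipy import ndimage

def find_impeded_regions(airways, min_diameter_vox):
    """Flag airway voxels whose connection to the main airway is
    narrower than min_diameter_vox (effective diameter, in voxels).
    Sketch only: the patent does not specify this procedure."""
    # Erode the binary airway map so that passages narrower than the
    # criterion break apart.
    radius = max(int(min_diameter_vox) // 2, 1)
    eroded = ndimage.binary_erosion(airways, iterations=radius)
    labels, n = ndimage.label(eroded)
    if n < 2:
        return np.zeros_like(airways)     # nothing is separated
    # Take the largest component as the main airway (throat, nasal cavity).
    sizes = ndimage.sum(eroded, labels, index=range(1, n + 1))
    main = 1 + int(np.argmax(sizes))
    # Propagate component labels over the eroded-away voxels by giving
    # each voxel the label of its nearest labeled voxel.
    _, idx = ndimage.distance_transform_edt(labels == 0, return_indices=True)
    propagated = labels[tuple(idx)]
    # Airway voxels reached from a non-main component are impeded.
    return airways & (propagated != main)
```

The returned mask can then be rendered in a highlight color, shade of gray, or texture, as described for step 105.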
  • the display of FIG. 2 shows, by way of example, the result of applying the method of FIG. 1 to a CT image of the sinus region.
  • the user is informed by the system of the present invention of impeded drainage for two sinus cells according to the drainage criteria of step 103 in FIG. 1.
  • a main region 200 includes the main part of the airways, which includes the throat, nasal cavity, and most of the nasal sinuses.
  • a right sphenoid sinus 202 is shown in a different shade of gray, thereby indicating that its drainage to other parts of the airways is impeded.
  • a cell 204 of the left ethmoid sinus is also highlighted, such as by being displayed in another color or shade of gray, again indicating impeded drainage to other parts of the airways.
  • the system informs the user that mucus drainage from sinus cells to the nasal cavity and throat is prevented and shows the approximate location of the blockage. This can assist to diagnose sinus problems including sinusitis.
  • FIG. 3 shows an adaptive segmentation step 102 that is used according to an embodiment of the present invention. This method is based on "An Adaptive Clustering Algorithm for Image Segmentation" by T. N. Pappas, IEEE Transactions on Signal Processing, Vol. 40, 1992.
  • in a segmentation step 300 of FIG. 3, global K-means segmentation is performed on the CT image.
  • the mean code value for each cluster is global, which means that it does not vary with location within the image.
  • the number of clusters (value of K) is set to 4, although other values could alternately be used.
  • the result of step 300 is a rough segmentation of the image into air, low density soft tissue, high density soft tissue, and bone clusters.
  • Low-density soft tissue consists mostly of adipose tissue, and high-density soft tissue mostly of muscle.
  • a window selection step 302 sets a spatial window size. For example, a 13x13x13 voxel window is selected.
  • the local mean code value of each cluster is calculated at each voxel in the CT image, within the window centered at that voxel (step 304).
  • the local cluster mean can then be modified by setting it equal to a weighted average of the local and global cluster mean in order to prevent the local mean from deviating excessively from its global value.
  • in a voxel assignment step 306, each voxel in the image is assigned to a cluster. This assignment is partially based on the difference between the voxel's code value and the local mean of each cluster.
  • voxel cluster assignment is also based on the cluster assigned to neighboring voxels, in order to impose a degree of spatial smoothness on the segmented airways.
  • Steps 304, 306, and 308 repeat until the assignment of voxels to a cluster is unchanged or cluster assignment has changed for only a small number of voxels.
  • the window size is reduced and the method returns to step 304.
  • the segmentation process ends when the window size is less than a set minimum window size, as determined in a decision step 312.
  • the final result of the method in FIG. 3 is a cluster map in which each voxel in the CT image is assigned the number of the cluster to which it belongs. Since each cluster is associated with a class of material, the cluster map is also a type of class map in which each voxel of the image has a class assigned to it. The cluster with the lowest mean code value is then the cluster that corresponds to the air class. When all voxels of the cluster map that belong to the air class are set to 1 and all other voxels are set to 0, the result is an air segmentation map of both external ambient air and airways that are internal to the head.
  • the external air is removed from the segmentation map by creating a tissue map that is comprised of non-air class voxels. Morphological closing and flood-fill operations are then used to close holes in the tissue map. The intersection of this modified tissue map and air segmentation map is the final map of the segmented airways.
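  • The global clustering of step 300 can be sketched in Python. The function name `kmeans_code_values` and the percentile-based initialization are assumptions for illustration; only the idea comes from the text: cluster voxel code values into K = 4 global clusters, with the lowest-mean cluster taken as the air class.

```python
import numpy as np

def kmeans_code_values(volume, k=4, iters=50):
    """Global K-means on voxel code values (a sketch of step 300).
    Returns a cluster map in which cluster 0 has the lowest mean code
    value (the air class), plus the sorted cluster means."""
    v = volume.ravel().astype(float)
    # Initialize cluster means at evenly spaced percentiles of the data.
    means = np.percentile(v, np.linspace(5, 95, k))
    assign = np.zeros(v.size, dtype=int)
    for _ in range(iters):
        # Assign each voxel to the cluster with the nearest global mean.
        assign = np.argmin(np.abs(v[:, None] - means[None, :]), axis=1)
        new = np.array([v[assign == j].mean() if np.any(assign == j)
                        else means[j] for j in range(k)])
        if np.allclose(new, means):
            break
        means = new
    # Reorder clusters by mean so that cluster 0 is air.
    order = np.argsort(means)
    relabel = np.empty(k, dtype=int)
    relabel[order] = np.arange(k)
    return relabel[assign].reshape(volume.shape), means[order]
```

The adaptive refinement of FIG. 3 then replaces the global means with windowed local means and adds a spatial smoothness term; that part is omitted here.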
  • the logic flow diagram of FIG. 4 shows an added user input step 405 used in an alternate embodiment of the present invention, allowing the user to provide input that improves segmentation, including edits to the class map, for example.
  • the user provides instructions that modify the display of anatomy of the nasal region in some way. Modifications can be indicated interactively by viewer instructions entered with reference to a displayed rendering of the CT slices in a coronal, axial, sagittal, or other view.
  • User input instructions can be entered using a pointer device, such as a mouse or joystick, for example, or using touch screen input.
  • the user may interact with the system using a 3D rendering of the nasal region.
  • the user may enter instructions that indicate that an ostium of the left maxillary sinus is blocked.
  • upon receiving user instructions in step 405 of FIG. 4, the system returns to step 104 to re-calculate and re-determine regions between which drainage paths exist.
  • processing returns to step 103 in which the drainage path constriction criteria for detection and display are modified.
  • the left maxillary sinus may initially display in the same color as the nasal cavity.
  • in step 405, if the user indicates constriction or blockage of the maxillary ostium, the color of the maxillary sinus may change. Otherwise, the existence of auxiliary ostia is indicated.
  • the user may also indicate other modifications including, but not limited to, the removal of sinus cells, bone, and other tissue from display. This may be helpful, for example, to allow better visibility of obscured features along the airway path.
  • the logic flow diagram of FIG. 5 shows processing according to an alternate embodiment of the present invention, using skeletonization.
  • in a skeletonization step 504, the segmented airways from step 102 are skeletonized. This presents a display that more clearly identifies the airway paths.
  • skeletonization of a 3D object is defined using the loci of centers of maximally inscribed spheres.
  • the skeleton comprises medial lines that outline the shape of the object.
  • Skeletons produced by methods that preserve the topology of the object may also contain pockets that enclose internal cavities.
  • the methods used for skeletonization may utilize any of the techniques that are known in the art for isolating well-defined uniform structures from other 3-D image content, including boundary peeling, thinning, erosion, and distance transform, for example.
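  • The distance-transform technique mentioned above can be sketched as follows: medial voxels are approximated as local maxima of the Euclidean distance transform, which matches the "maximally inscribed spheres" definition. The function name `medial_voxels` is hypothetical, and this simplified sketch does not enforce topology preservation or thin-line connectivity.

```python
import numpy as np
from scipy import ndimage

def medial_voxels(airways):
    """Approximate skeleton voxels of a binary airway map as local
    maxima of the Euclidean distance transform."""
    dist = ndimage.distance_transform_edt(airways)
    # A voxel is medial when no voxel in its 3x3x3 neighborhood lies
    # strictly deeper inside the object than it does.
    local_max = dist >= ndimage.maximum_filter(dist, size=3)
    return airways & local_max, dist
```

The returned distance map also gives, at each medial voxel, the radius of the inscribed sphere, which is directly useful for the cross-sectional measurements of step 506.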
  • the cross-sectional area of the airway is calculated along the skeleton.
  • This cross-sectional area may be reported to the user in various ways, such as in terms of area, effective radius, effective diameter, or using any other metric that expresses the cross-sectional area.
  • in a display step 508, paths within the airways are highlighted and displayed to the user with important locations indicated.
  • Important locations can include, but are not limited to, locations of global or local minimum cross-sectional area that may occur at sinus ostia or locations at which a drainage path is restricted.
  • the system displays a virtual endoscopic view at important path locations.
  • in a selection step 602, the user selects two or more points in the image.
  • the user instruction for this purpose may select the points using interaction with the CT slices or with a 3D rendering of the nasal region.
  • the user may also select the points from a list of anatomical features. For example, the user may indicate a point in the throat and another point in the right frontal sinus.
  • in a path determination step 604, the system determines a path between the points and displays the path to the user. In some cases, the system may also inform the user that no path could be found.
  • a path-finding algorithm such as Dijkstra's algorithm, well known to those skilled in the image processing arts, is used for path determination according to one embodiment of the present invention.
  • the path finding algorithm can determine a lowest cost path based on several definitions of cost. For example, the cost of a path may be based on its length, on its minimum cross-sectional area, on its average cross-section area, on a combination of cross-sectional area and length, or on other path properties. Of special relevance are properties that relate to the flow and drainage of mucus.
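  • A minimal sketch of step 604 using Dijkstra's algorithm on the 6-connected voxel grid. The per-voxel `cost` array lets any of the cost definitions above be plugged in (for example, the reciprocal of local cross-sectional area to favor wide passages). The function name `lowest_cost_path` is an assumption.

```python
import heapq
import numpy as np

def lowest_cost_path(airways, cost, start, goal):
    """Dijkstra's algorithm over the voxel grid. Returns the list of
    voxels from start to goal, or None if no path exists."""
    shape = airways.shape
    steps = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
             (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:
            break
        if d > dist[u]:
            continue                      # stale queue entry
        for dx, dy, dz in steps:
            v = (u[0] + dx, u[1] + dy, u[2] + dz)
            if not (0 <= v[0] < shape[0] and 0 <= v[1] < shape[1]
                    and 0 <= v[2] < shape[2]) or not airways[v]:
                continue
            nd = d + cost[v]              # per-voxel price defines "lowest"
            if nd < dist.get(v, np.inf):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    if goal not in dist:
        return None                       # report to the user: no path found
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]
```

Returning None corresponds to the case in step 604 where the system informs the user that no path could be found.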
  • the system displays the path to the user by highlighting or annotating a 3D rendering of the CT data and/or by highlighting or annotation of the CT slices.
  • in a display step 608, characteristics of the path, such as the effective diameter at each point along the path, are displayed to the user in graphical or tabular form.
  • the system also displays to the user a virtual endoscopic view at one or more locations on the path as indicated by a display step 610. Examples and more detailed description of a virtual endoscopic view are given subsequently.
  • FIG. 7 illustrates displayed results of the process outlined in FIG. 6.
  • a segmented airway 700 is displayed, showing a calculated path 702 between identified points in the right frontal sinus and the throat.
  • a symbol 704 is located at the position along the skeleton at which the cross-sectional area of the path is a minimum.
  • numerical values are displayed to convey additional information.
  • the effective diameter at the location of the symbol 704 is provided by a numerical annotation 706.
  • embodiments of the present invention also analyze path 702 for locations of high curvature. This indicates where endoscopy would be blocked or difficult, such as near a symbol 720 in FIG. 7.
  • a user interface window 710 in FIG. 8 shows an example of path cross-sectional area information shown in a graphical fashion for an automatically determined path between a point in a frontal sinus and the throat.
  • the decrease in effective path diameter (Deff) at locations in the frontal recess is indicated.
  • Curvature can be expressed in terms of radius and arc length, for example, or can be expressed in terms of a curvature index or other computed value.
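  • One way to obtain curvature values along a sampled path is to compute, at each interior point, the inverse circumradius of the consecutive point triple; this expresses curvature in terms of a radius, as mentioned above. The helper `path_curvature` is hypothetical.

```python
import numpy as np

def path_curvature(points):
    """Discrete curvature at interior points of an (N, 3) path, as the
    inverse circumradius of each consecutive point triple."""
    p = np.asarray(points, float)
    a = np.linalg.norm(p[1:-1] - p[:-2], axis=1)
    b = np.linalg.norm(p[2:] - p[1:-1], axis=1)
    c = np.linalg.norm(p[2:] - p[:-2], axis=1)
    # Triangle area from the cross product; curvature = 4*area / (a*b*c),
    # since the circumradius is R = a*b*c / (4*area).
    cross = np.cross(p[1:-1] - p[:-2], p[2:] - p[1:-1])
    area = 0.5 * np.linalg.norm(cross, axis=1)
    with np.errstate(divide="ignore", invalid="ignore"):
        k = np.where(a * b * c > 0, 4.0 * area / (a * b * c), 0.0)
    return k
```

Locations where the returned curvature exceeds a threshold could then be marked with a symbol such as 720 in FIG. 7.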
  • FIG. 9 shows a method that is used by the system to determine the cross-sectional area of an airway path.
  • in a location identification step 900, a location 902 on a path in the airways is identified. In practice, this location may be within the skeleton, such as between two locations where drainage between the locations is of interest.
  • in a direction determination step 904, the direction of the path at location 902 is determined.
  • in a plane determination step 906, the plane that is perpendicular to the path direction from step 904 at path location 902 is determined.
  • in an intersection determination step 908, the intersection of the segmented airways with this plane is determined. This intersection may consist of one or more disconnected regions.
  • the connected region in the plane that contains the path point 902 is the part of the segmented airways that, in a cross-sectional area definition step 910, defines the cross-sectional area of the airway at location 902.
  • This connected region is determined by tracing rays that are confined to the plane of step 906, starting from location 902. Each ray is traced until it exits the segmented airways.
  • the length of the ray is the radius of the airway in the ray's direction.
  • the cross-sectional area of the airways at path location 902 is calculated by tracing rays in a 360 degree circle.
  • the steps in FIG. 9 repeat as needed to determine the cross-sectional area along the path. This also provides data for estimating the volume of the corresponding path.
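  • The ray-tracing procedure of FIG. 9 can be sketched as follows, assuming isotropic voxels of unit spacing (consistent with the isotropic reconstruction described earlier). Rays are traced in the perpendicular plane until they leave the segmented airways, and the area is accumulated as the sum of (1/2)·r²·Δθ wedges; the effective diameter then follows as 2·sqrt(area/π). The function name, ray count, and step size are illustrative.

```python
import numpy as np

def cross_section_area(airways, point, direction, n_rays=72, step=0.25):
    """Estimate the airway cross-sectional area at `point` in the plane
    perpendicular to `direction` (steps 904-910), by ray tracing."""
    d = np.asarray(direction, float)
    d /= np.linalg.norm(d)
    # Build an orthonormal basis (u, v) spanning the perpendicular plane.
    helper = (np.array([0.0, 1.0, 0.0]) if abs(d[0]) > 0.9
              else np.array([1.0, 0.0, 0.0]))
    u = np.cross(d, helper)
    u /= np.linalg.norm(u)
    v = np.cross(d, u)
    p0 = np.asarray(point, float)
    dtheta = 2.0 * np.pi / n_rays
    area = 0.0
    for i in range(n_rays):
        ray = np.cos(i * dtheta) * u + np.sin(i * dtheta) * v
        r = 0.0
        while True:
            # March outward until the ray leaves the airways (or volume).
            idx = tuple(np.round(p0 + (r + step) * ray).astype(int))
            inside = all(0 <= idx[j] < airways.shape[j] for j in range(3))
            if not inside or not airways[idx]:
                break
            r += step
        # Integrate the area as a fan of circular-sector wedges.
        area += 0.5 * r * r * dtheta
    return area
```

Repeating this along the skeleton yields the profile plotted in the window of FIG. 8, and summing area times arc length gives the path-volume estimate mentioned above.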
  • the logic flow diagram of FIG. 10 shows another aspect of an embodiment of the present invention, using comparison between the airways volume image and an atlas or other prototype.
  • in a registration step 1006, an airways atlas or other prototype 1002 is registered with the segmented airways of the patient from step 102.
  • the prototype that is used is an atlas that has been generated as a segmented airway for which the location of significant anatomy has been labeled.
  • An atlas is a 3-D model typically generated from statistical data obtained using a number of samples. Labeling is generally performed manually by a person who has thorough knowledge of the anatomy of the paranasal sinus region.
  • the atlas is transformed to match the patient's airways so that the labels in the atlas can be transferred to the patient's airways. As a result, the anatomy of the patient' s airways is determined.
  • Registration step 1006 in FIG. 10 may include both rigid and non-rigid registration.
  • Rigid registration uses translation and rotation to match the atlas or other prototype to the segmented airways. Scaling is added to translation and rotation because the prototype and the patient's airways may differ in size. When shears are added as well, the registration is termed affine registration. Rigid or affine registration is not always sufficient to match an atlas with the patient's airways because there can be considerable variation in airways anatomy. For this reason, step 1006 also includes non-rigid registration utilities. In non-rigid registration, the atlas or other prototype may be deformed when it is mapped to the patient's airways.
  • Non-rigid registration usually involves determining corresponding points between the atlas and the patient's airways and calculating an optimal mapping that transforms a set of atlas points to their respective corresponding points in the patient's airways. Generally, the mapping must satisfy continuity conditions and other constraints, which prevent a perfect mapping of the prototype's points to their corresponding points in the patient's airways.
  • Step 1006 may use any non-rigid registration method known in the art, including methods based on splines, correlation, Fourier analysis, mutual information, relaxation, elastic models, and wavelets.
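As a concrete illustration of the affine portion of step 1006, an affine transform that best maps prototype points onto their corresponding patient points can be recovered by linear least squares. This is a hypothetical numpy sketch, not the patent's implementation; it assumes the point correspondences have already been established.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine map (matrix A, translation t) taking prototype
    points `src` (N x 3) onto corresponding patient points `dst` (N x 3),
    so that dst ~= src @ A.T + t. Requires at least 4 non-coplanar points."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    # Homogeneous coordinates: an extra column of ones lets one solve recover
    # the translation along with rotation, scaling, and shear.
    src_h = np.hstack([src, np.ones((len(src), 1))])
    M, *_ = np.linalg.lstsq(src_h, dst, rcond=None)   # (4 x 3) parameter matrix
    return M[:3].T, M[3]

def apply_affine(A, t, pts):
    """Transform an (N x 3) point set with the fitted affine map."""
    return np.asarray(pts, float) @ A.T + t
```

With noisy correspondences the same solve returns the best fit in the least-squares sense; a non-rigid (e.g. spline-based) refinement would then account for the residual deformation.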
  • In a labeling step 1008, the patient's airways are labeled by transfer of labels from the registered atlas. After this step, the patient's airways with labeled anatomy can be displayed in a display step 1010.
  • Additional capabilities of the system are illustrated in a step 1009 of FIG. 10, in which the user indicates anatomy of interest. This could be accomplished in many ways, such as by selection from a list, by audible prompt, or by pointing to locations on displayed image slices or on a 3D rendering of the image.
  • The system then displays the anatomy of interest to the user.
  • The anatomy of interest may be displayed as a close-up or zoomed 3D rendering and/or CT slices, a virtual endoscopic view, or a graphical representation.
  • The system also displays characteristics of the anatomy of interest including, but not limited to, the volume of air and the natural volume as determined by the bony boundary of the anatomy.
  • Examples of anatomy of interest include, but are not limited to, the locations of sphenoid, frontal, maxillary, and ethmoid sinus cells.
  • The system automatically locates, marks the location of, and provides zoomed views of the infraorbital nerve, ethmoidal artery, orbital fissure, optic nerve, vidian canal, carotid arteries, and other nerves, arteries, and critical anatomy.
  • A related use of the atlas or other prototype is to help identify anatomical irregularities or anomalies for a particular patient. These can be detected, for example, when methods for fitting the anatomy to the atlas fail to provide a match to within predetermined thresholds, or when atlas fitting is compared among atlases with different anatomical variations.
  • The system is able to determine features such as septal deviations that result in impaired sinus drainage.
  • The system is also able to indicate the presence and location of agger nasi cells, supraorbital cells, and type 1-4 frontal cells, which affect mucus drainage and need to be considered during sinus surgery.
  • FIG. 11 shows anatomy of interest and anatomical variations displayed to the user as a graphical representation of anatomical structures.
  • Here, the anatomy of interest is the frontal sinus drainage path 1100.
  • The graphic representation shows that drainage occurs around the agger nasi cell 1104 and a frontal cell of the ethmoid sinus 1102. This type of symbolic representation can be helpful for providing summary details to a practitioner in more generalized form.
  • In a table generation step 1210, the system produces a table of sinus cells that includes information on cell properties. Properties include, but are not limited to, the location of the cell, the bone in which the cell resides, and cell volume.
  • The cell volume may include the air volume of the cell.
  • The cell volume may also include the "natural" volume of the cell as determined by bone boundaries. If the air volume of the cell differs from its natural volume, the presence of mucus, polyps, or an infection is indicated.
  • In a selection step 1212, the user selects one or more cells from the table of sinus cells.
  • In a display step 1214, the system displays the selected sinus cells to the user.
  • The cells may be displayed as a 3D rendering, CT slices, or a virtual endoscopic view. Cells not selected can be invisible or shown in outline, for example.
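The cell-table computation of steps 1210-1214 might be sketched as follows. The labeled volume, the air mask, and the id-to-name mapping are assumed inputs (for instance, from the atlas-labeling step); the function and field names are illustrative, not the patent's code.

```python
import numpy as np

def sinus_cell_table(label_vol, air_mask, voxel_mm3, names):
    """Build a table of sinus cells from a labeled volume.

    label_vol: 3D int array assigning a cell id to each voxel of the natural
               (bony-boundary) cavity
    air_mask:  3D boolean array marking voxels that are actually air
    voxel_mm3: volume of one voxel in cubic millimeters
    names:     mapping from cell id to anatomical label
    """
    rows = []
    for cid, name in names.items():
        cell = label_vol == cid
        natural = int(cell.sum()) * voxel_mm3          # volume bounded by bone
        air = int((cell & air_mask).sum()) * voxel_mm3  # aerated volume
        rows.append({
            "cell": name,
            "air_volume_mm3": air,
            "natural_volume_mm3": natural,
            # A large deficit suggests mucus, polyps, or infection.
            "opacified_fraction": 0.0 if natural == 0 else 1.0 - air / natural,
        })
    return rows
```

A user selection (step 1212) would then pick rows from this table, and the display (step 1214) would render the corresponding cell masks.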
  • The term "computer" can be broadly interpreted to describe any type of computer, computer system, or logic processor hardware that is capable of executing programmed instructions for the image processing and display functions described herein. Unless specifically stated otherwise, as is apparent from the discussion, it is appreciated that throughout the description, discussions utilizing terms such as "processing", "computing", "calculating", "determining", "displaying", or the like refer to the actions and processes of a computer system, workstation, or similar electronic computing device that manipulates and transforms data represented as physical (electronic) quantities within the computer system's memories, registers, or other such information storage, transmission, or display devices.
  • Certain aspects of the present invention include process steps and instructions described herein in the form of algorithms or image processing utilities. It should be noted that the process steps and instructions of the present invention could be embodied in software, firmware or hardware, and when embodied in software, could be downloaded to reside on and be operated from different computer processing platforms used by a variety of operating systems.
  • In one embodiment, a system for displaying paranasal sinus features of a patient comprises: a display and a computer processor for executing computer program code; and a computer-accessible storage medium containing executable computer program instructions for performing a method comprising acquiring a volume image of a paranasal region of the patient, identifying one or more airways within the paranasal region, displaying at least the one or more airways on the display, and highlighting portions of the displayed one or more airways that are constricted below a predetermined value.
  • The present invention also relates to an apparatus for performing the operations herein.
  • This apparatus may be a processor specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer.
  • Such a computer program may be stored in a computer-readable storage medium, such as, but not limited to, any type of disk including magnetic disks, optical disks, CD-ROMs, and magneto-optical disks; read-only memories (ROMs); random access memories (RAMs); erasable programmable read-only memories (EPROMs); electrically erasable programmable read-only memories (EEPROMs); magnetic or optical cards; or application-specific integrated circuits (ASICs).
  • The computer(s) referred to in the specification may include a single processor or may use architectures that employ multiple processors for increased speed and computing capability.
  • FIG. 13 shows a processor apparatus for performing the processing and display functions of the present invention.
  • An imaging apparatus 30 acquires the CT or other volume image of the paranasal sinus region, including nasal cavity areas.
  • A host computer 32, typically connected along a network 34, obtains the acquired volume image for processing and display on a display monitor 40.
  • Host computer 32 stores the received image data in a computer-accessible electronic memory 62 for subsequent display and processing.
  • A variety of user interface options is available at display monitor 40, enabling the viewer to more readily identify features of the paranasal sinus region and related sinus cavities and to obtain information related to path blockage, connectedness, and relative diameter at any point.
  • Among the user interface options for controlling how passages and other features are displayed are the following: (i) select colors or other presentation aspects, such as gray scale or intensity, for the airway path as well as for highlighting identified features that form the paranasal sinus region and related sinus cavities;
  • select colors for indicating path blockage or other conditions, such as selecting a green color to indicate full blockage, a yellow color to indicate a point in the path with diameter less than 1 mm, and so on;
  • (ix) adjustment capability, such as using a slider or control knob to adjust threshold values for display in different colors.
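As a sketch of how the blockage color coding and the adjustable threshold of item (ix) might combine, a diameter-to-color mapping could look like the following. The specific colors and the 1 mm cutoff follow the example given in the text; the function itself is hypothetical.

```python
def path_color(diameter_mm, constricted_mm=1.0):
    """Map a local airway diameter to a display color. Following the example
    in the text, green marks full blockage and yellow marks a point whose
    diameter is below the (slider-adjustable) constriction threshold."""
    if diameter_mm <= 0.0:
        return "green"       # full blockage
    if diameter_mm < constricted_mm:
        return "yellow"      # constricted below threshold
    return "default"         # rendered in the normal airway color
```

Adjusting the slider of item (ix) would simply change `constricted_mm` and trigger a redraw of the highlighted path.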
  • An optional autozoom capability is provided to help the practitioner readily display a portion of the sinus anatomy at a higher magnification.
  • Selection of particular anatomy for autozoom display can be performed using a menu that lists portions or cells of the sinus anatomy.
  • FIG. 14C shows a user interface for automatically locating and zooming to selected anatomy.
  • A 3D rendering of the airways is displayed as a view 53 to the user.
  • The user selects from a list 51 of anatomical features.
  • In this example, the left maxillary sinus is selected.
  • The system then displays axial 57, coronal 55, and sagittal 59 zoomed views of the selected anatomy.
  • The term "memory", in the context of the present disclosure, can refer to any type of temporary or more enduring data storage workspace used for storing and operating upon image data in a computer system.
  • The memory could be, for example, a long-term storage medium such as magnetic or optical storage.
  • Alternately, the memory could be an electronic circuit, such as random-access memory (RAM), that is used as a temporary buffer or workspace by a microprocessor or other control logic processor device.
  • Display data, for example, is typically stored in a temporary buffer and refreshed as needed in order to provide displayed data.
  • This temporary storage buffer can also be considered to be a memory.
  • Memory can be volatile, non-volatile, or a hybrid combination of volatile and non-volatile types.
  • FIG. 13 is shown by way of example to illustrate one type of computer apparatus for performing the processing of the present invention. It should be noted that algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may also be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these systems will appear from the description given earlier.
  • FIGS. 14A and 14B show a user interface provided on display monitor 40 for entry of user instructions that can affect the display of a volume image 42 of paranasal region structures and the airway path 44.
  • A user control 46, such as the slider shown, enables the user to adjust the visibility of airway path 44 or of related features.
  • A menu 48 provides functions such as color selection and enable/disable capabilities for display of various anatomical and airway features, as well as for entry of display parameters.
  • The operator interface may also present a virtual endoscopic view.
  • For this view, control logic replicates the field of view that would be obtained using an endoscopic probe passed through the airway channels.
  • The virtual endoscopic view is displayed to the user along with a 3D rendering of the image and axial, sagittal, and coronal slices. The location of the endoscopic view is indicated in the 3D rendering and slices. Also, the direction of the view is indicated in the 3D rendering.
  • FIG. 15 shows a view of the virtual endoscopic interface, with sections of the display screen shown in block form.
  • A side view 1510 shows a volume rendering of a portion of the sinus passage and represents the relative, computed position of the virtual endoscopic probe.
  • Axial, sagittal, and coronal views 1530, 1532, and 1534, respectively, are provided for the sinus path at the position of the virtual endoscopic probe.
  • An endoscopic view 1520 is then reconstructed using the corresponding volume information.
  • A control panel 1560 provides various utilities for controlling the presentation of this virtual information. These can include settings that control the speed of display refresh as the virtual probe advances, the forward or reverse direction of the probe, and the frame or slice within a volume.


Abstract

A method for display of a paranasal sinus region of a patient, executed at least in part on a computer, acquires volume image data of the paranasal sinus region of the patient, identifies one or more airways within the paranasal sinus region from the volume image data, displays the one or more airways, and highlights one or more portions of the displayed airways that are constricted below a predetermined value.
PCT/US2012/042005 2011-07-21 2012-06-12 Système d'analyse de sinus aérien et de cavité nasale Ceased WO2013012492A2 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/131,694 US9974503B2 (en) 2011-07-21 2012-06-12 System for paranasal sinus and nasal cavity analysis

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201161510187P 2011-07-21 2011-07-21
US61/510,187 2011-07-21

Publications (2)

Publication Number Publication Date
WO2013012492A2 true WO2013012492A2 (fr) 2013-01-24
WO2013012492A3 WO2013012492A3 (fr) 2013-07-11

Family

ID=47558655

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2012/042005 Ceased WO2013012492A2 (fr) 2011-07-21 2012-06-12 Système d'analyse de sinus aérien et de cavité nasale

Country Status (2)

Country Link
US (1) US9974503B2 (fr)
WO (1) WO2013012492A2 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3127085A1 (fr) * 2014-04-01 2017-02-08 Scopis GmbH Procédé de segmentation et de visualisation
JP2017042616A (ja) * 2015-08-26 2017-03-02 バイオセンス・ウエブスター・(イスラエル)・リミテッドBiosense Webster (Israel), Ltd. 追跡迷路の問題解決法を用いた自動的ent手術事前計画
US11324566B2 (en) 2014-10-31 2022-05-10 Stryker European Operations Limited Instrument guidance system for sinus surgery

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102011078212B4 (de) 2011-06-28 2017-06-29 Scopis Gmbh Verfahren und Vorrichtung zum Darstellen eines Objektes
DE102013211055B3 (de) 2013-06-13 2014-09-18 Scopis Gmbh Adapter zur Aufnahme eines medizintechnischen Geräts sowie eines Lageerfassungssystems
ES2608037B1 (es) * 2015-10-01 2018-01-26 Lucia JÁÑEZ GARCÍA Sistema y método para segmentación y análisis automatizados de la estructura tridimensional de conductos en imágenes de tomografía computarizada
US20210093381A9 (en) * 2016-06-20 2021-04-01 Kevin Raynard Smith System & method for matching the results of a CT scan to a nasal-sinus surgery plan to treat migraine headaches
ES2608861B2 (es) * 2016-06-30 2017-10-26 Universidad De Málaga Dispositivo, sistema y procedimiento para la obtención de curvas rinomanométricas computacionales
GB2555468B (en) * 2016-10-31 2020-05-27 Bitplane Ag Visualization system and method for 3D scenes
US10299699B2 (en) * 2016-11-28 2019-05-28 Biosense Webster (Israel) Ltd. Computerized tomography image correction
US10631798B2 (en) 2017-01-16 2020-04-28 Biosense Webster (Israel) Ltd. Seeing through mucus in an ENT procedure
US10699415B2 (en) * 2017-08-31 2020-06-30 Council Of Scientific & Industrial Research Method and system for automatic volumetric-segmentation of human upper respiratory tract
US11132830B2 (en) * 2018-03-29 2021-09-28 Biosense Webster (Israel) Ltd. Static virtual camera positioning
CN113034489B (zh) * 2021-04-16 2022-11-01 南方医科大学第五附属医院 基于深度学习的人工智能鼻窦ct图像处理系统
KR102650549B1 (ko) * 2022-02-22 2024-03-26 경희대학교 산학협력단 콘빔 컴퓨터 단층촬영(cbct) 영상을 이용한 인두 기도 분할 장치 및 그 방법

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2583738C (fr) * 2004-10-05 2015-08-11 Universiteit Antwerpen Diagnostic et traitement de l'apnee du sommeil
US7536216B2 (en) * 2004-10-18 2009-05-19 Siemens Medical Solutions Usa, Inc. Method and system for virtual endoscopy with guidance for biopsy
US7794399B2 (en) * 2006-01-24 2010-09-14 Tamir Cohen System and method for three-dimensional airway reconstruction, assessment and analysis
US20080031408A1 (en) * 2006-08-07 2008-02-07 Predrag Sukovic Quantification of sinus problems in a patient
US20110227910A1 (en) * 2008-03-27 2011-09-22 Analogic Corporation Method of and system for three-dimensional workstation for security and medical applications

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3127085A1 (fr) * 2014-04-01 2017-02-08 Scopis GmbH Procédé de segmentation et de visualisation
US10235759B2 (en) 2014-04-01 2019-03-19 Scopis Gmbh Method for cell envelope segmentation and visualisation
US11324566B2 (en) 2014-10-31 2022-05-10 Stryker European Operations Limited Instrument guidance system for sinus surgery
JP2017042616A (ja) * 2015-08-26 2017-03-02 バイオセンス・ウエブスター・(イスラエル)・リミテッドBiosense Webster (Israel), Ltd. 追跡迷路の問題解決法を用いた自動的ent手術事前計画
EP3138496A1 (fr) * 2015-08-26 2017-03-08 Biosense Webster (Israel) Ltd. Pré-planification chirurgicale ent automatique utilisant un labyrinthe de retour-arrière de solution de problème

Also Published As

Publication number Publication date
US20140330115A1 (en) 2014-11-06
WO2013012492A3 (fr) 2013-07-11
US9974503B2 (en) 2018-05-22

Similar Documents

Publication Publication Date Title
US9974503B2 (en) System for paranasal sinus and nasal cavity analysis
Chen et al. Automatic segmentation of individual tooth in dental CBCT images from tooth surface map by a multi-task FCN
JP6877868B2 (ja) 画像処理装置、画像処理方法および画像処理プログラム
CN106663309B (zh) 用于医学成像中的用户引导的骨骼分段的方法和存储介质
US9788729B2 (en) Image processing apparatus and image processing method
US20110254845A1 (en) Image processing method and image processing apparatus
US8483462B2 (en) Object centric data reformation with application to rib visualization
WO2018120644A1 (fr) Procédé et système d'extraction de vaisseau sanguin
WO2009103046A2 (fr) Système et procédé de rédaction de compte rendu d’imagerie médicale
US20170148173A1 (en) Method for cell envelope segmentation and visualisation
JP2011514190A (ja) 画像内の解剖学的構造のセグメンテーションおよびモデリングのための方法およびシステム
US9129391B2 (en) Semi-automated preoperative resection planning
JP2006198411A (ja) 心筋層の損傷の可視化方法
CN107194909A (zh) 医用图像处理装置及医用图像处理程序
JP2015066311A (ja) 画像処理装置、画像処理方法、画像処理装置の制御プログラム、記録媒体
Lu et al. Highly accurate facial nerve segmentation refinement from CBCT/CT imaging using a super-resolution classification approach
EP4208850A1 (fr) Système et procédé pour pipeline de pancréatographie virtuelle
Kale et al. Automatic segmentation of human facial tissue by MRI–CT fusion: A feasibility study
US20040264778A1 (en) System and method for the detection of shapes in images
US20080084415A1 (en) Orientation of 3-dimensional displays as a function of the regions to be examined
Preim et al. Visualization, visual analytics and virtual reality in medicine: State-of-the-art Techniques and Applications
JP2012085833A (ja) 3次元医用画像データの画像処理システム、その画像処理方法及びプログラム
CN112884879B (zh) 用于提供至少一个管状结构的二维展开图像的方法
Seo et al. Semiautomatic segmentation of nasal airway based on collaborative environment
Hanssen et al. Nerves-level sets for interactive 3D segmentation of nerve channels

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 14131694

Country of ref document: US

122 Ep: pct application non-entry in european phase

Ref document number: 12814608

Country of ref document: EP

Kind code of ref document: A2