
WO2012012576A1 - Image analysis systems using non-linear data processing techniques and methods of use thereof - Google Patents

Image analysis systems using non-linear data processing techniques and methods of use thereof Download PDF

Info

Publication number
WO2012012576A1
WO2012012576A1 (PCT/US2011/044746)
Authority
WO
WIPO (PCT)
Prior art keywords
image
linear
analysis system
reference image
data processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2011/044746
Other languages
English (en)
Inventor
Steve Kacenjar
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lockheed Martin Corp
Original Assignee
Lockheed Corp
Lockheed Martin Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lockheed Corp, Lockheed Martin Corp filed Critical Lockheed Corp
Priority to JP2013520848A priority Critical patent/JP2013536500A/ja
Priority to EP11810372.0A priority patent/EP2596455A1/fr
Priority to AU2011281065A priority patent/AU2011281065A1/en
Publication of WO2012012576A1 publication Critical patent/WO2012012576A1/fr
Priority to US13/672,530 priority patent/US20130188878A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/14Transformations for image registration, e.g. adjusting or mapping for alignment of images
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/35Determination of transform parameters for the alignment of images, i.e. image registration using statistical methods
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10032Satellite or aerial image; Remote sensing
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30088Skin; Dermal
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30181Earth observation

Definitions

  • the present invention generally relates to image analysis, and, more specifically, to automated registration and analysis of time sequence images.
  • fields such as, for example, earth remote sensing, aerospace systems and medical imaging
  • searching for time-dependent, regional changes of significance (e.g., material stress patterns, surface roughness, changes in inclusions and the like) across a generalized deformable surface can be complicated by extraneous factors including, for example, target movement, image acquisition device geometries, color, lighting and background clutter changes.
  • standard, rigid-body registration techniques often can fail to address and correct for these extraneous factors, which can prevent adequate image overlay from being realized, thereby leading to an incorrect assessment of change over the deformable surface between time sequence images.
  • a generalized deformable surface will refer to any surface that does not deform uniformly when subjected to an external or internal stress during a series of observations.
  • a generalized deformable surface possesses color, thermal, conductive and/or polarimetric temporal variance due to factors such as, for example, source lighting conditions and/or physical chemistry surface alterations during a series of observations.
  • application of an external or internal stress can cause the surface to deform in a non-linear fashion such that inclusions thereon can be affected in both two- and three-dimensions.
  • inclusions contained upon the generalized deformable surface may not move the same amount relative to one another when the surface is deformed and the surface's measurable contrast can vary between itself and background due to the deformation.
  • inclusion will refer to any spatially localized characteristic in an image that differs from image background.
  • Illustrative examples of inclusions that can be present on a generalized deformable surface can include, without limitation, buildings, rocks, trees, fingerprints, skin pores, moles, and the like.
  • source illumination and/or chemical changes upon the deformable surface can also result in superficial changes that alter reflective properties and thereby superficially alter the appearance of the inclusions.
  • both surface deformation and surface physical changes can result in superficial artifacts that are not indicative of actual changes to the inclusions. These types of non-uniform spatial movements and appearance changes can make image registration especially problematic.
  • imprinted patterns superimposed across a deformable surface can also be spatially variable but distinct from the inclusions of interest in an image (e.g., a building complex representing an inclusion of interest can be embedded in a field of trees that is swaying in the wind, where the trees represent a time variant background that is not rigidly positioned in the image).
  • an image registration process needs to be capable of handling such time variant background.
  • image analysis systems described herein include at least one image collection device, an image processing device operating a non-linear data processing algorithm, and at least one data output device.
  • the image processing device is operable to overlay a test image and a reference image upon one another and perform a comparison therebetween.
  • image analysis systems described herein include at least one image collection device, an image processing device operating a non-linear data processing algorithm, and at least one data output device.
  • the image processing device is operable to overlay a test image and a reference image upon one another and perform a comparison therebetween by processing both linear parameters and non-linear parameters, where each image contains a plurality of inclusions.
  • the non-linear data processing algorithm is selected from the group consisting of a particle swarm optimizer, a neural network, a genetic algorithm, and any combination thereof.
  • methods described herein include acquiring a reference image containing a plurality of inclusions, acquiring a test image containing the plurality of inclusions, overlaying the test image upon the reference image by using a non-linear data processing algorithm, and producing an output that illustrates any differences between the test image and the reference image after overlaying takes place.
  • FIGURES 1A and 1B show illustrative images containing a plurality of mole inclusions across a patient's back taken with different camera orientations and lighting conditions;
  • FIGURES 1C and 1D show illustrative images of a single mole inclusion thereon acquired with the different camera orientations and lighting conditions;
  • FIGURE 2 shows an illustrative flowchart demonstrating how time sequence images can be overlaid in a particular embodiment
  • FIGURE 3 shows an illustrative flowchart demonstrating how time sequence images can be overlaid in another particular embodiment
  • FIGURES 4A and 4B show illustrative test and reference images of a mole inclusion before and after alignment, respectively;
  • FIGURE 4C shows an illustrative difference image of the misaligned images in FIGURE 4A; and
  • FIGURE 4D shows an illustrative difference image of the aligned images in FIGURE 4B;
  • FIGURE 5A shows an illustrative 4D scatter plot of mapping coefficients for four parameters (translation, rotation, magnification and background color) before processing with a particle swarm optimizer
  • FIGURES 5B - 5D show illustrative 2D scatter plots of rotation, magnification and translation parameters before processing with a particle swarm optimizer
  • FIGURES 5E - 5H show illustrative plots corresponding to those of FIGURES 5A - 5D, illustrating the convergence of mapping coefficients after processing with the particle swarm optimizer.
  • the present disclosure is directed, in part, to image analysis systems that utilize a non-linear data processing algorithm to detect and characterize changes between time sequence images.
  • the present disclosure is also directed, in part, to methods for analyzing time sequence images, including those having time-variant background clutter, using a non-linear data processing algorithm.
  • mapping coefficients will refer to one of the outputs of the image analysis system.
  • the initial mapping coefficients determined from processing of linear parameters can be fed into a non-linear data processing algorithm as initial estimated parameters of an inclusion's location.
  • estimated parameters of an inclusion's location can be determined from an initial coarse alignment based upon rigid body alignment techniques. Using the estimated solution of an inclusion's location can advantageously provide a more rapid convergence of the non-linear data processing algorithm in determining finalized mapping coefficients.
  • Mapping coefficients can include the transformation coefficients that minimize differences across a reference image and a test image that result from geometric alterations and surface reflective properties.
  • Time-variant background clutter can arise from the surface being imaged and/or from sensor noise within an image collection device being used for detection, for example.
  • body hair and varying skin pigmentation can complicate the registration of skin images.
  • image parameters such as, for example, differing camera angles, lighting, magnification and the like can complicate an image overlay and registration process.
  • FIGURES 1A and 1B show illustrative images containing a plurality of mole inclusions across a patient's back taken with different camera orientations and lighting conditions
  • FIGURES 1C and 1D show illustrative images of a single mole inclusion thereon acquired with the different camera orientations and lighting conditions.
  • As shown in FIGURES 1A - 1D, the issues associated with the misalignment of multiple inclusions (moles) can be particularly daunting, given the number of inclusions involved and their non-uniform degree of deformation in a series of images.
  • image overlay can be performed by individually translating and rotating images of each inclusion and either manually or electronically overlaying the images.
  • factors that can complicate such an overlay include image collection device rotation and tilt (e.g., image collection device shear), magnification, image tone, image gain, and time-variant background changes.
  • both single modality image collection devices and multiple modality image collection devices can be used.
  • at least two different types of image collection devices can be used to investigate different attributes of inclusions located within an image.
  • time sequence visual images can be superimposed with time sequence thermal images, polarimetric images, radiographic images, magnetic images, and/or the like in order to develop a more effective and informative inclusion overlay.
  • changes in an inclusion can be characterized in terms of regional size differences, color differences, asymmetry changes, and boundary changes, for example.
  • these changes can be further augmented with changes such as, for example, density differences, chemical differences, magnetic differences, and/or polarimetric differences.
  • one such attribute can be essentially fixed in an image, such that an inclusion being imaged can be oriented with respect to the fixed point (e.g., another inclusion that does not change), thereby constituting a geographical information system (GIS).
  • the present image analysis systems and related methods can find particular utility.
  • the present image analysis systems and methods can be especially useful in fields including, for example, medical imaging, structural fatigue monitoring, satellite imaging, geological testing and surface chemistry monitoring.
  • images obtained in these fields and others can have inclusions located upon a deformable surface.
  • the skin and underlying tissue can exhibit differential elasticity (e.g., due to weight gain or loss or a change in musculature) and make its surface spatially deformable.
  • changing skin pigmentation and hair covering can represent time-variant background clutter that can complicate the overlay of skin images.
  • the earth's surface can similarly be considered to be deformable.
  • the change in relative positions of inclusions (e.g., rivets) located on a bendable surface (e.g., an airplane wing or a structural support) can be used as a means to gauge structural fatigue.
  • the morphological classification of skin lesions ("moles") and monitoring them over time is important for the detection of melanoma and other types of skin cancer.
  • the present image analysis systems and methods can be particularly advantageous for these types of dermatology applications.
  • observation of changes in the color, shape and size of moles over time can lead to the early detection of skin cancer while it is still readily treatable.
  • typical patients have several hundred moles, all of which need to be monitored over time, which can complicate visual inspection efforts.
  • a skin cancer may have already metastasized beyond its point of origin and become much more difficult to treat.
  • the present image analysis systems and methods can also be used for monitoring other skin conditions including, for example, rashes, burns and healing.
  • fixed inclusions such as, for example, skin pores can be utilized as fixed reference points that do not substantially change during the course of acquiring time sequence images.
  • the present image analysis systems and methods can also be extended to subsurface imaging such as, for example, breast mammography and internal imaging such as, for example, colon, stomach, esophageal and lung imaging.
  • the present image analysis systems and methods are not limited to visual images, particularly in the medical field. Particularly, overlay and comparison of images such as, for example, PET, SPECT, X-RAY, CT, CAT, MRI and other like images can be accomplished with the present image analysis systems and methods. Appropriate imaging protocols using these imaging techniques will be evident to one having ordinary skill in the art.
  • Computer hardware used to implement the various illustrative blocks, modules, elements, components, methods and algorithms described herein can include a processor configured to execute one or more sequences of instructions, programming or code stored on a readable medium.
  • the processor can be, for example, a general purpose microprocessor, a microcontroller, a graphical processing unit, a digital signal processor, an application specific integrated circuit, a field programmable gate array, a programmable logic device, a controller, a state machine, a gated logic, discrete hardware components, or any like suitable entity that can perform calculations or other manipulations of data.
  • computer hardware can further include elements such as, for example, a memory [e.g., random access memory (RAM), flash memory, read only memory (ROM), programmable read only memory (PROM), erasable PROM], registers, hard disks, removable disks, CD-ROMS, DVDs, or any other like suitable storage device.
  • non-linear data processing algorithms and other executable sequences described herein can be implemented with one or more sequences of code contained in a memory.
  • such code can be read into the memory from another machine-readable medium.
  • Execution of the sequences of instructions contained in the memory can cause a processor to perform the process steps described herein.
  • processors in a multi-processing arrangement can also be employed to execute instruction sequences in the memory.
  • hard-wired circuitry can be used in place of or in combination with software instructions to implement various embodiments described herein. Thus, the present embodiments are not limited to any specific combination of hardware and software.
  • a machine-readable medium will refer to any medium that directly or indirectly provides instructions to a processor for execution.
  • a machine-readable medium can take on many forms including, for example, non-volatile media, volatile media, and transmission media.
  • Non-volatile media can include, for example, optical and magnetic disks.
  • Volatile media can include, for example, dynamic memory.
  • Transmission media can include, for example, coaxial cables, wire, fiber optics, and wires that form a bus.
  • Machine-readable media can include, for example, floppy disks, flexible disks, hard disks, magnetic tapes, other like magnetic media, CD-ROMs, DVDs, other like optical media, punch cards, paper tapes and like physical media with patterned holes, RAM, ROM, PROM, EPROM and flash EPROM.
  • image analysis systems described herein include at least one image collection device, an image processing device operating a non-linear data processing algorithm, and at least one data output device.
  • the image processing device is operable to overlay a test image and a reference image upon one another and perform a comparison therebetween.
  • a non-linear data processing algorithm will refer to a class of algorithms for characterizing a geometric transformation used in overlaying two or more images that contain inclusions, particularly images that have a changing background and are subject to surface deformation.
  • a non-linear data processing algorithm can utilize parameters that are not described by the inclusions' translational or rotational coordinates (e.g., spectral, thermal, radiographic, magnetic, polarimetric parameters, and/or the like).
  • Such geometric transformations can include both linear translational mappings as well as higher-order mappings such as, for example, image rotation, shear, magnification and the like.
  • non-linear data processing algorithm can provide image background normalization coefficient estimates to address reflective and color differences between the test image and the reference image.
  • non-linear data processing algorithm can include various preprocessing operations that can be performed prior to performing the geometric transformation.
  • Illustrative pre-processing operations can include, for example, morphological filtering of the image and spatial image sharpening.
  • the images can be subdivided into a plurality of sectors prior to applying the non-linear data processing algorithm.
  • Illustrative non-linear data processing algorithms can include, for example, particle swarm optimizers, neural networks, genetic algorithms, unsharp masking, image segmentation, morphological filtering and any combination thereof. These types of non-linear data processing algorithms will be familiar to one having ordinary skill in the art. Although certain details in the description that follows are directed to particle swarm optimizers, it is to be recognized that a particle swarm optimizer can be replaced by or used in combination with any suitable non-linear data processing algorithm, including those set forth above.
  • the non-linear data processing algorithm can be a particle swarm optimizer.
  • particle swarm optimization is a computational technique that optimizes a problem by iteratively seeking to improve upon a candidate solution with regard to a given measure of quality.
  • Particle swarm optimization techniques involve moving a population of particles (e.g., state vectors described by the various parameters being fed into a model) through a state space toward a candidate solution according to simple mathematical formulas applied to each particle's state vector.
  • a "state vector" will describe a potential candidate solution for a set of input parameters (both linear parameters and non-linear parameters) that minimizes differences between a reference image and a test image.
  • a two-parameter state vector can be used to describe each particle in a particle swarm.
  • Related two-dimensional state spaces and higher order state spaces are also contemplated by the embodiments described herein.
  • Each particle of a particle swarm has a unique location that corresponds to unique rotation and magnification parameters, for example, in an illustrative two-dimensional state space.
  • the parameters can be used to distort the test image, which can then be compared to the reference image.
  • distortion of the test image can take place by mapping each pixel from the original target space into new locations and then performing a re-sampling of the distorted image to check for convergence.
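The pixel re-mapping and re-sampling step described above might be sketched as follows, using nearest-neighbour interpolation for brevity (the inverse-map interface and the fall-back to a zero background are illustrative assumptions, not details from the disclosure):

```python
def resample_nearest(image, inverse_map):
    """Distort an image by pulling each output pixel from the source
    location given by `inverse_map` (nearest-neighbour re-sampling).

    `inverse_map(x, y)` returns the source (sx, sy) for output pixel
    (x, y); out-of-bounds samples fall back to 0 (background)."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            sx, sy = inverse_map(x, y)
            ix, iy = int(round(sx)), int(round(sy))
            if 0 <= ix < w and 0 <= iy < h:
                out[y][x] = image[iy][ix]
    return out

# Hypothetical mapping: shift the image one pixel to the right.
src = [[1, 2, 3], [4, 5, 6]]
shifted = resample_nearest(src, lambda x, y: (x - 1, y))
```

In practice the distorted result would then be scored against the reference image to check for convergence, as described above.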
  • This comparison can take on several different forms such as, for example, an objective function used by the particle swarm optimizer (e.g., differential entropy, Hamming distance, and/or the like).
  • a particle's movement is influenced by its best known local position, which is influenced by the value of the objective function that is computed during a particular iteration.
  • Each particle is also guided toward the best known positions in the state space, which are continually updated as better positions are found by other particles. That is, the iteratively determined location for a given particle is influenced by (1) its position that gives its minimum objective function value during any previous iteration and (2) the optimal position identified by the particle swarm as provided by the minimization of objective function values across the entire particle swarm.
  • Each iteration is expected to move the particle swarm toward the best global solution for the particle positions. This process can be generalized to as many parameters as required to minimize mapping differences.
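The update rule described above can be sketched in a few lines. The following is a minimal, generic particle swarm optimizer; the two-parameter state vector, the coefficient values, and the toy quadratic objective standing in for an image-difference measure are all illustrative assumptions, not taken from the disclosure:

```python
import random

def particle_swarm_minimize(objective, bounds, n_particles=20, n_iter=60,
                            w=0.7, c1=1.5, c2=1.5, seed=0):
    """Move a swarm of candidate state vectors toward the minimum of
    `objective`.  Each particle's velocity is pulled toward (1) its own
    best known position and (2) the best position found by the swarm."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                   # best position per particle
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # best position of the swarm

    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy objective: minimized at a hypothetical "true" mapping of
# rotation = 5.0 degrees and magnification = 1.2.
objective = lambda p: (p[0] - 5.0) ** 2 + (p[1] - 1.2) ** 2
best, best_val = particle_swarm_minimize(objective, [(-30.0, 30.0), (0.5, 2.0)])
```

In the system described here, the objective would instead score the difference between the distorted test image and the reference image, and the state vector would be extended to as many mapping parameters as required.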
  • a particle swarm optimizer can be an especially useful non-linear data processing algorithm for addressing the time-changing environment across image pairs.
  • the presence of inclusions and background features can be simultaneously evaluated, since each pixel of the test image and the reference image can be compared.
  • an objective function can be computed and recorded.
  • the inclusions form a fixed reference over which the objective function can be minimized as the particle swarm evolves.
  • the time-variant background can convey random noise to the measurement of the objective function, which can be addressed through successive iterations that converge toward the mapping coefficients of the inclusions of interest within the images.
  • the present image processing systems and methods can detect changes in the shape, size and boundary conditions for a plurality of inclusions over a period of time.
  • detection of such changes can involve acquisition of a reference image and then acquisition of at least one test image at a later time.
  • an initial coarse alignment of the plurality of inclusions in the test image can be performed upon the plurality of inclusions in the reference image. By performing an initial coarse alignment of the plurality of inclusions, a more rapid convergence of the non-linear data processing algorithm can be realized when aligning the inclusions.
  • coarse alignment can be performed manually.
  • a hybrid landmark/intensity-based registration method can be used to identify tie-points across each image in order to perform coarse alignment. For example, invariant inclusions on the surface being imaged can be established as markers for performing image alignment. In some embodiments, an optical matched filter can be used in performing the coarse alignment. It should be noted that in the embodiments described herein, the inclusions in the reference image are held fixed, while the inclusions in the test image are transformed to their optimized positions using the non-linear data processing algorithm.
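The coarse-alignment step might be illustrated with a brute-force cross-correlation search, a simple stand-in for the matched-filter approach mentioned above (the image size, search window, and single-inclusion test data are hypothetical):

```python
def coarse_align(reference, test, max_shift=3):
    """Estimate the integer (dy, dx) translation that best aligns `test`
    to `reference` by exhaustive cross-correlation over a small window.

    Images are lists of pixel rows; only the overlapping region is
    scored, which keeps the sketch short."""
    h, w = len(reference), len(reference[0])
    best_shift, best_score = (0, 0), float("-inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            score = 0.0
            for y in range(h):
                for x in range(w):
                    ty, tx = y + dy, x + dx
                    if 0 <= ty < h and 0 <= tx < w:
                        score += reference[y][x] * test[ty][tx]
            if score > best_score:
                best_score, best_shift = score, (dy, dx)
    return best_shift

# A bright "inclusion" at (2, 2) in the reference appears at (4, 3) in
# the test image, i.e., shifted by (dy=2, dx=1).
ref = [[0.0] * 8 for _ in range(8)]
tst = [[0.0] * 8 for _ in range(8)]
ref[2][2] = 1.0
tst[4][3] = 1.0
shift = coarse_align(ref, tst)   # estimated (dy, dx)
```

The resulting coarse estimate would then seed the non-linear data processing algorithm, speeding its convergence as noted above.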
  • an Affine transformation or a Perspective transformation can be used during or subsequent to utilizing the non-linear data processing algorithm.
  • higher order model generalizations can be used in overlaying a test image upon a reference image. The foregoing transformations can account for non-linear parameters in a test image and a reference image and allow sectors of the test image to be deformed onto the reference image, as described in more detail below.
  • an Affine transformation involves a geometric spatial transformation (e.g., rotation, scaling, and/or shear) and a translation (movement) of an inclusion.
  • a generalized Perspective transformation can be used to handle higher dimensional surface topographies.
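An Affine mapping of this kind combines a 2x2 linear part (rotation, scaling, shear) with a translation. A minimal sketch of applying such a mapping to inclusion coordinates follows; the composition order (shear, then rotation and scaling) and the parameter names are illustrative choices, not taken from the disclosure:

```python
import math

def affine_transform(points, angle_deg=0.0, scale=1.0, shear=0.0,
                     tx=0.0, ty=0.0):
    """Apply rotation, isotropic scaling, horizontal shear and then a
    translation (tx, ty) to a list of (x, y) coordinates."""
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    # Linear part M = scale * R(angle) @ Shear(shear).
    m00 = scale * c
    m01 = scale * (c * shear - s)
    m10 = scale * s
    m11 = scale * (s * shear + c)
    return [(m00 * x + m01 * y + tx, m10 * x + m11 * y + ty)
            for x, y in points]

pt = affine_transform([(1.0, 0.0)], angle_deg=90.0)[0]  # ≈ (0.0, 1.0)
```

A Perspective transformation would add a projective division step on top of this linear model to handle higher-dimensional surface topographies.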
  • the image processing device can be operable for subdividing each image into a plurality of sectors and determining a set of mapping coefficients for each of the plurality of sectors.
  • the image processing device can be operable to deform each sector in the test image onto a corresponding sector in the reference image, after determining the set of mapping coefficients for each sector, thereby overlaying the inclusions therein. By deforming each sector in a test image onto a corresponding sector in a reference image, inclusions therein can be overlaid and compared for differences according to some embodiments.
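Subdividing an image into sectors, so that each sector can receive its own set of mapping coefficients, might look like the following sketch (the list-of-rows image representation and the even grid are illustrative simplifications):

```python
def subdivide(image, n_rows, n_cols):
    """Split an image (a list of pixel rows) into an n_rows x n_cols grid
    of sectors, keyed by (row, col), so that a separate set of mapping
    coefficients can be fit to each sector."""
    h, w = len(image), len(image[0])
    sh, sw = h // n_rows, w // n_cols
    sectors = {}
    for r in range(n_rows):
        for c in range(n_cols):
            sectors[(r, c)] = [row[c * sw:(c + 1) * sw]
                               for row in image[r * sh:(r + 1) * sh]]
    return sectors

img = [[y * 10 + x for x in range(6)] for y in range(4)]   # 4x6 toy image
secs = subdivide(img, 2, 3)                                # six 2x2 sectors
```

Each test-image sector would then be deformed onto its counterpart in the reference image using that sector's mapping coefficients.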
  • the image processing device can process both linear parameters and non-linear parameters in overlaying the test image and the reference image.
  • the image processing device can be operable to determine morphological changes that occur in inclusions in the test image relative to the reference image. In some embodiments, these changes can be listed as a signature vector for the inclusions. Attributes of the signature vector can include, for example, changes in aerial size, inclusion spatial asymmetry, inclusion boundary characterization, color changes, and the like.
  • the image processing device can be operable to provide visual depictions of each element of the signature vectors or combined depictions of the elements of the signature vectors as Geographical Information System (GIS) information maps that depict the type and magnitude of changes that exist across each inclusion.
  • linear parameters are the modeling coefficients that describe the linear translation between a test image and a reference image.
  • Linear parameters include vector quantities that describe an inclusion's real position in three-dimensional space, particularly x-, y- and z-coordinates.
  • non-linear parameters are the modeling parameters used in the non-linear data processing algorithm, including, for example, rotation, magnification, shear and the like. Collectively, the linear parameters and the non-linear parameters can alter the apparent real position or appearance of an inclusion in two- and three-dimensional space.
  • the image processing device can process the linear parameters prior to processing the non-linear parameters.
  • the linear parameters of the state vector are easier to address computationally and can be used to achieve a better initial solution for the position of each inclusion.
  • the initial solution can be fed into the non-linear data processing algorithm when the non-linear parameters are processed.
  • the non-linear parameters can be processed to "fine tune" the optimal linear position for the mapping of sectors in the test image onto corresponding sectors in the reference image. This can provide an enhanced non-linear correction.
  • both the linear parameters and the non-linear parameters can be processed in each iteration of the non-linear data processing algorithm.
  • the linear parameters can be processed separately prior to using the non-linear data processing algorithm.
  • only the linear parameters are processed initially by the non-linear data processing algorithm, and the non-linear parameters are temporarily ignored.
  • the non-linear parameters can be processed separately or in combination with the linear parameters.
  • Such initial processing of the linear parameters can advantageously increase processing speed.
  • the non-linear parameters can be initially processed by a processing algorithm that is separate from the non-linear data processing algorithm, before an initial solution for the inclusions' positions is fed into the non-linear data processing algorithm.
  • non-linear parameters are processed using the non-linear data processing algorithm.
  • linear parameters can often be addressed effectively through standard image processing techniques, as noted above.
  • standard techniques can be inefficient when addressing the non-linear parameters related to the images.
  • the non-linear data processing algorithms used in the present embodiments can be particularly adept at addressing the non-linear parameters associated with the geometric transformation used in the non-linear data processing algorithm.
  • the convergence rate can nearly double by having the non-linear data processing algorithm process only the non-linear parameters. In some embodiments, the increase in convergence rate can be even greater.
  • overlay of the test image and the reference image can be iteratively performed for a fixed number of cycles. In other embodiments, overlay of the test image and the reference image can be iteratively performed using the non-linear data processing algorithm until a desired degree of convergence is reached through optimization. In some embodiments, convergence can be determined when an objective function within the test image is minimized or a difference of the objective function is minimized between iterations. That is, in such embodiments, convergence can be determined when the error (as measured by the change in objective function between iterations) between the test image and the reference image is minimized.
  • Illustrative objective functions can include, for example, image entropy, Hamming distance, gray level per band, mutual information estimation, and any combination thereof.
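Two of the listed objective functions are easy to sketch: the entropy of a difference image (a well-aligned difference image is mostly flat, so lower entropy suggests better registration) and the Hamming distance between binarized images. Both implementations below are generic textbook forms, not taken from the disclosure:

```python
import math
from collections import Counter

def image_entropy(pixels):
    """Shannon entropy (bits) of a flat list of quantized pixel values."""
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def hamming_distance(a, b):
    """Number of positions at which two equally sized, binarized images
    disagree."""
    return sum(x != y for x, y in zip(a, b))

aligned_diff = [0, 0, 0, 0, 0, 0, 0, 1]     # nearly flat residual
misaligned_diff = [0, 3, 1, 2, 0, 3, 1, 2]  # structured residual
```

Here `image_entropy(aligned_diff)` comes out lower than `image_entropy(misaligned_diff)`, which is the behavior an optimizer would exploit when driving the mapping coefficients toward a global minimum.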
  • the non-linear data processing algorithm can be used to find a global minimum across each sector by adjusting the mapping coefficients. Once the optimal values for the mapping coefficients have been determined, any remaining differences can be characterized in terms of morphological changes in the inclusions within an image or due to residual alignment error.
  • the inclusion of non-linear parameters advantageously can provide better registration and change sensitivity detection between corresponding sectors within a test image and a reference image. When only linear parameters are processed to effect registration, higher levels of systematic error can be introduced.
  • processing can be performed until mapping coefficient estimates and/or objective function estimates in successive iterations differ by less than a user defined value. It is to be recognized that a desired degree of convergence will vary depending upon the intended application in which the image analysis system is used. Some applications may require a tighter convergence, while others will require less.
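The stopping rule just described, iterating until successive objective-function estimates differ by less than a user-defined value or until a cycle cap is reached, might be sketched as follows; the function name, the toy `step` callable, and the default tolerance are illustrative assumptions, not part of the disclosure:

```python
def iterate_until_converged(step, x0, tol=1e-6, max_cycles=100):
    """Run `step` (one optimizer iteration) until the objective changes
    by less than `tol` between iterations, or the cycle cap is reached.

    `step` maps a state to (new_state, objective).  Illustrative sketch.
    """
    state, prev_obj = x0, float("inf")
    for _ in range(max_cycles):
        state, obj = step(state)
        if abs(prev_obj - obj) < tol:
            break
        prev_obj = obj
    return state, obj

# Toy "optimizer": the residual error halves each cycle.
final, obj = iterate_until_converged(lambda e: (e / 2, e / 2), 1.0)
print(obj)  # small residual, below the 1e-6 tolerance for this toy
```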
  • the sectors in the test image and the reference image are substantially identical in size. In other embodiments, the sectors in the test image can be larger than the sectors in the reference image. Advantages of making the sectors in the test image larger can include allowing any residual error in sector positions remaining after the linear parameters are initially processed to be adequately compensated for when the non-linear parameters are processed using the non-linear data processing algorithm.
  • any non-zero entropy difference either represents morphological changes in the inclusion(s) over time or residual alignment error from the non-linear data processing algorithm.
  • image processing device is operable to determine any differences between the test image and the reference image for each inclusion after the overlay has been performed.
  • image comparison on an inclusion-by- inclusion basis can be performed by visual inspection after the overlay has been performed.
  • image comparison can be performed by the image processing device (e.g., a computer or graphical processing unit) on a regional- or pixel- based basis.
  • factors that can influence the overlay efficiency and the accurate determination of a difference output include, for example, the ability to correct for global or local background alterations and local surface deformation about each inclusion.
  • test image and the reference image can take place in any order. That is, in various embodiments, the test image can be acquired either before or after the reference image.
  • the processes described herein can provide mapping coefficients regardless of the acquisition order or of whether the roles of the images are interchanged.
  • FIGURE 2 shows an illustrative flowchart demonstrating how time sequence images can be overlaid in a particular embodiment.
  • the non-linear data processing algorithm is a particle swarm optimizer.
  • operation 200 a reference image is acquired at a first time.
  • a particle swarm model can be applied in operation 201 in order to generate a population of synthetic images in operation 202 that provides objective function information 203, which can later be used in analyzing a test image. This operation can provide an initial topography assessment of the state space.
  • a test image is acquired in operation 204.
  • a convergence check 205 is applied to test the goodness of fit of the inclusion overlay in the test image and the reference image.
  • the comparison between images can take place over the entire image or between sub-image sectors within the entire image.
  • Objective function information 203 can include differential entropy between the test image (or sector) and a reference image (or sector). If the overlay has not converged to a desired degree, the particle swarm model can be applied again, and the convergence check repeated.
  • the parameters of the inclusions in the test image become part of the objective function information 203 that is used in further assessing the goodness of fit for each inclusion.
  • Operation 206 can involve a deformation of sectors containing the inclusions in the reference image using a geometric transformation (e.g., an Affine transformation or a Perspective transformation) in some embodiments.
  • changes in the inclusions between the test image and the reference image can be assessed in operation 207, and an output illustrating the differences for each inclusion can be produced in operation 208.
  • all the inclusions are illustrated in the output.
  • the output can be filtered such that only inclusions having selected physical attributes (e.g., size, color, and/or aspect ratio) are indicated as being changed between the test image and the reference image.
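As a hedged illustration of the particle swarm optimizer that drives the flow above, the sketch below minimizes a toy misalignment objective over two mapping coefficients (a dx, dy translation). All names, swarm parameters, and the toy objective are assumptions for illustration, not the patented method:

```python
import random

def pso_minimize(objective, dim, n_particles=20, iters=60,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Minimal particle swarm optimizer over `dim` mapping coefficients.

    Returns the best coefficient vector found and its objective value.
    Illustrative sketch; swarm parameters are typical textbook values.
    """
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Inertia + pull toward personal best + pull toward global best.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy objective: squared misalignment from a "true" offset (dx, dy) = (2, -1).
random.seed(0)
best, err = pso_minimize(lambda p: (p[0] - 2.0) ** 2 + (p[1] + 1.0) ** 2,
                         dim=2)
print(best, err)  # best near [2.0, -1.0], err near 0
```

In the flow of FIGURE 2, the objective evaluated for each particle would be a registration measure such as the differential entropy between the warped test sector and the reference sector rather than this toy quadratic.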
  • FIGURE 3 shows an illustrative flowchart demonstrating how time sequence images can be overlaid in another particular embodiment.
  • reference image data and test image data can be collected in operations 301 and 304, respectively, and partitioned into sectors in operations 302 and 305. Morphological filtering of the images can then take place in operations 303 and 306, which can remove background clutter from the images. Thereafter, a "quick-look" difference of the reference image and the test image can be performed in operation 307. Spatial image sharpening of the test image and the reference image can be performed in operation 308. Processing of linear image parameters can then be used to produce a translational estimation for each sector of the image overlay in operation 309.
  • a sector translation vector assessment can be generated for each sector in operation 310, followed by re-dicing of the test sectors of the original test image in operation 311. Based upon the estimated translational differences, a revised test image partition can be generated in operation 312. Any of the foregoing operations can be performed iteratively in order to achieve a desired degree of convergence for the translational overlay of the test image and the reference image.
  • a particle swarm optimizer can be used in operation 313 to further refine the positions of the inclusions within the various sectors. Thereafter, the test image and the reference image can be registered in operation 314 and a change assessment in the images can be performed in operation 315. Again, any of the operations for processing the nonlinear parameters can also be processed iteratively to achieve a desired degree of convergence. An output can be produced in the form of a change map output in operation 316.
  • FIGURES 4A - 4D show an illustrative series of images before and after alignment using the present image analysis systems and methods, and the corresponding difference images produced in each case.
  • FIGURES 4A and 4B show illustrative test and reference images of a mole inclusion before and after alignment, respectively.
  • FIGURE 4C shows an illustrative difference image of the misaligned images in FIGURE 4A.
  • FIGURE 4D shows an illustrative difference image of the aligned images in FIGURE 4B.
  • the difference image of FIGURE 4C might be interpreted by the image analysis system as a significant change.
  • FIGURE 5A shows an illustrative 4D scatter plot of mapping coefficients for four parameters (translation, rotation, magnification and background color) before processing with a particle swarm optimizer.
  • FIGURES 5B - 5D show illustrative 2D scatter plots of rotation, magnification and translation parameters before processing with a particle swarm optimizer.
  • FIGURES 5E - 5H show illustrative plots corresponding to those of FIGURES 5A - 5D illustrating the convergence of mapping coefficients after processing with the particle swarm optimizer.
  • the image collection device can acquire a visual image such as a photograph.
  • the image collection device can be a camera.
  • image collection devices other than visual image collection devices can be used.
  • confocal microscopes, magnetic imaging devices (e.g., MRI), hyperspectral imaging devices, multispectral imaging devices, thermal sensing devices, polarimetric sensing devices, radiometric sensing devices, and any other like sensing device can be used. That is, the present image analysis systems and methods are not limited to the analysis of inclusions contained within visual images.
  • more than one image collection device can be used in overlaying the inclusions in the test image with those in the reference image.
  • a combination of a visual image and a thermal image might be used to produce a more accurate overlay.
  • the visual image might not be significantly changed between a test image and a reference image, but a thermal property of the inclusion might be altered between the two.
  • Other combinations of visual and non-visual imaging techniques or between various non- visual imaging techniques can be envisioned by one having ordinary skill in the art.
  • the present image analysis systems and methods can produce an output via at least one data output device.
  • Suitable data output devices can include, for example, computer monitors, printers, electronic storage devices and the like.
  • the image processing device can produce a difference image at the data output device that highlights any significant changes between the test image and the reference image for any of the inclusions therein.
  • Image differencing produces a scalar quantity.
  • Vector quantities can be utilized in image comparison as well.
  • morphological changes in a test image can be represented in the form of a state vector where elements of the state vector correspond to changes in inclusion size, color, geometry and border characteristics. This information can then be presented to a user of the present systems in the form of a Geographical Information System (GIS) where two-dimensional image planes represent the magnitude of each vector component.
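A state vector of the kind described above might be modeled as follows; the attribute names, units, and magnitude rendering are hypothetical, chosen only to make the vector representation concrete:

```python
from dataclasses import dataclass

@dataclass
class InclusionChange:
    """Hypothetical state vector for one inclusion's change between a
    reference image and a test image; attribute names and units are
    illustrative, not taken from the disclosure."""
    d_size: float      # change in area (e.g., mm^2)
    d_color: float     # change in mean gray level or color index
    d_geometry: float  # change in a shape descriptor (e.g., eccentricity)
    d_border: float    # change in a border-irregularity score

    def magnitude(self):
        # Euclidean magnitude across the vector components, as might be
        # rendered as a GIS-style two-dimensional image plane.
        return (self.d_size ** 2 + self.d_color ** 2
                + self.d_geometry ** 2 + self.d_border ** 2) ** 0.5

change = InclusionChange(d_size=3.0, d_color=0.0, d_geometry=4.0, d_border=0.0)
print(change.magnitude())  # 5.0
```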
  • the image processing devices described herein can contain a computer.
  • the image processing devices can utilize a graphical processing unit.
  • Such graphical processing units can be part of a computer or they can be a standalone module, if desired.
  • Computers and graphical processing units can utilize any of the previously described computer hardware, software, or other like processing components known in the art.
  • image analysis systems described herein include at least one image collection device, an image processing device operating a particle swarm optimizer, and at least one data output device.
  • the image processing device is operable to overlay a test image and a reference image upon one another and perform a comparison therebetween by processing both linear parameters and non-linear parameters, where each image contains a plurality of inclusions.
  • the test image and the reference image can be subdivided into a plurality of sectors, where each sector contains at least one inclusion.
  • methods for overlaying and analyzing images containing a plurality of inclusions include acquiring a reference image containing a plurality of inclusions, acquiring a test image containing the plurality of inclusions, overlaying the plurality of inclusions in the test image upon the plurality of inclusions in the reference image by using a non-linear data processing algorithm, and producing an output that illustrates any differences for each inclusion between the test image and the reference image after overlaying takes place.
  • the plurality of inclusions can be located on a deformable surface. In other embodiments, the plurality of inclusions can be located on a rigid surface.
  • the methods can further include performing a coarse alignment of the plurality of inclusions in the test image upon the plurality of inclusions in the reference image, prior to using the non-linear data processing algorithm.
  • performing a coarse alignment can be further facilitated by positioning the at least one image collection device and the area being imaged into a standard orientation. For example, a patient being imaged may be requested to stand or sit in a specified orientation from image to image. By employing a standard orientation of the image collection device(s) and the area being imaged, it can be possible to orient the plurality of inclusions in the test image as close as possible to their "correct" positions by minimizing translational-type errors and image processing device alignment-type errors.
  • the present methods can involve dividing the reference image into a plurality of sectors. By performing this operation, the optimal orientation parameters for the image collection device(s) can be determined for each reference sector prior to the analysis of a corresponding sector in the test image. Thus, the local topography about each inclusion in the test image can be initially assessed prior to application of the non-linear data processing algorithm for analyzing the test image.
  • the sectors can be uniform in size. In some embodiments, the sectors can be variable in size. In some embodiments, each sector can contain at least one inclusion. In some embodiments, the sectors are small relative to the overall image space, such that they are substantially rigid on a local basis about each inclusion.
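Dividing an image into sectors, as described above, can be sketched for the uniform-size case as follows; the function name and grid layout are illustrative assumptions:

```python
def partition_into_sectors(image, sector_h, sector_w):
    """Divide a 2D image (list of rows) into a grid of sectors.

    Each sector is a 2D sub-image of at most sector_h x sector_w pixels
    (edge sectors may be smaller).  Uniform-size case; illustrative only.
    """
    h, w = len(image), len(image[0])
    return [[[row[x:x + sector_w] for row in image[y:y + sector_h]]
             for x in range(0, w, sector_w)]
            for y in range(0, h, sector_h)]

# A 4x4 test image numbered 0..15, split into four 2x2 sectors.
img = [[r * 4 + c for c in range(4)] for r in range(4)]
sectors = partition_into_sectors(img, 2, 2)
print(sectors[0][1])  # top-right sector: [[2, 3], [6, 7]]
```

Keeping each sector small relative to the overall image space supports the assumption, noted above, that the surface is substantially rigid on a local basis about each inclusion.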
  • the present methods can further include analyzing the reference image using linear parameters to determine an initial topography solution for the test image. As noted above, determination of an initial topography solution for the test image can enhance the convergence rate of the non-linear data processing algorithm.
  • the present methods can further include determining mapping coefficients for the inclusions in the test image and/or the reference image.
  • the linear parameters can be processed before the nonlinear parameters. In some embodiments, only the non-linear parameters are processed using the non-linear data processing algorithm. In some embodiments, an initial optimization of the linear parameters can be fed into the non-linear data processing algorithm and processed with the non-linear parameters. In some embodiments, both linear parameters and non-linear parameters can be used to overlay the sectors in the test image upon the corresponding sector in the reference image.
  • overlaying can be performed iteratively until a desired degree of convergence is reached. In some embodiments, overlaying can be performed iteratively until a fixed number of cycles have been conducted. In some embodiments, a desired degree of convergence can be based upon a rate or amount of change of the mapping coefficients estimated in successive iterations. In some embodiments, the desired degree of convergence can be based upon a minimization of an objective function for the plurality of sectors within a test image, or a difference thereof between successive iterations. In some embodiments, the desired degree of convergence can be based upon minimization of an objective function obtained from a difference image generated after overlaying the test image and the reference image.
  • the present methods can further include deforming each sector of the test image onto a corresponding sector of the reference image.
  • each sector can be deformed using an Affine transformation or a Perspective transformation.
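A minimal sketch of such a sector deformation, assuming a pull-back (output-to-source) affine convention with nearest-neighbour sampling; both choices are illustrative assumptions, since the disclosure does not fix an implementation:

```python
def affine_warp(sector, coeffs, fill=0):
    """Warp a 2D sector (list of rows) under an affine map.

    `coeffs` = (a, b, tx, c, d, ty) sends each OUTPUT pixel (x, y) to the
    SOURCE location (a*x + b*y + tx, c*x + d*y + ty); sampling is
    nearest-neighbour, with `fill` used outside the sector.  Illustrative
    sketch; a Perspective transformation would add a projective divisor.
    """
    a, b, tx, c, d, ty = coeffs
    h, w = len(sector), len(sector[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            sx = round(a * x + b * y + tx)
            sy = round(c * x + d * y + ty)
            row.append(sector[sy][sx] if 0 <= sx < w and 0 <= sy < h else fill)
        out.append(row)
    return out

# Pure translation: each output pixel pulls from one column to its left,
# shifting the sector's content one pixel to the right.
sector = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
print(affine_warp(sector, (1, 0, -1, 0, 1, 0)))
# [[0, 1, 2], [0, 4, 5], [0, 7, 8]]
```

The six coefficients here play the role of the mapping coefficients that the non-linear data processing algorithm adjusts for each sector.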
  • the output of the present methods can be filtered.
  • the output can be filtered such that only inclusions having selected physical attributes are indicated as being changed between the test image and the reference image.
  • Where compositions and methods are described in terms of "comprising," "containing," or "including" various components or steps, the compositions and methods can also "consist essentially of" or "consist of" the various components and operations. All numbers and ranges disclosed above can vary by some amount. Whenever a numerical range with a lower limit and an upper limit is disclosed, any number and any subrange falling within the broader range are specifically disclosed. Also, the terms in the claims have their plain, ordinary meaning unless otherwise explicitly and clearly defined by the patentee. If there is any conflict in the usage of a word or term in this specification and one or more patents or other documents that may be incorporated herein by reference, the definitions that are consistent with this specification should be adopted.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Probability & Statistics with Applications (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

Disclosed herein are image analysis systems that use a non-linear data processing algorithm for overlaying and comparing time-sequence images. The image analysis systems can include at least one image collection device, an image processing device operating a non-linear data processing algorithm, and at least one data output device. The image processing device can be operable to overlay a test image and a reference image upon one another and to perform a comparison therebetween. Both linear and non-linear parameters can be processed by the image processing device in performing the overlay. Also disclosed are methods for overlaying a test image upon a reference image using a non-linear data processing algorithm.
PCT/US2011/044746 2010-07-20 2011-07-20 Image analysis systems using non-linear data processing techniques and methods of using same Ceased WO2012012576A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2013520848A JP2013536500A (ja) 2010-07-20 2011-07-20 Image analysis system using non-linear data processing technology and method of using same
EP11810372.0A EP2596455A1 (fr) 2010-07-20 2011-07-20 Image analysis systems using non-linear data processing techniques and methods of using same
AU2011281065A AU2011281065A1 (en) 2010-07-20 2011-07-20 Image analysis systems using non-linear data processing techniques and methods using same
US13/672,530 US20130188878A1 (en) 2010-07-20 2012-11-08 Image analysis systems having image sharpening capabilities and methods using same

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US43480610P 2010-07-20 2010-07-20
US201161434806P 2011-01-20 2011-01-20
US61/434,806 2011-01-20

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/187,447 Continuation-In-Part US20120020573A1 (en) 2010-07-20 2011-07-20 Image analysis systems using non-linear data processing techniques and methods using same

Publications (1)

Publication Number Publication Date
WO2012012576A1 true WO2012012576A1 (fr) 2012-01-26

Family

ID=45497170

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2011/044746 Ceased WO2012012576A1 (fr) Image analysis systems using non-linear data processing techniques and methods of using same

Country Status (1)

Country Link
WO (1) WO2012012576A1 (fr)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014014647A (ja) * 2012-07-09 2014-01-30 Toshiba Corp 医用画像処理装置及び医用画像処理プログラム
JPWO2014027522A1 (ja) * 2012-08-17 2016-07-25 ソニー株式会社 画像処理装置、画像処理方法、プログラムおよび画像処理システム
JP2016174671A (ja) * 2015-03-19 2016-10-06 株式会社ヒューマン・エンジニアリング 判定装置および判定プログラム
US9569850B2 (en) 2013-10-16 2017-02-14 Cognex Corporation System and method for automatically determining pose of a shape
US10643332B2 (en) 2018-03-29 2020-05-05 Uveye Ltd. Method of vehicle image comparison and system thereof
US10650530B2 (en) 2018-03-29 2020-05-12 Uveye Ltd. Method of vehicle image comparison and system thereof
CN118350406A (zh) * 2024-06-17 2024-07-16 中国华西工程设计建设有限公司 一种软土区路基沉降预测方法及系统

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050196047A1 (en) * 2004-02-03 2005-09-08 Yuri Owechko Object recognition system incorporating swarming domain classifiers
US20050259882A1 (en) * 2004-05-18 2005-11-24 Agfa-Gevaert N.V. Method for automatically mapping of geometric objects in digital medical images
US20090226052A1 (en) * 2003-06-21 2009-09-10 Vincent Fedele Method and apparatus for processing biometric images


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
CHEN ET AL.: "Multimodal Medical Image Registration Using Particle Swarm Optimization", IEEE COMPUTER SOCIETY, 2008, pages 127 - 131, XP031368603, Retrieved from the Internet <URL:http://www.iipl.is.ritsumei.ac.jp/2008/PSO_registration_Chen.pdf> [retrieved on 20111104] *
WACHOWIAK ET AL.: "An Approach to Multimodal Biomedical Image Registration Utilizing Particle Swarm Optimization.", IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, vol. 8, no. 3, June 2004 (2004-06-01), pages 289 - 301, XP011113794, Retrieved from the Internet <URL:http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.3.2918&rep=rep1&type=pdf> [retrieved on 20111104] *
ZHANG ET AL.: "Multi-Object Tracking via Species Based Particle Swarm Optimization.", IEEE 12TH INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS., 2009, pages 1105 - 1112, XP031664594, Retrieved from the Internet <URL:http://www.nlpr.ia.ac.cn/2009papers/gjhy/gh51.pdf> [retrieved on 20111104] *


Similar Documents

Publication Publication Date Title
US20120020573A1 (en) Image analysis systems using non-linear data processing techniques and methods using same
US20130188878A1 (en) Image analysis systems having image sharpening capabilities and methods using same
Liu et al. Deformable registration of cortical structures via hybrid volumetric and surface warping
WO2012012576A1 (fr) Image analysis systems using non-linear data processing techniques and methods of using same
US20090010507A1 (en) System and method for generating a 3d model of anatomical structure using a plurality of 2d images
CN107007267A (zh) Method, device and system for analyzing thermal images
JP6273291B2 (ja) Image processing apparatus and method
CN108830852B (zh) Three-dimensional ultrasonic tumor auxiliary measurement system and method
WO2012067648A1 (fr) System for acquiring, storing and evaluating surface data
EP1844440A1 (fr) Method and software product for registering biomedical images with reduced imaging artifacts due to object motion
Kretschmer et al. ADR-anatomy-driven reformation
Casti et al. Automatic detection of the nipple in screen-film and full-field digital mammograms using a novel Hessian-based method
Unger et al. Method for accurate registration of tissue autofluorescence imaging data with corresponding histology: a means for enhanced tumor margin assessment
WO2013070945A1 (fr) Image analysis systems having image sharpening capabilities and methods using same
Alam et al. Evaluation of medical image registration techniques based on nature and domain of the transformation
US8577101B2 (en) Change assessment method
Landi et al. Applying geometric morphometrics to digital reconstruction and anatomical investigation
Tan et al. Chest wall segmentation in automated 3D breast ultrasound scans
EP3044761B1 (fr) Procédé et appareil
KR102373987B1 (ko) Template-based hippocampal subfield atrophy analysis method in Alzheimer's disease and normal aging
Davatzikos Measuring biological shape using geometry-based shape transformations
Afzali et al. Inter-patient modelling of 2D lung variations from chest X-ray imaging via Fourier descriptors
EP3624058A1 (fr) Method and system for symmetry analysis from image data
Roy et al. A useful approach towards 3D representation of brain abnormality from its 2D MRI slides with a volumetric exclamation
Mosaliganti et al. An imaging workflow for characterizing phenotypical change in large histological mouse model datasets

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11810372

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2013520848

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2011810372

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2011281065

Country of ref document: AU

Date of ref document: 20110720

Kind code of ref document: A