WO2025017545A1 - Mammography image display system, method and computer program product configured to display a synthetic image which simulates a with-contrast image - Google Patents
Mammography image display system, method and computer program product configured to display a synthetic image which simulates a with-contrast image
- Publication number
- WO2025017545A1 (PCT/IL2024/050537)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- images
- image
- contrast
- mammography
- simulations
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/90—Dynamic range modification of images or parts thereof
- G06T5/92—Dynamic range modification of images or parts thereof based on global image properties
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/60—Image enhancement or restoration using machine learning, e.g. neural networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01R—MEASURING ELECTRIC VARIABLES; MEASURING MAGNETIC VARIABLES
- G01R33/00—Arrangements or instruments for measuring magnetic variables
- G01R33/20—Arrangements or instruments for measuring magnetic variables involving magnetic resonance
- G01R33/44—Arrangements or instruments for measuring magnetic variables involving magnetic resonance using nuclear magnetic resonance [NMR]
- G01R33/48—NMR imaging systems
- G01R33/54—Signal processing systems, e.g. using pulse sequences ; Generation or control of pulse sequences; Operator console
- G01R33/56—Image enhancement or correction, e.g. subtraction or averaging techniques, e.g. improvement of signal-to-noise ratio and resolution
- G01R33/5608—Data processing and visualization specially adapted for MR, e.g. for feature analysis and pattern recognition on the basis of measured MR data, segmentation of measured MR data, edge contour detection on the basis of measured MR data, for enhancing measured MR data in terms of signal-to-noise ratio by means of noise filtering or apodization, for enhancing measured MR data in terms of resolution by means for deblurring, windowing, zero filling, or generation of gray-scaled images, colour-coded images or images displaying vectors instead of pixels
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30068—Mammography; Breast
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30096—Tumor; Lesion
Definitions
- the present invention relates generally to medical imagery, and more particularly to use of contrast agents in medical imagery.
- Contrast-enhanced mammography (CEM), aka contrast-enhanced spectral mammography (CESM), aka contrast-enhanced digital mammography (CEDM), administers contrast, e.g., iodinated intravenous (IV) contrast, before performing a mammogram.
- CEM is considered a possible test for assessment of breast lesions, inter alia, when mammography and ultrasound are not conclusive, or MRI is contraindicated or not available, e.g., as described here: https://www.sciencedirect.com/science/article/pii/S0960977620301326
- Contrast CT, aka contrast-enhanced computed tomography, uses radiocontrast, which is generally iodine-based, to highlight anatomy, e.g., blood vessels, that, absent a contrast agent, is hard to distinguish from its surroundings.
- a generative adversarial network typically comprises two neural networks: a generator which learns to generate plausible “fake” or simulated or synthetic data, and a discriminator e.g., classifier.
- the discriminator may have any network architecture, typically depending on the type of data which is being classified.
- the instances provided by the generator serve as negative training examples for the discriminator which learns to distinguish the fake data, provided by the generator, from real data.
- the discriminator penalizes the generator when the generator produces results which are implausible.
- generator output is directly connected to discriminator input.
- Backpropagation involves a signal provided by the discriminator’s classification to the generator, which, responsively, updates its weights.
- the discriminator connects to two loss functions (generator loss and discriminator loss). During its own training, the discriminator ignores generator loss and uses discriminator loss. Typically, during discriminator training, the discriminator classifies real data and fake data, and the discriminator loss penalizes or punishes the discriminator for classifying real data as fake, or vice versa. The discriminator updates its weights through backpropagation from the discriminator loss via the discriminator network.
- during generator training, the generator loss is used.
- GAN training proceeds in alternating periods; thus, the discriminator may train for one or more epochs, followed by epoch/s of generator training, and so forth, until the generator and discriminator networks are trained.
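- By way of illustration, the alternating training just described may be sketched as follows; this is a minimal, non-limiting sketch assuming PyTorch, in which the architectures, learning rates, and per-batch (rather than per-epoch) alternation are illustrative placeholders rather than part of this disclosure:

    # Minimal GAN training-loop sketch (PyTorch assumed; all names illustrative).
    import torch
    import torch.nn as nn

    def train_gan(generator, discriminator, loader, epochs=10, device="cpu"):
        bce = nn.BCEWithLogitsLoss()
        opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
        opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
        for _ in range(epochs):
            for real in loader:                   # batch of real data
                real = real.to(device)
                fake = generator(torch.randn(real.size(0), 100, device=device))
                ones = torch.ones(real.size(0), 1, device=device)
                zeros = torch.zeros(real.size(0), 1, device=device)

                # Discriminator step: uses the discriminator loss only, which
                # penalizes classifying real data as fake, or vice versa.
                opt_d.zero_grad()
                loss_d = bce(discriminator(real), ones) + \
                         bce(discriminator(fake.detach()), zeros)
                loss_d.backward()
                opt_d.step()

                # Generator step: uses the generator loss; the discriminator's
                # classification backpropagates a signal through which the
                # generator updates its weights.
                opt_g.zero_grad()
                loss_g = bce(discriminator(fake), ones)
                loss_g.backward()
                opt_g.step()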
- Histogram equalization is a known image processing method used to improve image contrast.
- SPADE (Spatially-Adaptive Normalization)
- Semantic segmentation is known and is described online e.g. here: superannotate.com/blog/guide-to-semantic-segmentation.
- Certain embodiments seek to provide a mammography image display system which is configured to display a synthetic mammogram of a breast as though the image had been captured after administration e.g., injection of a dosage D of a contrast agent A.
- the synthetic mammogram may be generated by a system trained e.g., as per herein, from a mammogram captured without administering a contrast agent, or captured after administering a different contrast agent (other than A), or captured after administering a dosage d < D of contrast agent A.
- Certain embodiments seek to digitally clean a mammography image.
- Certain embodiments seek to use image processing to facilitate decision-making by humans which conventionally relies at least partly on examining contrast-enhanced medical imagery.
- the terms “contrast-enhanced” and “with-iodine” and “with-contrast” may be interchanged herein.
- Embodiments herein are described in terms of mammography of a breast by way of example; more generally, any reference herein to a breast may be replaced by a reference to any other body portion and any reference to mammography may be replaced by a reference to any other imaging technology which employs a contrast agent.
- circuitry typically comprising at least one processor in communication with at least one memory, with instructions stored in such memory executed by the processor to provide functionalities which are described herein in detail. Any functionality described herein may be firmware- implemented or processor-implemented, as appropriate.
- any reference herein to, or recitation of, an operation being performed is, e.g., if the operation is performed at least partly in software, intended to include both an embodiment where the operation is performed in its entirety by a server A, and also to include any type of “outsourcing” or “cloud” embodiments in which the operation, or portions thereof, is or are performed by a remote processor P (or several such), which may be deployed off-shore or “on a cloud”, and an output of the operation is then communicated to, e.g., over a suitable computer network, and used by, server A.
- the remote processor P may not, itself, perform all of the operations, and, instead, the remote processor P itself may receive output/s of portion/s of the operation from yet another processor/s P', which may be deployed off-shore relative to P, or “on a cloud”, and so forth.
- Embodiment 1 A method for providing decision support e.g. to human experts examining medical images, the method comprising: using a hardware processor to generate simulations of with-contrast images; and/or using an image display system to display the simulations of with-contrast images.
- Embodiment 2 The method of any of the preceding embodiments wherein the images comprise mammography images and the image display system comprises a mammography image display system or mammography monitor which may use DICOM Image Format.
- Embodiment 3 The method of any of the preceding embodiments wherein the mammography images comprise 2D mammography images.
- Embodiment 4 The method of any of the preceding embodiments wherein the mammography images comprise 3D mammography images.
- Embodiment 5 The method of any of the preceding embodiments wherein the with-contrast images are generated after first administering iodine to a subject whose body is being imaged.
- Embodiment 6 The method of any of the preceding embodiments and also comprising administering the iodine to the subject whose body is being imaged.
- Embodiment 7 The method according to any of the preceding embodiments wherein a generative neural network is used to generate the simulations.
- Embodiment 8 The method according to any of the preceding embodiments wherein the generative neural network is trained on image pairs including a first image of a given body portion which is not enhanced with any contrast agent, and a second image of the given body portion which is with-contrast, wherein the with-contrast image is captured by imaging the body portion after a contrast agent, e.g., dye, has been administered to the patient.
- Embodiment 9 The method of any of the preceding embodiments wherein the with-contrast image is captured by performing mammography on a patient to whom contrast dye has been administered.
- Embodiment 10 A decision support system serving human experts examining medical images, the system comprising: a hardware processor generating simulations of with-contrast images; and a computer display receiving the simulations of with-contrast images and displaying the simulations to at least one human expert, thereby to provide decision support to the human expert.
- Embodiment 11 The method of any of the preceding embodiments wherein the images comprise ultrasound images.
- Embodiment 12 The method of any of the preceding embodiments wherein the images comprise CT images.
- Embodiment 13 The method of any of the preceding embodiments wherein the images comprise MRI images.
- Embodiment 14 The method of any of the preceding embodiments wherein the simulations of with-contrast images are generated computationally rather than by actually administering a contrast agent to a patient, and subsequently capturing an image of a body part of interest.
- Embodiment 15 The method of any of the preceding embodiments wherein the simulations of the with-contrast images are generated from without-contrast images.
- Embodiment 16 The method of any of the preceding embodiments wherein the simulations of the with-contrast images simulate images captured after administration of a first dosage D of a contrast agent, and wherein the simulations of the with-contrast images are generated from and/or trained on images captured after administration of a second dosage d < D of the contrast agent. It is appreciated that the simulations of the with-contrast images may simulate images captured after administration of a first contrast agent and may be generated from and/or trained on images captured after administration of a second (different) contrast agent.
- Embodiment 17 A system comprising at least one hardware processor configured to carry out the operations of any of the methods shown and described herein.
- Embodiment 18 A computer program product, comprising a non-transitory tangible computer readable medium having computer readable program code embodied therein, the computer readable program code adapted to be executed to implement a method for providing decision support to human experts examining medical images, the method comprising: using a hardware processor to generate simulations of with-contrast images; and using an image display system to display the simulations of with-contrast images.
- Embodiment 19 The method of any of the preceding embodiments wherein the images comprise tomography images.
- a computer program comprising computer program code means for performing any of the methods shown and described herein when the program is run on at least one computer; and a computer program product, comprising a typically non-transitory computer-usable or -readable medium, e.g., non-transitory computer-usable or -readable storage medium, typically tangible, having a computer readable program code embodied therein, the computer readable program code adapted to be executed to implement any or all of the methods shown and described herein.
- the operations in accordance with the teachings herein may be performed by at least one computer specially constructed for the desired purposes, or a general purpose computer specially configured for the desired purpose by at least one computer program stored in a typically non-transitory computer readable storage medium.
- the term "non-transitory” is used herein to exclude transitory, propagating signals or waves, but to otherwise include any volatile or non-volatile computer memory technology suitable to the application.
- processor/s, display and input means may be used to process, display, e.g., on a computer screen or other computer output device, store, and accept information such as information used by or generated by any of the methods and apparatus shown and described herein; the above processor/s, display and input means including computer programs, in accordance with all or any subset of the embodiments of the present invention.
- any or all functionalities of the invention shown and described herein, such as but not limited to operations within flowcharts, may be performed by any one or more of: at least one conventional personal computer processor, workstation or other programmable device or computer or electronic computing device or processor, either general-purpose or specifically constructed, used for processing; a computer display screen and/or printer and/or speaker for displaying; machine-readable memory such as flash drives, optical disks, CDROMs, DVDs, BluRays, magneto-optical discs or other discs; RAMs, ROMs, EPROMs, EEPROMs, magnetic or optical or other cards, for storing; and keyboard or mouse for accepting.
- Modules illustrated and described herein may include any one or combination or plurality of: a server, a data processor, a memory/computer storage, a communication interface (wireless (e.g., BLE) or wired (e.g., USB)), a computer program stored in memory/computer storage.
- the term “processing” is intended to include any type of computation or manipulation or transformation of data represented as physical, e.g., electronic, phenomena which may occur or reside, e.g., within registers and/or memories of at least one computer or processor.
- the term “processor” is intended to include a plurality of processing units which may be distributed or remote.
- the term “server” is intended to include plural typically interconnected modules running on plural respective servers, and so forth.
- the above devices may communicate via any conventional wired or wireless digital communication means, e.g., via a wired or cellular telephone network or a computer network such as the Internet.
- the apparatus of the present invention may include, according to certain embodiments of the invention, machine readable memory containing or otherwise storing a program of instructions which, when executed by the machine, implements all or any subset of the apparatus, methods, features, and functionalities of the invention shown and described herein.
- the apparatus of the present invention may include, according to certain embodiments of the invention, a program as above which may be written in any conventional programming language, and optionally a machine for executing the program, such as but not limited to a general-purpose computer which may optionally be configured or activated in accordance with the teachings of the present invention. Any of the teachings incorporated herein may, wherever suitable, operate on signals representative of physical objects or substances.
- terms such as, “processing”, “computing”, “estimating”, “selecting”, “ranking”, “grading”, “calculating”, “determining”, “generating”, “reassessing”, “classifying”, “producing”, “stereo-matching”, “registering”, “detecting”, “associating”, “superimposing”, “obtaining”, “providing”, “accessing”, “setting” or the like refer to the action and/or processes of at least one computer/s or computing system/s, or processor/s or similar electronic computing device/s or circuitry, that manipulate and/or transform data which may be represented as physical, such as electronic, quantities, e.g., within the computing system's registers and/or memories, and/or may be provided on-the-fly, into other data which may be similarly represented as physical quantities within the computing system's memories, registers or other such information storage, transmission or display devices, or may be provided to external factors, e.g., via a suitable data network.
- the term “computer” should be broadly construed to cover any kind of electronic device with data processing capabilities, including, by way of non-limiting example, personal computers, servers, embedded cores, computing system, communication devices, processors (e.g., digital signal processor (DSP), microcontrollers, field programmable gate array (FPGA), application specific integrated circuit (ASIC), etc.) and other electronic computing devices.
- Any reference to a computer, controller, or processor is intended to include one or more hardware devices e.g., chips, which may be co-located or remote from one another.
- Any controller or processor may, for example, comprise at least one CPU, DSP, FPGA or ASIC, suitably configured in accordance with the logic and functionalities described herein.
- processor/s or controller/s configured as per the described feature or logic or functionality, even if the processor/s or controller/s are not specifically illustrated for simplicity.
- the controller or processor may be implemented in hardware, e.g., using one or more Application-Specific Integrated Circuits (ASICs) or Field-Programmable Gate Arrays (FPGAs), or may comprise a microprocessor that runs suitable software, or a combination of hardware and software elements.
- any indication herein that an element or feature may exist is intended to include (a) embodiments in which the element or feature exists; (b) embodiments in which the element or feature does not exist; and (c) embodiments in which the element or feature exists selectably, e.g., a user may configure or select whether the element or feature does or does not exist.
- Any suitable input device such as but not limited to a sensor, may be used to generate or otherwise provide information received by the apparatus and methods shown and described herein.
- Any suitable output device or display may be used to display or output information generated by the apparatus and methods shown and described herein.
- Any suitable processor/s may be employed to compute or generate or route, or otherwise manipulate or process information as described herein and/or to perform functionalities described herein and/or to implement any engine, interface or other system illustrated or described herein.
- Any suitable computerized data storage e.g., computer memory, may be used to store information received by or generated by the systems shown and described herein.
- Functionalities shown and described herein may be divided between a server computer and a plurality of client computers. These or any other computerized components shown and described herein may communicate between themselves via a suitable computer network.
- the system shown and described herein may include user interface/s e.g. as described herein, which may, for example, include all or any subset of: an interactive voice response interface, automated response tool, speech-to-text transcription system, automated digital or electronic interface having interactive visual components, web portal, visual interface loaded as web page/s or screen/s from server/s via communication network/s to a web browser or other application downloaded onto a user's device, automated speech-to-text conversion tool, including a front-end interface portion thereof and back-end logic interacting therewith.
- the term user interface or “UI” as used herein also includes the underlying logic which controls the data presented to the user, e.g., by the system display, and which receives and processes and/or provides to other modules herein data entered by a user, e.g., using her or his workstation/device.
- arrows between modules may be implemented as APIs and any suitable technology may be used for interconnecting functional components or modules illustrated herein in a suitable sequence or order, e.g., via a suitable API/interface.
- state of the art tools may be employed, such as but not limited to Apache Thrift and Avro which provide remote call support.
- a standard communication protocol may be employed, such as but not limited to HTTP or MQTT, and may be combined with a standard data format, such as but not limited to JSON or XML.
- one of the modules may share a secure API with another. Communication between modules may comply with any customized protocol or customized query language or may comply with any conventional query language or protocol.
- Fig. 1 is a simplified flowchart illustration of a method for presenting, e.g., to a human user, simulations of with-contrast images of a patient’s body part generated from without-contrast images of the patient’s body part.
- In this method, and all others herein, all or any subset of the illustrated operations may be used, in any suitable order e.g., as shown.
- Fig. 2 is a simplified flowchart illustration of a method for preprocessing images in the dataset of Fig. 1 (or Fig. 3).
- Fig. 3 is a simplified flowchart illustration of a method for presenting, e.g., to a human user, simulations of dose D-contrast images of a patient’s body part generated from images of the body part captured after administration of a dose d < D to the patient.
- Methods and systems included in the scope of the present invention may include any subset or all of the functional blocks shown in the specifically illustrated implementations by way of example, in any suitable order, e.g., as shown.
- Flows e.g. those of Figs. 1, 2 and 3 may include all or any subset of the illustrated operations, suitably ordered e.g., as shown.
- Tables herein may include all or any subset of the fields and/or records and/or cells and/or rows and/or columns described.
- Computational, functional, or logical components described and illustrated herein can be implemented in various forms, for example, as hardware circuits, such as but not limited to custom VLSI circuits or gate arrays or programmable hardware devices such as but not limited to FPGAs, or as software program code stored on at least one tangible or intangible computer readable medium and executable by at least one processor, or any suitable combination thereof.
- a specific functional component may be formed by one particular sequence of software code, or by a plurality of such, which collectively act or behave as described herein with reference to the functional component in question.
- the component may be distributed over several code sequences such as but not limited to objects, procedures, functions, routines and programs, and may originate from several computer files which typically operate synergistically.
- Each functionality or method herein may be implemented in software (e.g., for execution on suitable processing hardware such as a microprocessor or digital signal processor), firmware, hardware (using any conventional hardware technology such as Integrated Circuit technology), or any combination thereof.
- modules or functionality described herein may comprise a suitably configured hardware component or circuitry.
- modules or functionality described herein may be performed by a general purpose computer, or more generally by a suitable microprocessor, configured in accordance with methods shown and described herein, or any suitable subset, in any suitable order, of the operations included in such methods, or in accordance with methods known in the art.
- Any logical functionality described herein may be implemented as a real time application, if and as appropriate, and which may employ any suitable architectural option such as but not limited to FPGA, ASIC, or DSP, or any suitable combination thereof.
- Any hardware component mentioned herein may in fact include either one or more hardware devices e.g., chips, which may be co-located or remote from one another.
- Any method described herein is intended to include, within the scope of the embodiments of the present invention, also any software or computer program performing all or any subset of the method’s operations, including a mobile application, platform, or operating system, e.g., as stored in a medium, as well as combining the computer program with a hardware device to perform all or any subset of the operations of the method.
- Data can be stored on one or more tangible or intangible computer readable media stored at one or more different locations, different network nodes or different storage devices at a single node or location.
- Suitable computer data storage or information retention apparatus may include apparatus which is primary, secondary, tertiary or off-line; which is of any type or level or amount or category of volatility, differentiation, mutability, accessibility, addressability, capacity, performance, and energy use; and which is based on any suitable technologies such as semiconductor, magnetic, optical, paper, and others.
- Contrast-enhanced mammography is considered useful for all or any subset of the following reasons:
- the system of the present invention may include a hardware processor which is configured to generate a synthetic image that simulates mammography with use of a contrast agent, and/or a platform that makes the synthetic image accessible to personnel, e.g., radiologists, etc., e.g., to allow a cancerous/benign score to be assigned to any lesions observed in the synthetic image; the radiologist may then make a decision as to whether a biopsy or additional tests are necessary, and/or this may be determined as a function of (at least) the cancerous/benign score.
- the system accepts a marking indicating a location of a putative lesion on the synthetic image and displays a score for that lesion, where the score represents a recommendation whether or not to send the lesion for biopsy, based on a trained model which may accept the synthetic images as an input.
- Certain embodiments seek to provide a system, computer program product or method for decision support e.g. for diagnosing cancer such as but not limited to breast cancer, the method comprising: generating simulations of contrast-enhanced mammography images from non-enhanced mammography images; and/or displaying the simulations of contrast-enhanced mammography images to at least one human expert, thereby to facilitate generation of breast cancer (say) diagnostic outputs by the human expert.
- contrast-enhanced images are typically generated computationally e.g., as described herein, rather than actually administering a contrast agent to a patient and subsequently capturing an image of a body part of interest.
- Such simulated images are also termed herein “artificial” or “synthetic”.
- the system may use a GAN to generate the simulations.
- Training the GAN may include training a generator to generate synthetic contrast-enhanced mammography images of respective breasts (or any other body portion) from real, no-contrast images of the same breasts; and/or training a discriminator to distinguish between real contrast-enhanced mammography images of breasts and synthetic images generated by the generator for the same breasts; and/or using outputs of the discriminator to refine the generator, e.g., as in conventional GAN.
- Fig. 1 is a simplified flowchart illustration of a method according to certain embodiments; all or any subset of the illustrated operations may be performed, in any suitable order e.g., as shown.
- the method typically includes offline operations and runtime operations.
- the offline or setup operations may include:
- Operation 10 Provide a dataset of pairs of mammography images, where each pair typically includes a first (aka without-iodine), e.g., Low Energy, mammography image of a breast captured without or before iodine injection, and a second (aka with-iodine), e.g., contrast-enhanced mammography (CEM), image of the same breast captured after, i.e., with, iodine injection.
- each image is in DICOM Image Format.
- Operation 20 Preprocess each image in the dataset, e.g., by performing all or any subset of the operations in Fig. 2, in any suitable order e.g., as shown.
- Operation 30 Using pairs from the dataset provided in operation 10 as a training set, train a first, “no iodine, expert-decision supporting” generator network (e.g., of type UNET, UNET++, Attention, Transformer, Diffusion) to convert a without-iodine Mammography image of a breast (aka input image) into a with-iodine Mammography image (aka output image, synthetic image) of the same breast.
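- A minimal, non-limiting sketch of a generator of the UNET family named in Operation 30 appears below; it assumes PyTorch, and its depth and channel counts are illustrative placeholders only (an encoder-decoder with a skip connection, mapping a single-channel without-iodine image to a single-channel synthetic with-iodine image):

    # Tiny UNET-style generator sketch (PyTorch assumed; depth and
    # channel counts are illustrative only).
    import torch
    import torch.nn as nn

    def block(c_in, c_out):
        return nn.Sequential(
            nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(c_out, c_out, 3, padding=1), nn.ReLU(inplace=True))

    class TinyUNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.enc1, self.enc2 = block(1, 32), block(32, 64)
            self.pool = nn.MaxPool2d(2)
            self.up = nn.ConvTranspose2d(64, 32, 2, stride=2)
            self.dec = block(64, 32)      # 64 = 32 skip + 32 upsampled channels
            self.out = nn.Conv2d(32, 1, 1)

        def forward(self, x):             # x: without-iodine image, NCHW
            e1 = self.enc1(x)
            e2 = self.enc2(self.pool(e1))
            d = self.dec(torch.cat([self.up(e2), e1], dim=1))  # skip connection
            return self.out(d)            # synthetic with-iodine image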
- the runtime operations in Fig. 1, which are also termed herein ACEM (artificial contrast-enhanced mammography) operations when applied by way of example to the field of imaging breasts, may include:
- Operation 40 Use the generator network trained in operation 30, to convert a without-iodine image of a breast (say) captured from a patient into a synthetic with-iodine image of the same breast.
- Operation 50 Use a mammography (say) image display system to display the synthetic with-iodine image e.g., to a human expert, e.g., to facilitate the human expert’s decision- making regarding the patient.
- decision-making regarding patients may be enhanced by machine-learning the human expert’s decisions.
- This may, for example, include training a second, “no iodine, no expert” generator network to convert a without-iodine image into a synthetic human expert’s decision.
- human expert decisions for each without-iodine image may be recorded, using a suitable user interface into which the human expert enters his decisions, and may be stored in memory in association with the without-iodine image.
- a training set may be generated accordingly, and a generator network may be trained on this training set.
- the second, “no iodine, no expert” generator network may be used to automatically generate the decision about the patient, without generating a synthetic, with-iodine image of the patient’s breast, and without resorting to a human expert.
- a tissue mask e.g. as created for a first input image I by the method of Fig. 2, may be used to filter out portions of the input image I that do not carry relevant information, using any suitable criterion to determine relevance.
- the mask is typically created in advance, e.g., upon receiving the image at the entrance of the generator, to save time in training, relative to creating the same mask in each iteration.
- the mask may be applied to the image at any suitable point in the course of the method.
- the input to the generator network in each iteration may either be the raw image as captured, or the masked image.
- use of the tissue region mask typically results in the network disregarding or ignoring irrelevant portions of the image when creating each artificial CEM.
- the lesion location mask e.g., as created for a first input image I by the method of Fig. 2, may be used, e.g., to reduce error and/or increase precision in the area of the lesion.
- the tissue region mask may concentrate creation of the synthetic image on the relevant part of the image, e.g., by causing the generator network to avoid investing computational resources in image portions where it is known in advance that there is no tissue.
- the lesion location mask may be used to direct maximum attention to the image portion which putatively includes the lesion itself, which is the main area of interest.
- an externally trained network, e.g., VGG19, may be used for feature extraction while training the generator, to punish or penalize the generator when it produces an image that is not similar to the CESM the generator is aiming to create.
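- For example, such a perceptual constraint may be sketched as follows, assuming PyTorch and torchvision; the layer cut-off, the L1 feature distance, and the omission of ImageNet input normalization are illustrative simplifications, not requirements:

    # Perceptual-loss sketch using frozen VGG19 features (torchvision assumed).
    import torch.nn as nn
    from torchvision.models import vgg19

    class PerceptualLoss(nn.Module):
        def __init__(self, layers=16):
            super().__init__()
            self.features = vgg19(weights="IMAGENET1K_V1").features[:layers].eval()
            for p in self.features.parameters():
                p.requires_grad = False   # VGG19 stays frozen
            self.l1 = nn.L1Loss()

        def forward(self, generated, target):
            # Single-channel mammograms replicated to the 3 channels VGG expects.
            g = generated.repeat(1, 3, 1, 1)
            t = target.repeat(1, 3, 1, 1)
            # Penalizes the generator when its output's features differ from
            # those of the CESM image it is aiming to create.
            return self.l1(self.features(g), self.features(t))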
- an alignment network, e.g., Optical Flow, may be used to refine or reduce misalignment between the first and second images in any given pair in the dataset, due to slight movement of the subject after the first image in the pair was captured and before and/or while the second image was being captured.
- the resulting synthetic mammography image (e.g., as generated by the system from a without-iodine image) is desired to be as identical as possible to the image actually physically captured under iodine injection.
- the synthetic image may, e.g., in each training iteration, be compared, pixel by pixel, to an image under real iodine injection.
- the alignment network may detect a shift each time the original image, captured with or without iodine, shifts a little, and, accordingly, fix the image by translating it until the shifted image matches the network's output image.
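- One possible alignment sketch follows, assuming OpenCV dense optical flow (Farneback) over 8-bit single-channel images; the function names and parameter values are illustrative assumptions, not part of this disclosure:

    # Alignment sketch (OpenCV assumed; parameter values illustrative).
    # Warps the `moving` image (e.g., the real CESM target) onto the
    # geometry of the `reference` image, so that pixel-by-pixel
    # comparison is not confounded by subject movement.
    import cv2
    import numpy as np

    def align(reference, moving):         # both: 8-bit single-channel arrays
        flow = cv2.calcOpticalFlowFarneback(reference, moving, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        h, w = reference.shape
        grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
        map_x = (grid_x + flow[..., 0]).astype(np.float32)
        map_y = (grid_y + flow[..., 1]).astype(np.float32)
        return cv2.remap(moving, map_x, map_y, cv2.INTER_LINEAR)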
- a trained discriminator network may be used to enhance the generator network, e.g., by applying pressure thereto, e.g., using any suitable conventional GAN (generative adversarial network) technique.
- Such a discriminator may be trained to receive triplets of images, each triplet including a first image F from the dataset, and 2 additional images including A1, the second image paired to the first image in the dataset, and A2, an output image generated by feeding the first image F to the generator network, and to determine which of the two additional images is A1 and which is A2, and/or to output a probability that (without loss of generality) the first of the two additional images is A1 and/or a probability that the first of the two additional images is A2.
- either the second image in the triplet is A1 and the third image is A2, or the second image in the triplet is A2 and the third image is A1; this may be determined randomly.
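- Assembling such randomly ordered triplets may be sketched as follows (illustrative Python; F, A1 and the generator are placeholders):

    # Sketch: assemble one randomly ordered triplet (F, ?, ?) for the
    # triplet discriminator; `F`, `A1` and `generator` are placeholders.
    import random

    def make_triplet(F, A1, generator):
        A2 = generator(F)                 # synthetic with-contrast image
        if random.random() < 0.5:
            return (F, A1, A2), 0         # label 0: second image is A1 (real)
        return (F, A2, A1), 1             # label 1: second image is A2 (fake)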
- training may employ any suitable Generator Network typically comprising an encoder, decoder, and intermediate skip connections.
- the network input may comprise Low-Energy mammography images and may generate an output image that is forced to resemble a contrast-enhanced spectral mammography (CESM) image. Images may be streamed to the network iteratively from the dataset.
- the constraint function’s type may be, say, L1, L2, KLD, SSIM, etc.
- External constraints which may be used include perceptual loss and/or use of a variational autoencoder.
- An alignment network may be used to reduce mismatch between the network’s target image, which corresponds to the input Low-Energy image, and a target which is slightly different (e.g., due to slight movement of the subject in the source image - CESM).
- an optimizer (of a suitable type e.g. ADAM, SGD, RMSPROP) may be employed to adjust the network's weights to achieve better results in the next iteration.
- Training may employ any suitable discriminator network, e.g., an encoder network that learns to distinguish between real contrast-enhanced mammography images and synthetic images generated by the generator network.
- the better the discriminator network distinguishes between the images, the more the generator network strives to generate even more realistic images.
- Patch GAN (a known type of discriminator for generative adversarial networks) may be used to highlight different regions in the image.
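- A minimal PatchGAN-style discriminator sketch follows, assuming PyTorch; the number of layers, channel counts, and the two-channel (input image plus real-or-synthetic CESM) conditioning are illustrative assumptions:

    # PatchGAN-style discriminator sketch: outputs a grid of real/fake
    # logits, one per receptive-field patch, rather than a single
    # scalar, thereby highlighting different regions in the image.
    import torch.nn as nn

    def patchgan(in_channels=2):          # e.g., low-energy + (real or fake) CESM
        layers, c = [], 64
        layers += [nn.Conv2d(in_channels, c, 4, stride=2, padding=1),
                   nn.LeakyReLU(0.2, inplace=True)]
        for _ in range(2):
            layers += [nn.Conv2d(c, c * 2, 4, stride=2, padding=1),
                       nn.BatchNorm2d(c * 2), nn.LeakyReLU(0.2, inplace=True)]
            c *= 2
        layers += [nn.Conv2d(c, 1, 4, padding=1)]   # per-patch logit map
        return nn.Sequential(*layers)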
- CESM patches may be cut and overlaid onto respective generated CESM breast images. Integration may for example occur at the original patch cutting location.
- Fig. 2 is a simplified flowchart illustration of a method configured to preprocess each image in the dataset, including both with-iodine images and without-iodine images, according to certain embodiments; all or any subset of the illustrated operations may be performed, in any suitable order, e.g., as shown.
- the method typically includes: a. Normalization, zero centering and standardization of images, e.g., as described here: https://www.imaios.com/en/resources/blog/ct-images-normalization-zero-centering-and-standardization
- b. Creation of a tissue region mask, e.g., a binary image: the tissue mask differentiates the breast (e.g., “1” in the mask) from irrelevant portions of the image (e.g., “0” in the mask), such as the room in which imaging occurred or parts of the patient’s body other than the breast, such as the patient’s arm. One possible derivation of such a mask is sketched after this list.
- Tissue region masks concentrate the synthetic image generation process on pertinent areas of interest (potentially, relevant anatomical structures) within the image rather than on, say, areas devoid of tissue, by guiding the generator network toward allocating lesser computational resources to regions devoid of tissue.
- c. Creation of a mask for the lesion location, e.g., a binary image: the lesion mask may be created, e.g., by having human experts such as radiologists examine the mammogram images in the dataset and mark lesions, if any, in the image, e.g., by outlining the lesion with a marker.
- the mask may then be a binary image, the size of the image in the dataset, in which the putative lesion as identified by the radiologist is marked with 1, and all other portions of the image are marked with 0.
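- By way of a hedged example, a tissue mask may be derived by simple thresholding, e.g., Otsu’s method as below (OpenCV assumed; any suitable segmentation method may be substituted):

    # Tissue-mask sketch (OpenCV assumed): Otsu thresholding is one
    # simple way to separate breast tissue ("1") from irrelevant
    # background ("0"); any suitable segmentation method may be used.
    import cv2
    import numpy as np

    def tissue_mask(image_uint8):
        _, mask = cv2.threshold(image_uint8, 0, 1,
                                cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        return mask.astype(np.uint8)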
- Spatially-Adaptive Normalization may be employed which, e.g. via integration of an affine layer learned from semantic segmentation maps, may facilitate dynamic modulation of activations in normalization layers.
- the spatially-adaptive, learned transformation leverages input layouts to tailor the generator’s output to anatomical feature/s, ensuring optimal synthesis fidelity.
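- A sketch of such a spatially-adaptive normalization layer, assuming PyTorch and following the published SPADE design (the hidden width and kernel sizes are illustrative):

    # SPADE layer sketch: the normalization itself is parameter-free;
    # the per-pixel scale (gamma) and shift (beta) are learned from the
    # semantic segmentation map, dynamically modulating the activations.
    import torch.nn as nn
    import torch.nn.functional as F

    class SPADE(nn.Module):
        def __init__(self, channels, label_channels, hidden=64):
            super().__init__()
            self.norm = nn.BatchNorm2d(channels, affine=False)
            self.shared = nn.Sequential(
                nn.Conv2d(label_channels, hidden, 3, padding=1), nn.ReLU())
            self.gamma = nn.Conv2d(hidden, channels, 3, padding=1)
            self.beta = nn.Conv2d(hidden, channels, 3, padding=1)

        def forward(self, x, segmap):
            segmap = F.interpolate(segmap, size=x.shape[2:], mode="nearest")
            h = self.shared(segmap)
            return self.norm(x) * (1 + self.gamma(h)) + self.beta(h)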
- the image values are adjusted so that the average of the entire image will be 0 and the standard deviation will be 1.
- the normalization for each lesion area will depend on the statistics that are unique to it, and not necessarily on the image as a whole.
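- A minimal NumPy sketch of this z-score normalization, with an optional per-lesion variant driven by the lesion mask (illustrative only):

    # Z-score normalization sketch: the whole image is adjusted to
    # mean 0 / standard deviation 1, and the lesion area is optionally
    # re-normalized using only its own statistics.
    import numpy as np

    def normalize(image, lesion_mask=None):
        image = image.astype(np.float64)
        out = (image - image.mean()) / (image.std() + 1e-8)
        if lesion_mask is not None:
            region = image[lesion_mask == 1]
            out[lesion_mask == 1] = (region - region.mean()) / (region.std() + 1e-8)
        return out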
- two masks are provided (a tissue mask and a lesion mask), as opposed to a single mask differentiating the lesion area from all other portions of the image, since it is advantageous for the tissue mask to indicate to the computer or network that certain portions of the image are entirely irrelevant and may be ignored when creating the artificial CESM.
- the lesion mask is used to indicate to the computer or network that output for the area marked lesion is to be more detailed than output created for the non-lesion area, however the non-lesion areas of the breast tissue (unlike the nontissue areas) should not be entirely ignored.
- any suitable resizing algorithms may be used to efficiently produce an accurate ACEM output image, such as but not limited to progressive growing or generation by parts.
- Size adjustment may include all or any subset of: maximum size usage; downsampling for resizing the image to a fixed size for all images (512×256, 2048×1024, etc.); padding with zeros to a fixed size; preserving high resolution by cropping the original image into patches; and using patches that contain only lesions.
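- For instance, the zero-padding option may be sketched as follows (NumPy; the fixed canvas size is an illustrative choice from the list above):

    # Zero-padding sketch: pads smaller images onto a fixed canvas and
    # crops larger ones; canvas size illustrative.
    import numpy as np

    def pad_to(image, height=2048, width=1024):
        canvas = np.zeros((height, width), dtype=image.dtype)
        h, w = min(height, image.shape[0]), min(width, image.shape[1])
        canvas[:h, :w] = image[:h, :w]
        return canvas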
- Any suitable Mammography Image Display System may be used to view mammography images, e.g. in DICOM Image Format such as, for example, a RadiForce mammography monitor e.g., as described here: https://www.eizo
- the discriminator is not used in runtime, and instead is used (as a constraint e.g.) to train the generator which, once trained, accepts images from patients and performs its function as per the method described herein in runtime.
- the discriminator can be used in real time as well, e.g., to ensure that the image that enters the generator is actually a mammogram; if an individual attempts to use the network on any other type of image, the discriminator may detect this.
- any suitable stopping criteria may be employed e.g., to determine when to terminate training. For example, human observers may be tasked with determining whether artificially generated with-iodine images do or do not seem real; if the former is the case, GAN training may be terminated.
- loss values may be examined (e.g., to determine whether iterations are producing only a small difference in loss values), as well as other training success metrics such as Inception Scores, Fréchet Inception Distances (FID scores), or perceptual similarity measures (LPIPS), or any other GAN evaluation measure known in the art.
- the above quantitative metrics may be used to determine whether to impose early stopping e.g., whether to stop training when FID scores worsen or perceptual similarity is not improving over iterations.
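- A hedged sketch of such metric-driven early stopping follows; the training and FID-evaluation callables are placeholders, and FID itself may be computed by any suitable library:

    # Early-stopping sketch: training halts once the FID score has not
    # improved for `patience` consecutive evaluations.
    def train_with_early_stopping(train_one_epoch, evaluate_fid, patience=5):
        best, stale = float("inf"), 0
        while stale < patience:
            train_one_epoch()
            fid = evaluate_fid()          # lower FID is better
            if fid < best:
                best, stale = fid, 0
            else:
                stale += 1                # worsening / not improving
        return best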
- Expert decisions which may be supported using any embodiment herein may include all or any subset or any combination of the following:
- cancerous/benign; type of cancer, e.g., invasive ductal carcinoma vs. invasive lobular carcinoma; stage of cancer, e.g., breast cancer stages 0, I, II, III, and IV.
- Is monitoring required at all: yes/no. Is a biopsy, e.g., FNAC and/or core biopsy, required: yes/no.
- frequency of monitoring; type of monitoring, e.g., physical examination, mammography and/or ultrasound; parameterization of monitoring, e.g., re-doing mammography from different angle/s; imagery after physically administering a contrast agent, e.g., if there is any doubt regarding the validity of the synthetic imagery generated according to embodiments of the invention.
- Which treatment option, e.g., none, surgery, chemotherapy, hormonal therapy, biological therapy, targeted therapy, radiation therapy, or a combination thereof.
- Characterization of a selected treatment option, e.g., if surgery is the selected treatment option: complete mastectomy vs. partial mastectomy, segmental mastectomy, lumpectomy or other breast-sparing surgery; and/or choice of chemotherapeutic agent/s and/or number of treatments and/or length of chemotherapy cycle, e.g., once a week vs. once every three weeks.
- any type of monitoring and/or treatment may be performed on the patient, depending on the expert decision (whether obtained from a human or whether simulated using the second generative network described herein). It is appreciated that the system herein may be used in conjunction with AI-based commercial platforms which provide decision support to radiologists tasked with interpreting mammography images, such as DENISE, developed in conjunction with Sheba Hospital, Tel HaShomer, Israel.
- the system herein may be employed to generate an artificial CESM or simulated with-contrast image as input to DENISE, which can then provide decision support to radiologists or other medical professionals.
- the Sklair-Levy patent describes a system and method for diagnosing breast cancer by acquiring a contrast enhanced region of interest (CE-ROI) which may be comprised in an X-ray image of a patient's breast, the X-ray image comprising X-ray pixels that indicate intensity of X-rays that passed through the breast to generate the image; generating a texture feature vector (TF) having components based on the indications of intensity provided by a plurality of X-ray pixels in the CE-ROI; and using a classifier to classify the texture feature vector TF to determine whether the CE-ROI is malignant.
- the contrast enhanced region of interest may be generated using any embodiment herein, rather than by actually administering a contrast agent to the patient and subsequently imaging the region of interest.
- any suitable GAN (Generative Adversarial Network) variant may be employed, such as but not limited to Conditional GAN (CGAN), Deep Convolutional GAN (DCGAN), Wasserstein GANs (WGANs), or WGAN-LPs, e.g., as described here: https://wasachenev.github.io/publication/2018-12-01-lpwgan
- instead of GANs, any suitable alternative may be used, such as but not limited to Convolutional Neural Networks (CNNs), transformers, and diffusion models.
- the system herein is applicable both to 2D mammography and to 3D mammography. During 2D (aka conventional digital) mammography, each breast is typically imaged twice, once from the side and once from above (without, and perhaps also with, iodine).
- During 3D mammography, aka digital breast tomosynthesis, more than 2 images are captured (without, and perhaps also with, iodine); the images are then typically combined to yield a 3D image or model of the breast.
- pairs of images captured of a given breast from a given angle, with and without iodine, may be used to train the system herein; once the system is trained, the system may derive a simulated with-contrast image from a without-contrast image provided to it.
- sets of n image pairs may be used to train a GAN to generate a 3D with- contrast image or model of a breast from n such image pairs.
- pairs of 3D breast models, with and without contrast may be used to train a GAN to generate a 3D with-contrast image or model of a breast from a 3D without-contrast image or model of a breast.
- the system herein is applicable not only to mammography but also to scans of body parts other than the breast, which may be scanned using any suitable technology, e.g., low energy scans or CESM or tomography (e.g., 2D or 3D tomography).
- matching low-energy CT (LECT) images may be used to train a system to receive a low-energy CT (LECT) image without contrast, and to generate therefrom a with-contrast low-energy CT (LECT) image.
- references to mammography images in the description of Figs. 1 and 2 are merely by way of example; alternatively, ultrasound images, or tomography images, e.g., CT images (including low energy CT), or MRI images may be employed.
- Fig. 2 is but one possible method for preprocessing images with and/or without contrast agent in the dataset; any other suitable method may alternatively be used.
- references to iodine herein are also merely exemplary; alternatively, any suitable contrast material other than iodine may be employed.
- references to “with iodine” images herein may more generally refer to any contrast-enhanced image.
- the system may be configured to reduce use of a contrast agent e.g., iodine, when capturing medical imagery.
- a system may receive an image generated using a first (“low”) dose of iodine (or any other contrast agent), and may generate a synthetic image which simulates the image that would result given a second (“high”) dose of iodine, which is higher than the first dose.
- the high dose may be that conventionally used, e.g., 1.5 ml per kilogram of body weight, and the low dose may be some percentage of that, e.g., less than 5%, 10%, 25%, 33%, 50%, or 75%, or more.
- the system may be trained using sets of images each obtained from a given patient, including one image of a body part, e.g., the patient’s breast, imaged after a low dose of iodine, and another image of the same patient’s same body part after a high dose of iodine.
- the system may simulate the high dose image using a without-iodine image as input for some patients, and may simulate the high dose image using a low-dose iodine image as input for other patients, e.g., patients for whom there is an a priori reason (e.g., a certain range of breast denseness) to be concerned that a synthetic high-dose image generated based only on a without-iodine image will be insufficiently accurate.
- preprocessing may include all or any subset of the following Preliminary Image Processing operations, e.g., if images have disparate sources (different machines, different technicians) and/or have undergone preliminary processing (which may be preserved as source images after processing) and/or if images have disparate sizes: a. Histogram Equalization after normalization based on internal DICOM tags; b. Cutting or partitioning images into patches: Low Energy and/or CESM images may be cut into patches, which may be square. Patches may all be the same standardized size, e.g., say, 800x800 pixels. Each patch may be centered around a single lesion within the image.
- c. Creating mask/s for tissue region/s and/or for lesion location/s; the masks may be saved, e.g., as uint8, uint16, float.
- the input image (e.g. of Fig. 1 or Fig. 3) may comprise Low Energy patches multiplied by a mask, to cancel everything outside the lesion area.
- “Multiplication” may refer to elementwise multiplication of two equally-sized matrices, which comprise the multi-level pixel array of the input image and the binary pixel array of the mask, respectively.
- the output image (e.g. of Fig. 1 or Fig. 3) may comprise CESM patches multiplied by the mask, canceling everything outside the lesion area.
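- In NumPy terms, this masking may be sketched as (illustrative only):

    # Elementwise masking sketch.
    import numpy as np

    def apply_mask(patch, mask):
        # patch: multi-level pixel array; mask: binary array of same shape.
        return patch * mask               # cancels everything where mask == 0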
- the input image may comprise, say, a without-iodine Mammography image of a breast and the output image may comprise, say, a with-iodine Mammography image of the same breast.
- the input image may comprise, say, a low-iodine-dose image of a breast and the output image may comprise, say, a high-iodine-dose image of the same breast.
- Contrast administration may require the patient to fast for a certain period of time.
- patients may be required to drink 6-8 glasses of water within 24 hours to flush the contrast agent out of the body.
- CEMs expose patients to slightly more radiation than mammograms.
- some patients have allergic reactions to intravenous contrast agents. Some of these reactions are severe, e.g., difficulty in breathing.
- intravenous contrast agents sometimes affect patients’ kidneys.
- contrast is typically not used for patients with a history of serious allergic reactions to iodine, or for kidney patients.
- Contrast-enhanced mammography (CEM) is also considered unsafe for women who are either pregnant or breastfeeding.
- Embodiments herein, by simulating contrast administration, provide images with greater clarity than no-contrast images, yet without the above drawbacks.
- Each module or component or processor may be centralized in a single physical location or physical device, or distributed over several physical locations or physical devices.
- Included in the scope of the present invention, inter alia, are electromagnetic signals in accordance with the description herein; these may carry computer-readable instructions for performing any or all of the operations of any of the methods shown and described herein, in any suitable order, including simultaneous performance of suitable groups of operations, as appropriate. Also included in the scope of the present disclosure, inter alia, are machine-readable instructions for performing any or all of the operations of any of the methods shown and described herein, in any suitable order; program storage devices readable by machine, tangibly embodying a program of instructions executable by the machine to perform any or all of the operations of any of the methods shown and described herein, in any suitable order;
- a computer program product comprising a computer useable medium having computer readable program code, such as executable code, embodied therein, and/or including computer readable program code for performing, any or all of the operations of any of the methods shown and described herein, in any suitable order; any technical effects brought about by any or all of the operations of any of the methods shown and described herein, when performed in any suitable order; any suitable apparatus or device or combination of such, programmed to perform, alone or in combination, any or all of the operations of any of the methods shown and described herein, in any suitable order; electronic devices each including at least one processor and/or cooperating input device and/or output device and operative to perform, e.g., in software, any operations shown and described herein; information storage devices or physical records, such as disks or hard drives, causing at least one computer or other device to be configured so as to carry out any or all of the operations of any of the methods shown and described herein, in any suitable order.
- Any computations or other forms of analysis described herein may be performed by a suitable computerized method. Any operation or functionality described herein may be wholly or partially computer-implemented e.g., by one or more processors.
- the invention shown and described herein may include (a) using a computerized method to identify a solution to any of the problems or for any of the objectives described herein, the solution optionally including at least one of a decision, an action, a product, a service, or any other information described herein, that impacts, in a positive manner, a problem or objectives described herein; and (b) outputting the solution.
- the system may, if desired, be implemented as a network-based, e.g., web-based, system employing software, computers, routers, and telecommunications equipment, as appropriate.
- a server may store certain applications, for download to clients, which are executed at the client side, the server side serving only as a storehouse.
- Any or all functionalities e.g., software functionalities shown and described herein, may be deployed in a cloud environment.
- Clients e.g., mobile communication devices such as smartphones, may be operatively associated with, but external to the cloud.
- the scope of the present invention is not limited to structures and functions specifically described herein, and is also intended to include devices which have the capacity to yield a structure, or perform a function, described herein, such that even though users of the device may not use the capacity, they are, if they so desire, able to modify the device to obtain the structure or function.
- any “if-then” logic described herein is intended to include embodiments in which a processor is programmed to repeatedly determine whether condition x, which is sometimes true and sometimes false, is currently true or false, and to perform y each time x is determined to be true, thereby to yield a processor which performs y at least once, typically on an “if and only if” basis, e.g., triggered only by determinations that x is true, and never by determinations that x is false.
- any determination of a state or condition described herein, and/or other data generated herein, may be harnessed for any suitable technical effect.
- the determination may be transmitted or fed to any suitable hardware, firmware, or software module, which is known or which is described herein to have capabilities to perform a technical operation responsive to the state or condition.
- the technical operation may, for example, comprise changing the state or condition, or may more generally cause any outcome which is technically advantageous, given the state or condition or data, and/or may prevent at least one outcome which is disadvantageous, given the state or condition or data.
- an alert may be provided to an appropriate human operator, or to an appropriate external system.
- a system embodiment is intended to include a corresponding process embodiment, and vice versa.
- each system embodiment is intended to include a server-centered “view” or client-centered “view”, or “view” from any other node of the system, of the entire functionality of the system, computer-readable medium, or apparatus, including only those functionalities performed at that server or client or node.
- Features may also be combined with features known in the art, and particularly, although not limited to, those described in the Background section or in publications mentioned therein.
- features of the invention, including operations which are described for brevity in the context of a single embodiment or in a certain order, may be provided separately or in any suitable sub-combination, including with features known in the art (particularly, although not limited to, those described in the Background section or in publications mentioned therein) or in a different order. “e.g.” is used herein in the sense of a specific example which is not intended to be limiting.
- Each method may comprise all or any subset of the operations illustrated or described, suitably ordered, e.g., as illustrated or described herein.
- Devices, apparatus, or systems shown coupled in any of the drawings may in fact be integrated into a single platform in certain embodiments, or may be coupled via any appropriate wired or wireless coupling, such as but not limited to optical fiber, Ethernet, Wireless LAN, HomePNA, power line communication, cell phone, Smart Phone (e.g., iPhone), Tablet, Laptop, PDA, Blackberry GPRS, Satellite including GPS, or other mobile delivery.
- functionalities described or illustrated as systems and sub-units thereof can also be provided as methods and operations therewithin
- functionalities described or illustrated as methods and operations therewithin can also be provided as systems and sub-units thereof.
- the scale used to illustrate various elements in the drawings is merely exemplary and/or appropriate for clarity of presentation, and is not intended to be limiting.
- Any suitable communication may be employed between separate units herein e.g., wired data communication and/or in short-range radio communication with sensors such as cameras, e.g., via WiFi, Bluetooth, or Zigbee. It is appreciated that implementation via a cellular app as described herein is but an example, and, instead, embodiments of the present invention may be implemented, say, as a smartphone SDK, as a hardware component, as an STK application, or as suitable combinations of any of the above.
- Any processing functionality illustrated (or described herein) may be executed by any device having a processor, such as but not limited to a mobile telephone, set-top-box, TV, remote desktop computer, game console, tablet, mobile, e.g., laptop or other computer terminal, embedded remote unit, which may either be networked itself (may itself be a node in a conventional communication network e.g.) or may be conventionally tethered to a networked device (to a device which is a node in a conventional communication network, or is tethered directly or indirectly/ultimately to such a node).
- processor or controller or module or logic as used herein are intended to include hardware, such as computer microprocessors or hardware processors, which typically have digital memory and processing capacity, such as those available from, say, Intel and Advanced Micro Devices (AMD). Any operation or functionality or computation or logic described herein may be implemented entirely or in any part, on any suitable circuitry, including any such computer microprocessor/s, as well as in firmware or in hardware, or any combination thereof.
- any modules, blocks, operations, or functionalities described herein which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable sub-combination, including with features known in the art.
- Each element, e.g., operation described herein may have all characteristics and attributes described or illustrated herein, or, according to other embodiments, may have any subset of the characteristics or attributes described herein.
- references herein to “said (or the) element x” having certain (e.g., functional or relational) limitations/characteristics are not intended to imply that a single instance of element x is necessarily characterized by all the limitations/characteristics. Instead, “said (or the) element x” having certain (e.g. functional or relational) limitations/characteristics is intended to include both (a) an embodiment in which a single instance of element x is characterized by all of the limitations/characteristics and (b) embodiments in which plural instances of element x are provided, and each of the limitations/characteristics is satisfied by at least one instance of element x, but no single instance of element x satisfies all limitations/characteristics.
- each time L limitations/characteristics are ascribed to “said” or “the” element x in the specification or claims, e.g., to “said processor” or “the processor”, the plural instances of element x need not be identical.
- e.g., if element x is a hardware processor, there may be different instances of x, each programmed for different functions and/or having different hardware configurations (e.g., there may be 3 instances of x: two Intel processors of different models, and one AMD processor).
Abstract
A method for providing decision support to human experts examining medical images, the method comprising using a hardware processor to generate simulations of with-contrast images and/or using an image display system to display the simulations of with-contrast images.
Description
Mammography Image Display System, Method and Computer Program Product Configured to Display a Synthetic Image which Simulates a With-Contrast Image
FIELD OF THIS DISCLOSURE
The present invention relates generally to medical imagery, and more particularly to use of contrast agents in medical imagery.
BACKGROUND
US 10499866 to Miriam Sklair-Levy, Arnaldo Mayer and Shmuel Yitzhak Pfeffer describes mammography apparatus. https://patents.google.com/patent/US10499866B2/en
Contrast-enhanced mammography (CEM), aka contrast-enhanced spectral mammography (CESM), aka contrast-enhanced digital mammography (CEDM), administers contrast, e.g., iodinated intravenous (IV) contrast, before performing a mammogram. CEM is considered a possible test for breast lesion assessment, inter alia, when mammography and ultrasound are not conclusive or MRI is contraindicated or not available, e.g., as described here: https://www.sciencedirect.com/science/article/pii/S0960977620301326
Contrast CT, aka contrast-enhanced computed tomography, is known and involves X-ray computed tomography using radiocontrast, which is generally iodine-based, to highlight anatomy, e.g., blood vessels, that, absent a contrast agent, is hard to distinguish from its surroundings.
A generative adversarial network (GAN) typically comprises two neural networks: a generator, which learns to generate plausible “fake” or simulated or synthetic data, and a discriminator, e.g., classifier. The discriminator may have any network architecture, typically depending on the type of data which is being classified.
The instances provided by the generator serve as negative training examples for the discriminator, which learns to distinguish the fake data, provided by the generator, from real data. The discriminator penalizes the generator when the generator produces results which are implausible. Thus, typically, generator output is directly connected to discriminator input. Backpropagation involves a signal, provided by the discriminator’s classification, to the generator, which, responsively, updates its weights.
Typically, the discriminator connects to two loss functions (generator loss and discriminator loss). During its own training, the discriminator ignores generator loss and uses discriminator loss. Typically, during discriminator training, the discriminator classifies real data and fake data, and the discriminator loss penalizes or punishes the discriminator for classifying real data as fake, or vice versa. The discriminator updates its weights through backpropagation from the discriminator loss via the discriminator network.
During the generator’s training, the generator loss is used.
Typically, GAN training proceeds in alternating periods; thus, the discriminator may train for one or more epochs, followed by epoch/s of generator training, and so forth, until the generator and discriminator networks are trained.
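By way of non-limiting illustration only, the alternating scheme described above might be sketched as follows in PyTorch; the generator/discriminator modules, the `latent_dim` attribute, the data loader, and all hyperparameters are illustrative assumptions rather than part of this disclosure:

```python
import torch
import torch.nn as nn

def train_gan(generator, discriminator, loader, epochs=10, lr=2e-4, device="cpu"):
    """Alternating GAN training sketch: one discriminator step, then one
    generator step, per batch; assumes the discriminator outputs one logit
    per image and the loader yields batches of real images."""
    bce = nn.BCEWithLogitsLoss()
    g_opt = torch.optim.Adam(generator.parameters(), lr=lr)
    d_opt = torch.optim.Adam(discriminator.parameters(), lr=lr)
    for _ in range(epochs):
        for real in loader:                       # real: batch of real images
            real = real.to(device)
            n = real.size(0)
            noise = torch.randn(n, generator.latent_dim, device=device)
            fake = generator(noise)

            # Discriminator step: uses the discriminator loss only; generator
            # output is detached so only discriminator weights update.
            d_opt.zero_grad()
            d_loss = (bce(discriminator(real), torch.ones(n, 1, device=device))
                      + bce(discriminator(fake.detach()),
                            torch.zeros(n, 1, device=device)))
            d_loss.backward()                     # penalizes real/fake confusion
            d_opt.step()

            # Generator step: uses the generator loss only; the discriminator's
            # classification signal backpropagates into the generator's weights.
            g_opt.zero_grad()
            g_loss = bce(discriminator(fake), torch.ones(n, 1, device=device))
            g_loss.backward()
            g_opt.step()
```

Each discriminator step trains on real data and on detached fake data, and each generator step trains against the discriminator's current judgment, matching the alternating-periods description above.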
Various metrics for use in GAN evaluation are known, e.g., as described here: https://arxiv.org/pdf/1802.03446.pdf
Perceptual loss functions are described here: https://deepai.org/machine-learning-glossary-and-terms/perceptual-loss-function
Histogram equalization is a known image processing method used to improve image contrast.
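A minimal sketch of histogram equalization, assuming scikit-image as the tooling (an illustrative choice only, not mandated by this disclosure):

```python
import numpy as np
from skimage import exposure

def equalize(img: np.ndarray) -> np.ndarray:
    """Spread the image's intensity histogram to improve global contrast.

    img is any grayscale array; the result is rescaled to [0, 1].
    """
    return exposure.equalize_hist(img)
```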
Spatially-Adaptive Normalization (SPADE) is described inter alia at the following online reference: nvlabs.github.io/SPADE/.
Semantic segmentation is known and is described online e.g. here: superannotate.com/blog/guide-to-semantic-segmentation.
The disclosures of all publications and patent documents mentioned in the specification, and of the publications and patent documents cited therein directly or indirectly, are hereby incorporated by reference, other than subject matter disclaimers or disavowals. If the incorporated material is inconsistent with the express disclosure herein, the interpretation is that the express disclosure herein describes certain embodiments,
whereas the incorporated material describes other embodiments. Definition/s within the incorporated material may be regarded as one possible definition for the term/s in question.
SUMMARY OF CERTAIN EMBODIMENTS
Certain embodiments seek to provide a mammography image display system which is configured to display a synthetic mammogram of a breast as though the image had been captured after administration, e.g., injection, of a dosage D of a contrast agent A. The synthetic mammogram may be generated by a system trained, e.g., as per herein, from a mammogram captured without administering a contrast agent, or captured after administering a different contrast agent (other than A), or captured after administering a dosage d < D of contrast agent A.
Certain embodiments seek to digitally clean a mammography image.
Certain embodiments seek to use image processing to facilitate decision-making by humans which conventionally relies at least partly on examining contrast-enhanced medical imagery. The terms “contrast-enhanced” and “with-iodine” and “with-contrast” may be interchanged herein.
Embodiments herein are described in terms of mammography of a breast by way of example; more generally, any reference herein to a breast may be replaced by a reference to any other body portion and any reference to mammography may be replaced by a reference to any other imaging technology which employs a contrast agent.
Certain embodiments of the present invention seek to provide circuitry typically comprising at least one processor in communication with at least one memory, with instructions stored in such memory executed by the processor to provide functionalities which are described herein in detail. Any functionality described herein may be firmware-implemented or processor-implemented, as appropriate.
It is appreciated that any reference herein to, or recitation of, an operation being performed is, e.g., if the operation is performed at least partly in software, intended to include both an embodiment where the operation is performed in its entirety by a server A, and also to include any type of “outsourcing” or “cloud” embodiments in which the operation, or portions thereof, is or are performed by a remote processor P (or several such), which may be deployed off-shore or “on a cloud”, and an output of the operation is
then communicated to, e.g., over a suitable computer network, and used by, server A. Analogously, the remote processor P may not, itself, perform all of the operations, and, instead, the remote processor P itself may receive output/s of portion/s of the operation from yet another processor/s P', which may be deployed off-shore relative to P, or “on a cloud”, and so forth.
The present invention typically includes at least the following embodiments:
Embodiment 1. A method for providing decision support e.g. to human experts examining medical images, the method comprising: using a hardware processor to generate simulations of with-contrast images; and/or using an image display system to display the simulations of with-contrast images.
Embodiment 2. The method of any of the preceding embodiments wherein the images comprise mammography images and the image display system comprises a mammography image display system or mammography monitor which may use DICOM Image Format.
Embodiment 3. The method of any of the preceding embodiments wherein the mammography images comprise 2D mammography images.
Embodiment 4. The method of any of the preceding embodiments wherein the mammography images comprise 3D mammography images.
Embodiment 5. The method of any of the preceding embodiments wherein the with-contrast images are generated after first administering iodine to a subject whose body is being imaged.
Embodiment 6. The method of any of the preceding embodiments and also comprising administering the iodine to the subject whose body is being imaged.
Embodiment 7. The method according to any of the preceding embodiments wherein a generative neural network is used to generate the simulations.
Embodiment 8. The method according to any of the preceding embodiments wherein the generative neural network is trained on image pairs including a first image of a given body portion which is not enhanced with any contrast agent, and a second image of the given body portion which is with-contrast, wherein the with-contrast image is captured by capturing an image of the body portion after a contrast agent, e.g., dye, has been administered to the patient.
Embodiment 9. The method of any of the preceding embodiments wherein the with-contrast image is captured by performing mammography on a patient to whom contrast dye has been administered.
It is appreciated that such pairs of images are in practice available, e.g., because contrast enhancement is used after a previous non-enhanced image (captured without first administering any contrast agent) of the same patient is deemed inconclusive, or because, a priori, a medical decision is made to generate an enhanced (with-contrast) and non-enhanced (without-contrast) image of the same organ in the same patient.
Embodiment 10. A decision support system serving human experts examining medical images, the system comprising: a hardware processor generating simulations of with-contrast images; and a computer display receiving the simulations of with-contrast images and displaying the simulations to at least one human expert, thereby to provide decision support to the human expert.
Embodiment 11. The method of any of the preceding embodiments wherein the images comprise ultrasound images.
Embodiment 12. The method of any of the preceding embodiments wherein the images comprise CT images.
Embodiment 13. The method of any of the preceding embodiments wherein the images comprise MRI images.
Embodiment 14. The method of any of the preceding embodiments wherein the simulations of with-contrast images are generated computationally rather than by actually administering a contrast agent to a patient, and subsequently capturing an image of a body part of interest.
Embodiment 15. The method of any of the preceding embodiments wherein the simulations of the with-contrast images are generated from without-contrast images.
Embodiment 16. The method of any of the preceding embodiments wherein the simulations of the with-contrast images simulate images captured after administration of a first dosage D of a contrast agent, and wherein the simulations of the with-contrast images are generated from and/or trained on images captured after administration of a second dosage d < D of the contrast agent.
It is appreciated that the simulations of the with-contrast images may simulate images captured after administration of a first contrast agent and may be generated from and/or trained on images captured after administration of a second (different) contrast agent.
Embodiment 17. A system comprising at least one hardware processor configured to carry out the operations of any of the methods shown and described herein.
Embodiment 18. A computer program product, comprising a non-transitory tangible computer readable medium having computer readable program code embodied therein, the computer readable program code adapted to be executed to implement a method for providing decision support to human experts examining medical images, the method comprising: using a hardware processor to generate simulations of with-contrast images; and using an image display system to display the simulations of with-contrast images.
Embodiment 19. The method of any of the preceding embodiments wherein the images comprise tomography images.
Also provided, excluding signals, is a computer program comprising computer program code means for performing any of the methods shown and described herein when the program is run on at least one computer; and a computer program product, comprising a typically non-transitory computer-usable or -readable medium, e.g., non-transitory computer-usable or -readable storage medium, typically tangible, having a computer readable program code embodied therein, the computer readable program code adapted to be executed to implement any or all of the methods shown and described herein. The operations in accordance with the teachings herein may be performed by at least one computer specially constructed for the desired purposes, or a general purpose computer specially configured for the desired purpose by at least one computer program stored in a typically non-transitory computer readable storage medium. The term "non-transitory" is used herein to exclude transitory, propagating signals or waves, but to otherwise include any volatile or non-volatile computer memory technology suitable to the application.
Any suitable processor/s, display and input means may be used to process, display, e.g., on a computer screen or other computer output device, store, and accept information such as information used by or generated by any of the methods and apparatus shown and
described herein; the above processor/s, display and input means including computer programs, in accordance with all or any subset of the embodiments of the present invention. Any or all functionalities of the invention shown and described herein, such as but not limited to operations within flowcharts, may be performed by any one or more of: at least one conventional personal computer processor, workstation or other programmable device or computer or electronic computing device or processor, either general-purpose or specifically constructed, used for processing; a computer display screen and/or printer and/or speaker for displaying; machine-readable memory such as flash drives, optical disks, CDROMs, DVDs, BluRays, magnetic-optical discs or other discs; RAMs, ROMs, EPROMs, EEPROMs, magnetic or optical or other cards, for storing; and keyboard or mouse for accepting. Modules illustrated and described herein may include any one or combination or plurality of: a server, a data processor, a memory/computer storage, a communication interface (wireless (e.g., BLE) or wired (e.g., USB)), a computer program stored in memory/computer storage.
The term "process" as used above is intended to include any type of computation or manipulation or transformation of data represented as physical, e.g. electronic, phenomena which may occur or reside, e.g., within registers and /or memories of at least one computer or processor. Use of nouns in singular form is not intended to be limiting; thus, the term processor is intended to include a plurality of processing units which may be distributed or remote, the term server is intended to include plural typically interconnected modules running on plural respective servers, and so forth.
The above devices may communicate via any conventional wired or wireless digital communication means, e.g., via a wired or cellular telephone network or a computer network such as the Internet.
The apparatus of the present invention may include, according to certain embodiments of the invention, machine readable memory containing or otherwise storing a program of instructions which, when executed by the machine, implements all or any subset of the apparatus, methods, features, and functionalities of the invention shown and described herein. Alternatively, or in addition, the apparatus of the present invention may include, according to certain embodiments of the invention, a program as above which may be written in any conventional programming language, and optionally a machine for
executing the program, such as but not limited to a general-purpose computer which may optionally be configured or activated in accordance with the teachings of the present invention. Any of the teachings incorporated herein may, wherever suitable, operate on signals representative of physical objects or substances.
The embodiments referred to above, and other embodiments, are described in detail in the next section.
Any trademark occurring in the text or drawings is the property of its owner and occurs herein merely to explain or illustrate one example of how an embodiment of the invention may be implemented.
Unless stated otherwise, terms such as, "processing", "computing", "estimating", "selecting", "ranking", "grading", "calculating", "determining", "generating", "reassessing", "classifying", "producing", "stereo-matching", "registering", "detecting", "associating", "superimposing", "obtaining", "providing", "accessing", "setting" or the like, refer to the action and/or processes of at least one computer/s or computing system/s, or processor/s or similar electronic computing device/s or circuitry, that manipulate and/or transform data which may be represented as physical, such as electronic, quantities, e.g., within the computing system's registers and/or memories, and/or may be provided on-the-fly, into other data which may be similarly represented as physical quantities within the computing system's memories, registers or other such information storage, transmission or display devices or may be provided to external factors, e.g., via a suitable data network. The term “computer” should be broadly construed to cover any kind of electronic device with data processing capabilities, including, by way of non-limiting example, personal computers, servers, embedded cores, computing system, communication devices, processors (e.g., digital signal processor (DSP), microcontrollers, field programmable gate array (FPGA), application specific integrated circuit (ASIC), etc.) and other electronic computing devices. Any reference to a computer, controller, or processor, is intended to include one or more hardware devices e.g., chips, which may be co-located or remote from one another. Any controller or processor may, for example, comprise at least one CPU, DSP, FPGA or ASIC, suitably configured in accordance with the logic and functionalities described herein.
Any feature or logic or functionality described herein may be implemented by processor/s or controller/s configured as per the described feature or logic or functionality, even if the processor/s or controller/s are not specifically illustrated for simplicity. The controller or processor may be implemented in hardware, e.g., using one or more Application-Specific Integrated Circuits (ASICs) or Field-Programmable Gate Arrays (FPGAs), or may comprise a microprocessor that runs suitable software, or a combination of hardware and software elements.
The present invention may be described, merely for clarity, in terms of terminology specific to, or references to, particular programming languages, operating systems, browsers, system versions, individual products, protocols and the like. It will be appreciated that this terminology or such reference/s is intended to convey general principles of operation clearly and briefly, by way of example, and is not intended to limit the scope of the invention solely to a particular programming language, operating system, browser, system version, or individual product or protocol. Nonetheless, the disclosure of the standard or other professional literature defining the programming language, operating system, browser, system version, or individual product or protocol in question, is incorporated by reference herein in its entirety.
Elements separately listed herein need not be distinct components, and alternatively may be the same structure. A statement that an element or feature may exist is intended to include (a) embodiments in which the element or feature exists; (b) embodiments in which the element or feature does not exist; and (c) embodiments in which the element or feature exists selectably, e.g., a user may configure or select whether the element or feature does or does not exist.
Any suitable input device, such as but not limited to a sensor, may be used to generate or otherwise provide information received by the apparatus and methods shown and described herein. Any suitable output device or display may be used to display or output information generated by the apparatus and methods shown and described herein. Any suitable processor/s may be employed to compute or generate or route, or otherwise manipulate or process information as described herein and/or to perform functionalities described herein and/or to implement any engine, interface or other system illustrated or described herein. Any suitable computerized data storage e.g., computer memory, may be
used to store information received by or generated by the systems shown and described herein. Functionalities shown and described herein may be divided between a server computer and a plurality of client computers. These or any other computerized components shown and described herein may communicate between themselves via a suitable computer network.
The system shown and described herein may include user interface/s e.g. as described herein, which may, for example, include all or any subset of: an interactive voice response interface, automated response tool, speech-to-text transcription system, automated digital or electronic interface having interactive visual components, web portal, visual interface loaded as web page/s or screen/s from server/s via communication network/s to a web browser or other application downloaded onto a user's device, automated speech-to-text conversion tool, including a front-end interface portion thereof and back-end logic interacting therewith. Thus, the term user interface or “UI” as used herein includes also the underlying logic which controls the data presented to the user e.g., by the system display and receives and processes and/or provides to other modules herein, data entered by a user e.g., using her or his workstation/device.
BRIEF DESCRIPTION OF THE DRAWINGS
Certain embodiments of the present invention are illustrated in the following drawings; in the block diagrams, arrows between modules may be implemented as APIs, and any suitable technology may be used for interconnecting functional components or modules illustrated herein in a suitable sequence or order, e.g., via a suitable API/interface. For example, state-of-the-art tools may be employed, such as but not limited to Apache Thrift and Avro, which provide remote call support. Or, a standard communication protocol may be employed, such as but not limited to HTTP or MQTT, and may be combined with a standard data format, such as but not limited to JSON or XML. According to one embodiment, one of the modules may share a secure API with another. Communication between modules may comply with any customized protocol or customized query language, or may comply with any conventional query language or protocol.
Fig. 1 is a simplified flowchart illustration of a method for presenting, e.g., to a human user, simulations of with-contrast images of a patient’s body part generated from
without-contrast images of the patient’s body part. In this method and all others herein, all or any subset of the illustrated operations may be used, in any suitable order e.g., as shown.
Fig. 2 is a simplified flowchart illustration of a method for preprocessing images in the dataset of Fig. 1 (or Fig. 3).
Fig. 3 is a simplified flowchart illustration of a method for presenting, e.g., to a human user, simulations of dose D-contrast images of a patient’s body part generated from images of the body part captured after administration of a dose d < D to the patient.
Methods and systems included in the scope of the present invention may include any subset or all of the functional blocks shown in the specifically illustrated implementations by way of example, in any suitable order, e.g., as shown. Flows e.g. those of Figs. 1, 2 and 3 may include all or any subset of the illustrated operations, suitably ordered e.g., as shown. Tables herein may include all or any subset of the fields and/or records and/or cells and/or rows and/or columns described.
Computational, functional, or logical components described and illustrated herein can be implemented in various forms, for example, as hardware circuits, such as but not limited to custom VLSI circuits or gate arrays or programmable hardware devices such as but not limited to FPGAs, or as software program code stored on at least one tangible or intangible computer readable medium and executable by at least one processor, or any suitable combination thereof. A specific functional component may be formed by one particular sequence of software code, or by a plurality of such, which collectively behave or act as described herein with reference to the functional component in question. For example, the component may be distributed over several code sequences such as but not limited to objects, procedures, functions, routines and programs, and may originate from several computer files which typically operate synergistically.
Each functionality or method herein may be implemented in software (e.g., for execution on suitable processing hardware such as a microprocessor or digital signal processor), firmware, hardware (using any conventional hardware technology such as Integrated Circuit technology), or any combination thereof.
Functionality or operations stipulated as being software-implemented may alternatively be wholly or fully implemented by an equivalent hardware or firmware module, and vice-versa. Firmware implementing functionality described herein, if
provided, may be held in any suitable memory device and a suitable processing unit (aka processor) may be configured for executing firmware code. Alternatively, certain embodiments described herein may be implemented partly or exclusively in hardware, in which case all or any subset of the variables, parameters, and computations described herein may be in hardware.
Any module or functionality described herein may comprise a suitably configured hardware component or circuitry. Alternatively or in addition, modules or functionality described herein may be performed by a general purpose computer, or more generally by a suitable microprocessor, configured in accordance with methods shown and described herein, or any suitable subset, in any suitable order, of the operations included in such methods, or in accordance with methods known in the art.
Any logical functionality described herein may be implemented as a real time application, if and as appropriate, and which may employ any suitable architectural option such as but not limited to FPGA, ASIC, or DSP, or any suitable combination thereof.
Any hardware component mentioned herein may in fact include either one or more hardware devices e.g., chips, which may be co-located or remote from one another.
Any method described herein is intended to include, within the scope of the embodiments of the present invention, also any software or computer program performing all or any subset of the method’s operations, including a mobile application, platform, or operating system, e.g., as stored in a medium, as well as combining the computer program with a hardware device to perform all or any subset of the operations of the method.
Data can be stored on one or more tangible or intangible computer readable media stored at one or more different locations, different network nodes or different storage devices at a single node or location.
It is appreciated that any computer data storage technology, including any type of storage or memory and any type of computer components and recording media that retain digital data used for computing for an interval of time, and any type of information retention technology, may be used to store the various data provided and employed herein. Suitable computer data storage or information retention apparatus may include apparatus which is primary, secondary, tertiary or off-line; which is of any type or level or amount or category of volatility, differentiation, mutability, accessibility, addressability, capacity,
performance, and energy use; and which is based on any suitable technologies such as semiconductor, magnetic, optical, paper, and others.
DETAILED DESCRIPTION OF CERTAIN EMBODIMENTS
Contrast-enhanced mammography (CEM) is considered useful for all or any subset of the following reasons:
1. Accuracy: CEM has been found to be as accurate as MRI (based on dozens of comparative studies of CEM vs. MRI);
2. Cost: CEM is much less costly than MRI;
3. High availability: the same mammography machine can be used, and CEM can be done at the same time, e.g., at a single sitting;
4. Time to read: CEM is faster to read than MRI; and
5. Personnel: CEM is read by a certified reader of mammography who has trained for mammography.
The system of the present invention may include a hardware processor which is configured to generate a synthetic image that simulates mammography with use of a contrast agent, and/or a platform that makes the synthetic image accessible to personnel, e.g., radiologists, etc., e.g., to allow a cancerous/benign score to be assigned to any lesions observed in the synthetic image; the radiologist may then make a decision as to whether a biopsy or additional tests are necessary, and/or this may be determined as a function of (at least) the cancerous/benign score. According to certain embodiments, the system accepts a marking indicating a location of a putative lesion on the synthetic image and displays a score for that lesion, where the score represents a recommendation whether or not to send the lesion for biopsy, based on a trained model which may accept the synthetic images as an input.
Certain embodiments seek to provide a system, computer program product or method for decision support e.g. for diagnosing cancer such as but not limited to breast cancer, the method comprising: generating simulations of contrast-enhanced mammography images from non-enhanced mammography images; and/or displaying the simulations of contrast-enhanced mammography images to at least one human expert,
thereby to facilitate generation of breast cancer (say) diagnostic outputs by the human expert.
The simulations of contrast-enhanced images are typically generated computationally e.g., as described herein, rather than actually administering a contrast agent to a patient and subsequently capturing an image of a body part of interest. Such simulated images are also termed herein “artificial” or “synthetic”.
The system may use a GAN to generate the simulations. Training the GAN may include training a generator to generate synthetic contrast-enhanced mammography images of respective breasts (or any other body portion) from real, no-contrast images of the same breasts; and/or training a discriminator to distinguish between real contrast-enhanced mammography images of breasts and synthetic images generated by the generator for the same breasts; and/or using outputs of the discriminator to refine the generator, e.g., as in conventional GAN training.
Fig. 1 is a simplified flowchart illustration of a method according to certain embodiments; all or any subset of the illustrated operations may be performed, in any suitable order e.g., as shown. Specifically, the method typically includes offline operations and runtime operations. The offline or setup operations may include:
Operation 10. Provide a dataset of pairs of mammography images, where each pair typically includes a first (aka without-iodine), e.g., Low Energy, mammography image of a breast captured without or before iodine injection, and a second (aka with-iodine), e.g., contrast-enhanced mammography (CEM), image of the same breast after, i.e., with, iodine injection. Typically, each image is in DICOM Image Format.
Operation 20. Preprocess each image in the dataset, e.g., by performing all or any subset of the operations in Fig. 2, in any suitable order e.g., as shown.
Operation 30. Using pairs from the dataset provided in operation 10 as a training set, train a first, “no iodine, expert-decision supporting” generator network (e.g., of type UNET, UNET++, Attention, Transformer, Diffusion) to convert a without-iodine Mammography image of a breast (aka input image) into a with-iodine Mammography image (aka output image, synthetic image) of the same breast.
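By way of non-limiting illustration, a generator of the UNET type named in operation 30 might be skeletonized as below; the single input channel, channel counts, and depth are placeholder assumptions (a production network would typically be considerably deeper):

```python
import torch
import torch.nn as nn

def block(c_in, c_out):
    # Two 3x3 convolutions, as in a basic U-Net stage.
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(c_out, c_out, 3, padding=1), nn.ReLU(inplace=True))

class TinyUNet(nn.Module):
    """Minimal U-Net: encoder, decoder, and a skip connection per level.
    Assumes a single-channel input with H and W divisible by 2."""
    def __init__(self):
        super().__init__()
        self.enc1, self.enc2 = block(1, 32), block(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.up = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = block(64, 32)          # 64 = 32 (skip) + 32 (upsampled)
        self.out = nn.Conv2d(32, 1, 1)     # one-channel synthetic mammogram

    def forward(self, x):                  # x: without-iodine image, N x 1 x H x W
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))
        return self.out(d1)                # synthetic with-iodine image
```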
The runtime operations in Fig. 1, which are also termed herein ACEM (artificial contrast-enhanced mammography) operations when applied by way of example to the field of imaging breasts, may include:
Operation 40. Use the generator network trained in operation 30, to convert a without-iodine image of a breast (say) captured from a patient into a synthetic with-iodine image of the same breast.
Operation 50. Use a mammography (say) image display system to display the synthetic with-iodine image e.g., to a human expert, e.g., to facilitate the human expert’s decision- making regarding the patient.
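A minimal runtime sketch of operations 40-50, assuming a trained generator and hypothetical helpers `load_dicom` and `show_on_monitor` that stand in for whatever DICOM I/O and display stack is actually deployed:

```python
import torch

def acem_runtime(generator, dicom_path, load_dicom, show_on_monitor):
    """Operation 40: convert a without-iodine image into a synthetic
    with-iodine image; operation 50: display the result to the expert."""
    generator.eval()
    img = load_dicom(dicom_path)                    # H x W float array (grayscale)
    x = torch.from_numpy(img).float()[None, None]   # reshape to N x C x H x W
    with torch.no_grad():                           # inference only; no training
        synthetic = generator(x)[0, 0].numpy()
    show_on_monitor(synthetic)                      # e.g., a mammography monitor
    return synthetic
```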
Optionally, decision-making regarding patients may be enhanced by machine-learning the human expert’s decisions. This may, for example, include training a second, “no iodine, no expert” generator network to convert a without-iodine image into a synthetic human expert’s decision. To do this, human expert decisions for each without-iodine image may be recorded, using a suitable user interface into which the human expert enters his decisions, and may be stored in memory in association with the without-iodine image. A training set may be generated accordingly, and a generator network may be trained on this training set. Once the second, “no iodine, no expert” generator network has been trained (or alternatively, a network may be trained using reduced-dosage iodine images), it is no longer necessary, when evaluating a patient, to use the first, “no iodine, expert-decision supporting” generator network to generate a synthetic, with-iodine image of the patient’s breast, display same, e.g., to a human expert, and then elicit a decision from the human expert about the patient. Instead, the second, “no iodine, no expert” generator network may be used to automatically generate the decision about the patient, without generating a synthetic, with-iodine image of the patient’s breast, and without resorting to a human expert.
All or any subset of the following may characterize operation 30 in Fig. 1 (see the training-step sketch following item vi below):
i. the error between each output image generated by the generator network from a first image I in the dataset provided in operation 10, on the one hand, and the second image paired to the first image I, on the other hand, is minimized. Any suitable loss function may be used for error computation, such as L1, L2, KLD, or SSIM. It is appreciated that any suitable loss function may be used to quantify the loss between the predicted (or output) image and the target (or second, paired) image; L1, L2, KLD, and SSIM are merely non-limiting examples.
ii. a tissue mask, e.g., as created for a first input image I by the method of Fig. 2, may be used to filter out portions of the input image I that do not carry relevant information, using any suitable criterion to determine relevance.
The mask is typically created in advance, e.g., upon receiving the image at the entrance of the generator, to save time in training, relative to creating the same mask in each iteration.
It is appreciated that the mask may be applied to the image at any suitable point in the course of the method. For example, during training and/or during runtime, the input to the generator network in each iteration may either be the raw image as captured, or the masked image.
Application of the tissue region mask, aka tissue mask, typically results in the network disregarding or ignoring irrelevant portions of the image when creating each artificial CEM.
iii. the lesion location mask, e.g., as created for a first input image I by the method of Fig. 2, may be used, e.g., to reduce error and/or increase precision in the area of the lesion.
Use of the tissue region mask may concentrate creation of the synthetic image on the relevant part of the image, e.g., by causing the generator network not to invest computational resources in image portions where it is known in advance that there is no tissue. The lesion location mask may be used to direct maximum attention to the image portion which putatively includes the lesion itself, which is the main area of interest.
iv. an externally trained network (e.g., VGG19) may be used for perceptual loss enforcement, e.g., by forcing the system, while training, to produce more similar images. VGG19’s feature extraction technique may be used while training the generator to punish or penalize the generator when it produces an image that is not similar to the CESM image the generator is aiming to create.
v. an alignment network (e.g., Optical Flow) may be used to refine or reduce misalignment between the first and second images in any given pair in the dataset due to
slight movement of the subject after the first image in the pair was captured and before and/or while the second image is being captured. Typically, when entering the generator network, a first mammogram image (e.g., captured with or without iodine) is entered. Typically, upon exiting the network, the resulting synthetic mammography image (e.g., as generated by the system from a without-iodine image) is desired to be as identical as possible to the image actually physically captured under iodine injection. To force the output image of the generator network toward this identity, the synthetic image may, e.g., in each training iteration, be compared, pixel by pixel, to an image under real iodine injection. In order to get a more accurate L1 score and/or adequately assess the precision of the ACEM, the alignment network may detect a shift each time the original image, captured with or without iodine, shifts a little, and, accordingly, fix the image by translating it until the shifted image matches the network's output image. Typically, this has no effect on the image that comes out of the network, but is useful for comparing images.
vi. a trained discriminator network may be used to enhance the generator network, e.g., by applying pressure thereto, e.g., using any suitable conventional GAN (generative adversarial network) technique. Such a discriminator may be trained to receive triplets of images, each triplet including a first image F from the dataset, and two additional images: A1, the second image paired to the first image in the dataset, and A2, an output image generated by feeding the first image F to the generator network; and to determine which of the two additional images is A1 and which is A2, and/or to output a probability that (without loss of generality) the first of the two additional images is A1, and/or a probability that the first of the two additional images is A2. According to certain embodiments, in some triplets received by the discriminator, the second image in the triplet is A1 and the third image is A2, whereas in other triplets received by the discriminator, the second image in the triplet is A2 and the third image is A1; this may be determined randomly.
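The training-step sketch referenced above combines items i-iv and vi under simplifying assumptions: `vgg_features` stands for a frozen, externally trained feature extractor (e.g., a truncated VGG19), the masks are tensors precomputed per Fig. 2 and broadcastable to the image shape, alignment (item v) is assumed to have been applied already, and the weighting coefficients are placeholders:

```python
import torch
import torch.nn.functional as F

def generator_loss(gen, disc, vgg_features, low_e, cem, tissue_mask, lesion_mask,
                   w_l1=1.0, w_lesion=5.0, w_perc=0.1, w_adv=0.01):
    """One composite loss evaluation sketching items i-iv and vi above."""
    fake = gen(low_e)                                  # synthetic with-iodine image

    # i + ii: L1 error, with non-tissue pixels filtered out by the tissue mask.
    l1_map = (fake - cem).abs() * tissue_mask
    loss = w_l1 * l1_map.mean()

    # iii: extra precision pressure on the putative lesion region.
    loss = loss + w_lesion * (l1_map * lesion_mask).mean()

    # iv: perceptual loss via a frozen, externally trained feature extractor.
    loss = loss + w_perc * F.l1_loss(vgg_features(fake), vgg_features(cem))

    # vi: adversarial term -- the discriminator "applies pressure" to the generator.
    logits = disc(fake)
    loss = loss + w_adv * F.binary_cross_entropy_with_logits(
        logits, torch.ones_like(logits))
    return loss
```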
It is appreciated that training may employ any suitable Generator Network typically comprising an encoder, decoder, and intermediate skip connections. The network input may comprise Low-Energy mammography images and may generate an output image that is forced to resemble a contrast-enhanced mammography (CESM) image. Images may be streamed to the network iteratively from the dataset. The constraint function’s type may
be, say, L1, L2, KLD, SSIM, etc. External constraints which may be used include perceptual loss and/or use of a variational autoencoder. An alignment network may be used to refine mismatch between the network’s target image, which corresponds to the input Low-Energy image, and a target which is slightly different (e.g., due to slight movement of the subject in the source image - CESM). After each iteration, an optimizer (of a suitable type, e.g., ADAM, SGD, RMSPROP) may be employed to adjust the network's weights to achieve better results in the next iteration. Training may employ any suitable Discriminator Network, e.g., an encoder network that learns to distinguish between real contrast-enhanced mammography images and synthetic images generated by the generator network. Typically, the better the discriminator network distinguishes between the images, the more the generator network strives to generate even more realistic images. PatchGAN (a known type of discriminator for generative adversarial networks) may be used to highlight different regions in the image.
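A PatchGAN-style discriminator, as mentioned, outputs a grid of logits, each judging one receptive-field patch of the image rather than the image as a whole; a minimal sketch with illustrative layer sizes:

```python
import torch.nn as nn

class PatchDiscriminator(nn.Module):
    """Maps an image to an H' x W' grid of logits; each logit judges whether
    the receptive-field patch behind it looks like a real CESM region."""
    def __init__(self, in_ch=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(128, 256, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(256, 1, 4, padding=1))   # per-patch logit map; no sigmoid

    def forward(self, x):
        return self.net(x)
```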
Also, any suitable post-processing may be employed. For example, CESM patches may be cut and overlaid onto respective generated CESM breast images. Integration may, for example, occur at the original patch cutting location.
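Such patch overlay is, at its simplest, an array paste-back at the recorded cutting coordinates; a sketch, assuming the `(top, left)` corner was stored when the patch was cut:

```python
import numpy as np

def overlay_patch(full_image: np.ndarray, patch: np.ndarray,
                  top: int, left: int) -> np.ndarray:
    """Paste a generated CESM patch back at its original cutting location.
    Assumes 2D grayscale arrays and a patch that fits within the image."""
    out = full_image.copy()
    h, w = patch.shape
    out[top:top + h, left:left + w] = patch
    return out
```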
Fig. 2 is a simplified flowchart illustration of a method configured to preprocess each image in the dataset, including both with-iodine images and without-iodine images, according to certain embodiments; all or any subset of the illustrated operations may be performed, in any suitable order, e.g., as shown. Specifically, the method typically includes (see the preprocessing sketch following these operations):
a. Normalization, zero centering, and standardization of images, e.g., as described here: https://www.imaios.com/en/resources/blog/ct-images-normalization-zero-centering-and-standardization
b. Create a tissue region mask (e.g., binary image), aka “tissue mask”, to differentiate the breast (e.g., “1” in the mask) from irrelevant portions of the image (e.g., “0” in the mask), such as the room in which imaging occurred or parts of the patient’s body other than the breast, such as the patient’s arm.
Tissue region masks concentrate the synthetic image generation process on pertinent areas of interest (potentially, relevant anatomical structures) within the image, rather than on, say, areas devoid of tissue, by guiding the generator network toward allocating lesser computational resources to regions devoid of tissue.
c. Create a mask for the lesion location (e.g., binary image), aka “lesion mask”, e.g., by having human experts, such as radiologists, examine the mammogram images in the dataset and mark lesions, if any, in the image, e.g., by outlining the lesion with a marker. The mask may then be a binary image, the size of the image in the dataset, in which the putative lesion as identified by the radiologist is marked with 1, and all other portions of the image are marked with 0.
Use of lesion location masks focuses computational resources on image regions likely to encompass the lesion, rather than regions less likely to encompass the lesion. To achieve such targeted modulation, Spatially-Adaptive Normalization (SPADE) may be employed, which, e.g., via integration of an affine layer learned from semantic segmentation maps, may facilitate dynamic modulation of activations in normalization layers. The spatially-adaptive, learned transformation leverages input layouts to tailor the generator's output to anatomical feature/s, ensuring optimal synthesis fidelity. At each stage of the network where a normalization layer follows the convolution operation, the image values are adjusted to normal values, so that the average of the entire image will be 0 and the standard deviation will be 1. When using the SPADE process, the normalization for each lesion area will depend on the statistics that are unique to it, and not necessarily on the statistics of the image as a whole.
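A minimal SPADE-style normalization layer in the spirit of the cited formulation: a parameter-free normalization followed by a per-pixel scale and shift predicted from the segmentation (e.g., lesion) mask; all sizes are illustrative:

```python
import torch.nn as nn
import torch.nn.functional as F

class SpadeNorm(nn.Module):
    """Normalize activations, then modulate them with a scale (gamma) and
    shift (beta) predicted per-pixel from a segmentation mask."""
    def __init__(self, channels, mask_ch=1, hidden=64):
        super().__init__()
        self.bn = nn.BatchNorm2d(channels, affine=False)  # parameter-free norm
        self.shared = nn.Sequential(
            nn.Conv2d(mask_ch, hidden, 3, padding=1), nn.ReLU(inplace=True))
        self.gamma = nn.Conv2d(hidden, channels, 3, padding=1)
        self.beta = nn.Conv2d(hidden, channels, 3, padding=1)

    def forward(self, x, mask):
        # Resize the mask to the current feature-map resolution.
        mask = F.interpolate(mask, size=x.shape[2:], mode="nearest")
        h = self.shared(mask)
        return self.bn(x) * (1 + self.gamma(h)) + self.beta(h)
```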
Typically, two separate masks are provided (a tissue mask and a lesion mask), as opposed to a single mask differentiating the lesion area from all other portions of the image, since it is advantageous for the tissue mask to indicate to the computer or network that certain portions of the image are entirely irrelevant and may be ignored when creating the artificial CESM. The lesion mask, on the other hand, is used to indicate to the computer or network that output for the area marked as lesion is to be more detailed than output created for the non-lesion area; however, the non-lesion areas of the breast tissue (unlike the non-tissue areas) should not be entirely ignored.
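The preprocessing sketch referenced above illustrates steps (a) and (b); Otsu thresholding is merely one plausible heuristic for the tissue mask, and the lesion mask of step (c) is assumed to come from radiologist annotations rather than from code:

```python
import numpy as np
from skimage.filters import threshold_otsu

def preprocess(img: np.ndarray):
    """Step (a): zero-center and standardize; step (b): derive a tissue mask."""
    std = float(img.std()) or 1.0               # guard against constant images
    normalized = (img - img.mean()) / std       # mean 0, standard deviation 1

    thresh = threshold_otsu(img)                # separates breast from background
    tissue_mask = (img > thresh).astype(np.uint8)  # 1 = breast, 0 = irrelevant
    return normalized, tissue_mask
```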
Regarding image size, any suitable resizing algorithms may be used to efficiently produce an output image, such as but not limited to progressive growing or generation by parts, for accurate ACEM. Size adjustment may include all or any subset of: maximum size usage, and/or downsampling for resizing the image to a fixed size for all images (512x256, 2048x1024, etc.), padding with zeros to a fixed size, preserving high resolution by cropping the original image into patches, and using patches that contain only lesions.
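Of the size-adjustment options listed, zero padding to a fixed size is the simplest; a sketch using one of the fixed sizes mentioned above:

```python
import numpy as np

def pad_to(img: np.ndarray, height: int = 2048, width: int = 1024) -> np.ndarray:
    """Zero-pad (or crop) a mammogram so all images share one fixed size."""
    out = np.zeros((height, width), dtype=img.dtype)
    h = min(img.shape[0], height)
    w = min(img.shape[1], width)
    out[:h, :w] = img[:h, :w]     # top-left aligned; crops if the image is larger
    return out
```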
Any suitable Mammography Image Display System may be used to view mammography images, e.g., in DICOM Image Format, such as, for example, a RadiForce mammography monitor, e.g., as described here: https://www.eizo
It is appreciated that, typically, the discriminator is not used in runtime, and instead is used (e.g., as a constraint) to train the generator which, once trained, accepts images from patients and performs its function as per the method described herein in runtime. Alternatively, or in addition, the discriminator can be used in real time as well, e.g., to ensure that the image that enters the generator is actually a mammogram; if an individual attempts to use the network on any other type of image, the discriminator may detect this.
Re GAN iterations, any suitable stopping criteria may be employed, e.g., to determine when to terminate training. For example, human observers may be tasked with determining whether artificially generated with-iodine images do or do not seem real; if the former is the case, GAN training may be terminated. Alternatively, or in addition, loss values may be examined (e.g., to determine whether iterations are producing only a small difference in loss values), as well as other training success metrics such as Inception Scores, Frechet Inception Distances (FID scores), or perceptual similarity measures (LPIPS), or any other GAN evaluation measure known in the art. For example, the above quantitative metrics may be used to determine whether to impose early stopping, e.g., whether to stop training when FID scores worsen or perceptual similarity is not improving over iterations.
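As one concrete quantitative stopping metric, the Frechet Inception Distance fits a Gaussian to the real and to the generated feature distributions and computes FID = ||mu_r - mu_g||^2 + Tr(Sigma_r + Sigma_g - 2(Sigma_r Sigma_g)^{1/2}); a sketch, assuming feature vectors (e.g., Inception embeddings) have already been extracted:

```python
import numpy as np
from scipy.linalg import sqrtm

def fid(real_feats: np.ndarray, gen_feats: np.ndarray) -> float:
    """Frechet Inception Distance between two feature sets (rows = samples)."""
    mu_r, mu_g = real_feats.mean(0), gen_feats.mean(0)
    cov_r = np.cov(real_feats, rowvar=False)
    cov_g = np.cov(gen_feats, rowvar=False)
    covmean = sqrtm(cov_r @ cov_g)              # matrix square root
    if np.iscomplexobj(covmean):                # discard tiny imaginary parts
        covmean = covmean.real
    return float(((mu_r - mu_g) ** 2).sum()
                 + np.trace(cov_r + cov_g - 2 * covmean))
```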
Expert decisions which may be supported using any embodiment herein may include all or any subset or any combination of the following:
Diagnostic decisions, e.g.:
- cancerous: yes/no
- type of cancer, e.g., invasive ductal carcinoma vs. invasive lobular carcinoma
- stage of cancer, e.g., breast cancer stages 0, I, II, III, and IV.
Monitoring decisions, e.g.:
- is monitoring required at all: yes/no
- biopsy (e.g., FNAC and/or core biopsy): yes/no
- frequency of monitoring
- type of monitoring, e.g., physical examination, mammography, and/or ultrasound
- parameterization of monitoring, e.g., re-doing mammography from different angle/s
- imagery after physically administering a contrast agent, e.g., if there is any doubt regarding the validity of the synthetic imagery generated according to embodiments of the invention.
Treatment decisions, e.g.:
- treat: yes/no
- which treatment option, e.g., none, surgery, chemotherapy, hormonal therapy, biological therapy, targeted therapy, radiation therapy, or a combination thereof
- characterization of a selected treatment option, e.g., if surgery is the selected treatment option: complete mastectomy vs. partial mastectomy, segmental mastectomy, lumpectomy, or other breast-sparing surgery; and/or choice of chemotherapeutic agent/s and/or number of treatments and/or length of chemotherapy cycle, e.g., once a week vs. once every three weeks.
Any type of monitoring and/or treatment may be performed on the patient, depending on the expert decision (whether obtained from a human or whether simulated using the second generative network described herein).
It is appreciated that the system herein may be used in conjunction with AI-based commercial platforms which provide decision support to radiologists tasked with interpreting mammography images, such as DENISE, developed in conjunction with Sheba Hospital, Tel HaShomer, Israel.
For example, whenever obtaining a CEM (aka with-contrast, e.g., with-iodine) image as input to DENISE is too costly or otherwise not feasible, the system herein may be employed to generate an artificial CESM or simulated with-contrast image as input to DENISE, which can then provide decision support to radiologists or other medical professionals.
More generally, any embodiment described in US 10499866 to Miriam Sklair-Levy et al. may be used in conjunction with any embodiment herein. For example, the Sklair-Levy patent describes a system and method for diagnosing breast cancer by acquiring a contrast enhanced region of interest (CE-ROI) which may be comprised in an X-ray image of a patient's breast, the X-ray image comprising X-ray pixels that indicate intensity of X-rays that passed through the breast to generate the image; generating a texture feature vector (TF) having components based on the indications of intensity provided by a plurality of X-ray pixels in the CE-ROI; and using a classifier to classify the texture feature vector TF to determine whether the CE-ROI is malignant. It is appreciated that the contrast enhanced region of interest may be generated using any embodiment herein, rather than by actually administering a contrast agent to the patient and subsequently imaging the region of interest.
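By way of illustration only, a texture-feature-plus-classifier pipeline in the spirit of the above might be sketched as follows; the GLCM properties and the random-forest classifier are assumptions of this sketch, not necessarily the patented method:

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.ensemble import RandomForestClassifier

def texture_feature_vector(ce_roi: np.ndarray) -> np.ndarray:
    """Texture feature vector (TF) from pixel intensities of a
    contrast-enhanced region of interest (CE-ROI), here via gray-level
    co-occurrence matrix (GLCM) statistics. `ce_roi` is assumed uint8."""
    glcm = graycomatrix(ce_roi, distances=[1, 2], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# Train on labeled ROIs, then classify new ones; X_train stacks TF vectors,
# y_train holds assumed labels (1 = malignant, 0 = benign):
# clf = RandomForestClassifier().fit(X_train, y_train)
# is_malignant = clf.predict([texture_feature_vector(new_roi)])[0]
```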
It is appreciated that any suitable type of Generative Adversarial Network (GAN) may be employed, e.g., Vanilla GAN, Conditional GAN (CGAN), Deep Convolutional GAN (DCGAN), Wasserstein generative adversarial networks (aka Wasserstein GANs, WGANs), or WGAN-LPs (e.g., as described here: https://wasachenev.github.io/publication/2018-12-01-lpwgan).
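For instance, the Lipschitz constraint that distinguishes WGAN-GP from WGAN-LP training may be imposed via a gradient penalty on the critic; the following PyTorch sketch assumes a critic network is already defined:

```python
import torch

def gradient_penalty(critic, real, fake, one_sided=False):
    """Gradient penalty on interpolates between real and fake batches.
    one_sided=False gives the WGAN-GP penalty; one_sided=True gives the
    WGAN-LP ("Lipschitz penalty") variant, penalizing only norms > 1."""
    alpha = torch.rand(real.size(0), 1, 1, 1, device=real.device)
    interp = (alpha * real + (1 - alpha) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(scores, interp,
                                grad_outputs=torch.ones_like(scores),
                                create_graph=True)[0]
    norms = grads.flatten(1).norm(2, dim=1)
    excess = norms - 1.0
    if one_sided:
        excess = excess.clamp(min=0.0)
    return (excess ** 2).mean()
```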
Also, instead of or in addition to using Generative Adversarial Networks (GANs) e.g., as described herein, any suitable alternative may be used such as but not limited to Convolutional Neural Networks (CNNs), transformers, and diffusion models.
It is appreciated that the system herein is applicable both to 2D mammography and to 3D mammography. During 2D (aka conventional digital) mammography, each breast is typically imaged twice, once from the side and once from above (without, and perhaps also with, iodine). During 3D mammography (aka digital breast tomosynthesis), more than 2 images (without, and perhaps also with, iodine) are taken of the breast from more than 2 respective angles; the images are then typically combined to yield a 3D image or model of the breast. Either way, pairs of images captured of a given breast from a given angle, with and without iodine, may be used to train the system herein, and once the system is trained, the system may derive a simulated with-contrast image from a without-contrast image provided. Alternatively, if n ≥ 2 without-contrast images of a breast are captured, from n respective angles, for various breasts, and in addition, n ≥ 2 with-contrast images of the same breast are captured, from the same n respective angles, for the same set of breasts/patients, sets of n image pairs may be used to train a GAN to generate a 3D with-contrast image or model of a breast from n such image pairs. Alternatively, pairs of 3D breast models, with and without contrast, may be used to train a GAN to generate a 3D with-contrast image or model of a breast from a 3D without-contrast image or model of a breast.
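A minimal sketch of assembling such per-angle training pairs, assuming each study is stored as angle-keyed image maps (this data layout is an assumption of the sketch):

```python
from typing import Dict, List, Tuple
import numpy as np

def make_training_pairs(
    without: Dict[str, np.ndarray],        # angle id -> without-contrast image
    with_contrast: Dict[str, np.ndarray],  # angle id -> with-contrast image
) -> List[Tuple[np.ndarray, np.ndarray]]:
    """Pair images of the same breast captured from the same angle, with
    and without contrast; such pairs are the training units in the 2D case,
    and sets of n such pairs serve the 3D (tomosynthesis) case."""
    common_angles = sorted(set(without) & set(with_contrast))
    return [(without[a], with_contrast[a]) for a in common_angles]
```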
It is appreciated that the system herein is applicable not only to mammography but also to scans of body parts other than the breast, which may be scanned using any suitable technology, e.g., low energy scans, or CESM, or tomography (e.g., 2D or 3D tomography). Thus, for example, matching low-energy CT (LECT) images, with and without contrast, may be used to train a system to receive a low-energy CT (LECT) image without contrast, and to generate therefrom a with-contrast low-energy CT (LECT) image.
For example, references to mammography images in the description of Figs. 1 and 2 are merely by way of example; alternatively, ultrasound images, or tomography, e.g., CT images (including low energy CT), or MRI images may be employed.
Fig. 2 is but one possible method for preprocessing images with and/or without contrast agent in the dataset; any other suitable method may alternatively be used.
References to iodine here are also merely exemplary; alternatively, any suitable contrast material other than iodine may be employed. Thus references to “with iodine” images herein may more generally refer to any contrast-enhanced image.
Also, the system according to certain embodiments may be configured to reduce use of a contrast agent, e.g., iodine, when capturing medical imagery. For example, e.g., as shown in Fig. 3, a system may receive an image generated using a first (“low”) dose of iodine (or any other contrast agent), and may generate a synthetic image which simulates the image that would result given a second (“high”) dose of iodine, which is higher than the first dose. The high dose may be that conventionally used, e.g., 1.5 ml per kilogram of body weight, and the low dose may be some percentage of that, e.g., less than 5%, 10%, 25%, 33%, 50%, or 75% thereof, or more. The system may be trained using sets of images each obtained from a given patient, including one image of a body part, e.g., the patient's breast, imaged after a low dose of iodine, and another image of the same patient's same body part after a high dose of iodine.
According to certain embodiments, the system may simulate the high dose image using a without-iodine image as input for some patients, and may simulate the high dose image using a low-dose iodine image as input for other patients, e.g., patients for whom there is an a priori reason (e.g., a certain range of breast denseness) to be concerned that a synthetic high-dose image generated based only on a without-iodine image will be insufficiently accurate.
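This per-patient routing might be expressed as follows; the normalized density score, its threshold, and the model handles are illustrative assumptions of this sketch:

```python
def simulate_high_dose(patient, no_contrast_model, low_dose_model,
                       density_threshold: float = 0.75):
    """Choose the generator input per patient: a without-iodine image by
    default, or a low-dose-iodine image when breast density (here assumed
    to be a normalized score on the patient record) raises a priori
    concerns about the accuracy of a without-iodine-based simulation."""
    if patient.breast_density >= density_threshold:
        return low_dose_model(patient.low_dose_image)
    return no_contrast_model(patient.without_contrast_image)
```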
Many variations to what is specifically described above may be provided. For example, in addition to, or alternatively to, all or any subset of the operations of Fig. 2 which may be used to preprocess images in a dataset including with-iodine images and without-iodine images (or low- and high-dose iodine images), preprocessing may include all or any subset of the following Preliminary Image Processing operations (illustrated in the sketch after this list), e.g., if images have disparate sources (different machines, different technicians) and/or have undergone preliminary processing (source images may be preserved after processing) and/or if images have disparate sizes:
a. Histogram equalization after normalization based on internal DICOM tags.
b. Histogram equalization after normalization (typically excluding background) and a sigmoid function.
c. Cutting or partitioning images into patches: Low Energy and/or CESM images may be cut into patches, which may be square. Patches may all be the same standardized size, e.g., say, 800x800 pixels. Each patch may be centered around a single lesion within the image.
d. Creating masks (“lesion masks”) for at least one lesion location; typically each mask is the same size as the patch associated with the lesion location.
e. Expanding the lesion mask proportionally to the lesion size, e.g., to ensure the mask covers not only the lesion but also the lesion's margins.
f. Creating mask/s for tissue region/s and/or for lesion location/s.
g. Any suitable format or data type may be used for saving, e.g., saving as uint8, uint16, or float.
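The sketch referenced above: illustrative Python for items (a)-(e), assuming single-channel uint8 images, an 800x800 patch size, and a margin factor chosen purely for the example:

```python
import cv2
import numpy as np

def equalize(img_uint8: np.ndarray) -> np.ndarray:
    """Item (a)/(b)-style histogram equalization (normalization and
    background exclusion omitted for brevity)."""
    return cv2.equalizeHist(img_uint8)

def lesion_patch_and_mask(image: np.ndarray,
                          lesion_bbox: tuple,   # (row, col, height, width)
                          patch_size: int = 800,
                          margin_factor: float = 1.2):
    """Cut a standardized square patch centered on a lesion (items c-d)
    and build a same-sized binary lesion mask, proportionally expanded by
    `margin_factor` so it also covers the lesion's margins (item e)."""
    r, c, h, w = lesion_bbox
    cy, cx = r + h // 2, c + w // 2
    half = patch_size // 2
    # Clip the crop window to the image borders; zero-pad the remainder
    top, left = max(cy - half, 0), max(cx - half, 0)
    bottom = min(cy + half, image.shape[0])
    right = min(cx + half, image.shape[1])
    patch = np.zeros((patch_size, patch_size), dtype=image.dtype)
    oy, ox = top - (cy - half), left - (cx - half)
    patch[oy:oy + (bottom - top), ox:ox + (right - left)] = \
        image[top:bottom, left:right]
    # Binary lesion mask, enlarged proportionally to the lesion size
    mask = np.zeros_like(patch, dtype=np.uint8)
    mh, mw = int(h * margin_factor), int(w * margin_factor)
    mr, mc = half - mh // 2, half - mw // 2
    mask[max(mr, 0):mr + mh, max(mc, 0):mc + mw] = 1
    return patch, mask
```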
In the context of, say, Preliminary Image Processing, the input image (e.g., of Fig. 1 or Fig. 3) may comprise Low Energy patches multiplied by a mask, to cancel everything outside the lesion area. “Multiplication” may refer to element-wise (Hadamard) multiplication, where the operands comprise the multi-level pixel array of the input image and the binary pixel array of the mask respectively. The output image (e.g., of Fig. 1 or Fig. 3) may comprise CESM patches multiplied by the mask, canceling everything outside the lesion area. It is appreciated that the input image may comprise, say, a without-iodine mammography image of a breast and the output image may comprise, say, a with-iodine mammography image of the same breast. Or, the input image may comprise, say, a low-iodine-dose image of a breast and the output image may comprise, say, a high-iodine-dose image of the same breast.
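A minimal sketch of this element-wise masking, under the same assumptions as the preceding sketch:

```python
import numpy as np

def masked_pair(low_energy_patch: np.ndarray,
                cesm_patch: np.ndarray,
                lesion_mask: np.ndarray):
    """Element-wise multiplication by the binary lesion mask cancels
    everything outside the lesion area, yielding the network's input
    (Low Energy) and target output (CESM) patches."""
    return low_energy_patch * lesion_mask, cesm_patch * lesion_mask
```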
It is appreciated that the systems and methods herein yield many advantages relative to physical generation of with-contrast images. For example, it is time-consuming (both for the medical professionals and for the patient) as well as costly to administer a contrast agent to the patient, e.g., by mouth or injected intravenously. Contrast administration may require the patient to fast for a certain period of time. Also, after the procedure, patients may be required to drink 6-8 glasses of water within 24 hours to flush the contrast agent out of the body.
Also, CEMs expose patients to slightly more radiation than mammograms. Also, some patients have allergic reactions to intravenous contrast agents. Some of these reactions are severe, e.g., difficulty in breathing. Also, intravenous contrast agents sometimes affect patients’ kidneys. Also, and as a consequence, contrast is typically not used for patients with a history of serious allergic reactions to iodine, or for kidney patients. Contrast-enhanced mammography (CEM) is also considered unsafe for women who are either pregnant or breastfeeding.
Embodiments herein, by simulating contrast administration, provide images with greater clarity than no-contrast images, yet without the above drawbacks.
It is appreciated that terminology such as "mandatory", "required", "need" and "must" refer to implementation choices made within the context of a particular implementation or application described herewithin for clarity, and are not intended to be limiting, since, in an alternative implementation, the same elements might be defined as not mandatory and not required, or might even be eliminated altogether.
Components described herein as software may, alternatively, be implemented wholly or partly in hardware and/or firmware, if desired, using conventional techniques, and vice-versa. Each module or component or processor may be centralized in a single physical location or physical device, or distributed over several physical locations or physical devices.
Included in the scope of the present disclosure, inter alia, are electromagnetic signals in accordance with the description herein. These may carry computer-readable instructions for performing any or all of the operations of any of the methods shown and described herein, in any suitable order, including simultaneous performance of suitable groups of operations as appropriate. Included in the scope of the present disclosure, inter alia, are machine-readable instructions for performing any or all of the operations of any of the methods shown and described herein, in any suitable order; program storage devices readable by machine, tangibly embodying a program of instructions executable by the machine to perform any or all of the operations of any of the methods shown and described herein, in any suitable order, i.e., not necessarily as shown, including performing various operations in parallel or concurrently, rather than sequentially, as shown; a computer program product comprising a computer useable medium having computer readable program code, such as executable code, having embodied therein, and/or including computer readable program code for performing, any or all of the operations of any of the methods shown and described herein, in any suitable order; any technical effects brought about by any or all of the operations of any of the methods shown and described herein, when performed in any suitable order; any suitable apparatus or device or combination of such, programmed to perform, alone or in combination, any or all of the operations of any of the methods shown and described herein, in any suitable order; electronic devices each including at least one processor and/or cooperating input device and/or output device and operative to perform, e.g., in software, any operations shown and described herein; information storage devices or physical records, such as disks or hard drives, causing at least one computer or other device to be configured so as to carry out any or all of the operations of any of the methods shown and described herein, in any suitable order; at least one program pre-stored, e.g., in memory or on an information network such as the Internet, before or after being downloaded, which embodies any or all of the operations of any of the methods shown and described herein, in any suitable order, and the method of uploading or downloading such, and a system including server/s and/or client/s for using such; at least one processor configured to perform any combination of the described operations or to execute any combination of the described modules; and hardware which performs any or all of the operations of any of the methods shown and described herein, in any suitable order, either alone or in conjunction with software. Any computer-readable or machine-readable media described herein is intended to include non-transitory computer- or machine-readable media.
Any computations or other forms of analysis described herein may be performed by a suitable computerized method. Any operation or functionality described herein may be wholly or partially computer-implemented e.g., by one or more processors. The invention shown and described herein may include (a) using a computerized method to identify a solution to any of the problems or for any of the objectives described herein, the solution optionally including at least one of a decision, an action, a product, a service, or any other information described herein, that impacts, in a positive manner, a problem or objectives described herein; and (b) outputting the solution.
The system may, if desired, be implemented as a network- e.g., web-based system employing software, computers, routers, and telecommunications equipment, as appropriate.
Any suitable deployment may be employed to provide functionalities e.g., software functionalities shown and described herein. For example, a server may store certain applications, for download to clients, which are executed at the client side, the server side serving only as a storehouse. Any or all functionalities e.g., software functionalities shown and described herein, may be deployed in a cloud environment. Clients e.g., mobile communication devices such as smartphones, may be operatively associated with, but external to the cloud.
The scope of the present invention is not limited to structures and functions specifically described herein, and is also intended to include devices which have the capacity to yield a structure, or perform a function, described herein, such that even though users of the device may not use the capacity, they are, if they so desire, able to modify the device to obtain the structure or function.
Any “if-then” logic described herein is intended to include embodiments in which a processor is programmed to repeatedly determine whether condition x, which is sometimes true and sometimes false, is currently true or false, and to perform y each time x is determined to be true, thereby to yield a processor which performs y at least once, typically on an “if and only if” basis, e.g., triggered only by determinations that x is true, and never by determinations that x is false.
Any determination of a state or condition described herein, and/or other data generated herein, may be harnessed for any suitable technical effect. For example, the determination may be transmitted or fed to any suitable hardware, firmware, or software module, which is known or which is described herein to have capabilities to perform a technical operation responsive to the state or condition. The technical operation may, for example, comprise changing the state or condition, or may more generally cause any outcome which is technically advantageous, given the state or condition or data, and/or may prevent at least one outcome which is disadvantageous, given the state or condition or data. Alternatively, or in addition, an alert may be provided to an appropriate human operator, or to an appropriate external system.
Features of the present invention, including operations which are described in the context of separate embodiments, may also be provided in combination in a single embodiment. For example, a system embodiment is intended to include a corresponding process embodiment, and vice versa. Also, each system embodiment is intended to include a server-centered “view” or client-centered “view”, or “view” from any other node of the system, of the entire functionality of the system, computer-readable medium, or apparatus, including only those functionalities performed at that server or client or node. Features may also be combined with features known in the art, and particularly, although not limited to, those described in the Background section or in publications mentioned therein.
Conversely, features of the invention, including operations, which are described for brevity in the context of a single embodiment or in a certain order, may be provided separately or in any suitable sub-combination, including with features known in the art (particularly although not limited to those described in the Background section or in publications mentioned therein) or in a different order. “e.g.” is used herein in the sense of a specific example which is not intended to be limiting. Each method may comprise all or any subset of the operations illustrated or described, suitably ordered, e.g., as illustrated or described herein.
Devices, apparatus, or systems shown coupled in any of the drawings may in fact be integrated into a single platform in certain embodiments, or may be coupled via any appropriate wired or wireless coupling, such as but not limited to optical fiber, Ethernet, Wireless LAN, HomePNA, power line communication, cell phone, Smart Phone (e.g., iPhone), Tablet, Laptop, PDA, Blackberry GPRS, Satellite including GPS, or other mobile delivery. It is appreciated that in the description and drawings shown and described herein, functionalities described or illustrated as systems and sub-units thereof can also be provided as methods and operations therewithin, and functionalities described or illustrated as methods and operations therewithin can also be provided as systems and sub-units thereof. The scale used to illustrate various elements in the drawings is merely exemplary and/or appropriate for clarity of presentation, and is not intended to be limiting.
Any suitable communication may be employed between separate units herein e.g., wired data communication and/or in short-range radio communication with sensors such as cameras, e.g., via WiFi, Bluetooth, or Zigbee.
It is appreciated that implementation via a cellular app as described herein is but an example, and, instead, embodiments of the present invention may be implemented, say, as a smartphone SDK, as a hardware component, as an STK application, or as suitable combinations of any of the above.
Any processing functionality illustrated (or described herein) may be executed by any device having a processor, such as but not limited to a mobile telephone, set-top-box, TV, remote desktop computer, game console, tablet, mobile, e.g., laptop or other computer terminal, embedded remote unit, which may either be networked itself (may itself be a node in a conventional communication network e.g.) or may be conventionally tethered to a networked device (to a device which is a node in a conventional communication network, or is tethered directly or indirectly/ultimately to such a node).
Any operation or characteristic described herein may be performed by another actor outside the scope of the patent application and the description is intended to include apparatus whether hardware, firmware, or software which is configured to perform, enable, or facilitate that operation or to enable, facilitate, or provide that characteristic.
The terms processor or controller or module or logic as used herein are intended to include hardware, such as computer microprocessors or hardware processors, which typically have digital memory and processing capacity, such as those available from, say, Intel and Advanced Micro Devices (AMD). Any operation or functionality or computation or logic described herein may be implemented entirely or in any part on any suitable circuitry, including any such computer microprocessor/s, as well as in firmware or in hardware, or any combination thereof.
It is appreciated that elements illustrated in more than one drawing, and/or elements in the written description, may still be combined into a single embodiment, except if otherwise specifically clarified herein. Any of the systems shown and described herein may be used to implement, or may be combined with, any of the operations or methods shown and described herein.
It is appreciated that any features, properties, logic, modules, blocks, operations, or functionalities described herein, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment, except where the specification or general knowledge specifically indicates that certain teachings are
mutually contradictory and cannot be combined.
Conversely, any modules, blocks, operations, or functionalities described herein, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable sub-combination, including with features known in the art. Each element, e.g., operation described herein, may have all characteristics and attributes described or illustrated herein, or, according to other embodiments, may have any subset of the characteristics or attributes described herein.
References herein to “said (or the) element x” having certain (e.g., functional or relational) limitations/characteristics are not intended to imply that a single instance of element x is necessarily characterized by all the limitations/characteristics. Instead, “said (or the) element x” having certain (e.g., functional or relational) limitations/characteristics is intended to include both (a) an embodiment in which a single instance of element x is characterized by all of the limitations/characteristics, and (b) embodiments in which plural instances of element x are provided, and each of the limitations/characteristics is satisfied by at least one instance of element x, but no single instance of element x satisfies all limitations/characteristics. For example, each time L limitations/characteristics are ascribed to “said” or “the” element X in the specification or claims (e.g., to “said processor” or “the processor”), this is intended to include an embodiment in which L instances of element X are provided, which respectively satisfy the L limitations/characteristics, each of the L instances of element X satisfying an individual one of the L limitations/characteristics. The plural instances of element x need not be identical. For example, if element x is a hardware processor, there may be different instances of x, each programmed for different functions and/or having different hardware configurations (e.g., there may be 3 instances of x: two Intel processors of different models, and one AMD processor).
Claims
1. A method for providing decision support to human experts examining medical images, the method comprising: using a hardware processor to generate simulations of with-contrast images; and using an image display system to display the simulations of with-contrast images.
2. The method of claim 1 wherein the images comprise mammography images and the image display system comprises a mammography image display system or mammography monitor which may use DICOM Image Format.
3. The method of claim 2 wherein the mammography images comprise 2D mammography images.
4. The method of claim 2 wherein the mammography images comprise 3D mammography images.
5. The method of claim 1 wherein the with-contrast images are generated after first administering iodine to a subject whose body is being imaged.
6. The method of claim 5 and also comprising administering said iodine to the subject whose body is being imaged.
7. The method according to claim 1 wherein a generative neural network is used to generate said simulations.
8. The method according to claim 7 wherein the generative neural network is trained on image pairs including a first image of a given body portion which is not enhanced with any contrast agent, and a second image of the given body portion which is with-contrast, wherein the with-contrast image is captured by capturing an image of the body portion after a contrast agent, e.g., dye, has been administered to the patient.
9. The method of claim 1 wherein the with-contrast image is captured by performing mammography on a patient to whom contrast dye has been administered.
10. A decision support system serving human experts examining medical images, the system comprising: a hardware processor generating simulations of with-contrast images; and a computer display receiving the simulations of with-contrast images and displaying the simulations to at least one human expert, thereby to provide decision support to the human expert.
11. The method of claim 1 wherein the images comprise ultrasound images.
12. The method of claim 1 wherein the images comprise CT images.
13. The method of claim 1 wherein the images comprise MRI images.
14. The method of claim 1 wherein the simulations of with-contrast images are generated computationally rather than by actually administering a contrast agent to a patient, and subsequently capturing an image of a body part of interest.
15. The method of claim 1 wherein the simulations of the with-contrast images are generated from without-contrast images.
16. The method of claim 1 wherein the simulations of the with-contrast images simulate images captured after administration of a first dosage D of a contrast agent, and wherein the simulations of the with-contrast images are generated from and/or trained on images captured after administration of a second dosage d < D of the contrast agent.
17. A system comprising at least one hardware processor configured to carry out the operations of any of the methods of claims 1-13.
18. A computer program product, comprising a non-transitory tangible computer readable medium having computer readable program code embodied therein, said computer readable program code adapted to be executed to implement a method for providing decision support to human experts examining medical images, the method comprising: using a hardware processor to generate simulations of with-contrast images; and using an image display system to display the simulations of with-contrast images.
19. The method of claim 1 wherein the images comprise tomography images.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202363514682P | 2023-07-20 | 2023-07-20 | |
| US63/514,682 | 2023-07-20 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025017545A1 | 2025-01-23 |
Family
ID=94281224
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IL2024/050537 (WO2025017545A1, pending) | Mammography image display system, method and computer program product configured to display a synthetic image which simulates a with-contrast image | 2023-07-20 | 2024-05-30 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025017545A1 (en) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20210241458A1 (en) * | 2017-10-09 | 2021-08-05 | The Board Of Trustees Of The Leland Stanford Junior University | Contrast Dose Reduction for Medical Imaging Using Deep Learning |
| WO2022129634A1 (en) * | 2020-12-18 | 2022-06-23 | Guerbet | Methods for training at least a prediction model for medical imaging, or for processing at least a pre-contrast image depicting a body part prior to an injection of contrast agent using said prediction model |
| US20220208355A1 (en) * | 2020-12-30 | 2022-06-30 | London Health Sciences Centre Research Inc. | Contrast-agent-free medical diagnostic imaging |
| US20220284584A1 (en) * | 2019-08-23 | 2022-09-08 | Oxford University Innovation Limited | Computerised tomography image processing |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 24842552; Country of ref document: EP; Kind code of ref document: A1 |