
WO2024241277A1 - System and method for determining the existence of a physiological abnormality in the body of a subject - Google Patents


Info

Publication number
WO2024241277A1
WO2024241277A1 (PCT/IB2024/055047)
Authority
WO
WIPO (PCT)
Prior art keywords
dataset
subject
features
spatial features
body region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/IB2024/055047
Other languages
English (en)
Inventor
Larisa ADAMYAN
Kirill EFIMOV
Najeeb AYOUB
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Thermomind Global Inc
Original Assignee
Thermomind Global Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Thermomind Global Inc filed Critical Thermomind Global Inc
Publication of WO2024241277A1
Anticipated expiration
Legal status: Pending


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/01 Measuring temperature of body parts; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
    • A61B 5/015 By temperature mapping of body part
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B 5/0082 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
    • A61B 5/0091 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for mammography
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/43 Detecting, measuring or recording for evaluating the reproductive systems
    • A61B 5/4306 Detecting, measuring or recording for evaluating the reproductive systems for evaluating the female reproductive systems, e.g. gynaecological evaluations
    • A61B 5/4312 Breast evaluation or disorder diagnosis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6887 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235 Details of waveform analysis
    • A61B 5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B 5/7267 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device

Definitions

  • the present invention relates to the field of systems and methods for thermographic screenings of a body of a subject, and more particularly, to systems and methods for determining an existence of a thermally detectable physiological abnormality in the body of the subject.
  • breast cancer screening remains a cornerstone of oncological healthcare due to its pivotal role in early detection and intervention, significantly influencing treatment outcomes and survival rates.
  • traditional screening methods, such as mammography, ultrasound, and MRI, have undeniably contributed to reducing mortality rates, with chances of a cure near 90% when the disease is detected and controlled in its early stages.
  • Mammography, although recognized as an effective screening tool for women over 40, demonstrates diminished sensitivity in detecting tumors within dense breast tissue, with rates falling to 47%, compared to 70-90% in the general population.
  • the search for novel, non-invasive breast cancer screening techniques is driven by the necessity to overcome the limitations of traditional modalities. This includes improving diagnostic accuracy, particularly in populations with dense breast tissue, and mitigating the health risks and accessibility issues associated with current methods, thereby enhancing the early detection and treatment of breast cancer.
  • In response to these challenges, the field of medical imaging is witnessing revolutionary advancements, particularly through the integration of cutting-edge technologies and artificial intelligence.
  • medical thermography emerges as a promising contender to address these limitations. It represents a non-invasive adjunctive physiologic imaging technology that utilizes high-resolution infrared cameras and advanced computing to generate thermograms, revealing critical temperature variations on a patient's skin surface. It is a non-contact screening method, which does not involve radiation exposure or invasive procedures, and is safe for the patient and the technician performing the screening. While mammography and ultrasound depend primarily on structural and anatomical variation of the tumor from the surrounding breast tissue, thermography detects pathophysiologic changes within the breast such as metabolic and vascular changes caused by the cancer.
  • Some embodiments of the present invention may provide a thermal system for breast cancer screening without radiation and without body contact, which may include: a Thermal Device (TD) including: at least one frontal thermal camera including at least one thermal 3D sensor; at least one near infrared (NIR) sensor; and at least one shutter for each sensor; at least two movable thermal cameras, each including depth and NIR sensors, positioned at a 45° angle from below and a 45° angle from the side; at least one rail; at least one fan; a designated software; thermal camera imaging systems constructed to use from 4D up to 8D models; wherein the sensors are either an integral part of the camera or a separate device connected to the camera; and wherein at least one camera is positioned on at least one rail of the TD, directed at the subject sitting on a chair at the center of the multiple viewing angles; and wherein the cameras rotate 360° horizontally and tilt up to 90° up and down; wherein the TD has a degree of freedom allowing adjustment to the subject's individual characteristics; and wherein the sensors capture synchronized thermal videos of the subject's chest area over the screening time.
  • the rail of the top left and right cameras rotates to the left or to the right in the horizontal plane, with respect to the direction of the optical axis, by an angle of HFOV/2 (half the horizontal field of view).
  • the cameras on the horizontal rail may be adapted to have the upper boundary of their field of view (FOV) on the neck of the screened subject.
  • the cameras on the horizontal rail move according to screened subject's body structure so that the head is always out of the frame.
  • the rail angle guarantees that in every position of the camera the top boundary of the FOV is always on the neck level and never higher.
  • the same mechanism applies to the lower cameras when the cameras need to be rotated with respect to the optical axis, in their case by an angle of 45° + VFOV/2.
  • rotating the top left/right cameras by the horizontal angle HFOV/2 achieves the same effect for the chair, which always stays on the left/right border of the frame, as for the head, which always stays on the top border of the frame.
  • the intrinsic parameters and lens artefacts are determined using a chessboard, a circle board, or any other calibrated target specifically tailored for thermal camera applications, being designed to stand out from the background in thermal imagery due to its emissivity and temperature, and being maintained constantly for each individual lens.
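  • As an illustrative sketch only (the patent does not prescribe a particular toolkit), the following Python/OpenCV snippet shows how intrinsic parameters and lens-distortion coefficients can be estimated from views of a chessboard target and kept per individual lens; the board size and image folder are hypothetical.

```python
# Minimal intrinsic-calibration sketch with a chessboard target (assumptions:
# OpenCV workflow, a 9x6 inner-corner board, images in "calibration_frames/").
import glob
import cv2
import numpy as np

BOARD = (9, 6)  # inner corners per row/column (assumed)
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in glob.glob("calibration_frames/*.png"):   # hypothetical folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (5, 5), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

# Intrinsic matrix K and distortion coefficients, stored per individual lens.
ret, K, dist, _, _ = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
undistorted = cv2.undistort(gray, K, dist)  # un-distortion of a sample frame
```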
  • the thermal device is calibrated before every screening to obtain optimal thermal acquisition.
  • the thermal device is operated by the Vision One (VO) system, a software platform having a main central window showing the enlarged selected angle view and at least 5 small windows on the side, showing the screening process from each sensor at different angle views of the chest.
  • the shutter is a mechanical device covering the camera's sensor controlling the duration of exposure to incoming radiation and helping to produce accurate temperature readings.
  • the thermal device includes a 4D thermal camera system constructed to use thermal and 3D data.
  • the thermal device includes a 5D thermal camera system, including: a) a long-wave infrared (LWIR) sensor configured to detect and capture long-wave infrared radiation emitted by objects within a scene (thermal sensor); b) a 3D depth sensor configured to measure the distance between the camera and objects within the scene based on the time it takes for emitted light pulses to return to the sensor after reflecting off said objects; c) a near-infrared (NIR) sensor configured to detect and capture near-infrared radiation reflected by objects within the scene; d) a video streaming module configured to process and transmit captured data from the LWIR, 3D, and NIR sensors in real-time, enabling live video streaming; e) an integration module configured to enable communication and data exchange between the 5D thermal camera and other external sensor systems as part of a multiple sensor system; wherein the combination of LWIR, 3D, and NIR sensors provides a comprehensive and dynamic five-dimensional representation of the scene.
  • the 5D imaging system includes an integrated machine learning module, wherein said module utilizes artificial intelligence algorithms for real-time object detection, classification, and tracking, enhancing the system's ability to recognize and analyze complex scenes, and facilitating various computer vision applications such as autonomous navigation, security and surveillance, and advanced human-computer interaction.
  • the 5D imaging system includes an adaptive fusion mechanism, wherein said mechanism dynamically adjusts the weights assigned to each imaging modality based on scene characteristics, ambient conditions, or specific application requirements, optimizing the output video stream for improved clarity, accuracy, and contextual information.
  • the 5D imaging system includes a modular design, allowing for the interchangeability and upgradability of individual sensor components or the addition of new imaging modalities, thereby enabling customization and adaptation of the system to meet specific use-case demands or to accommodate advances in sensor technology.
  • the processing unit of the 5D imaging system is configured to perform real-time image enhancement techniques, such as noise reduction, contrast stretching, and edge sharpening, on the LIR, NIR, and 3D sensor data prior to alignment and merging, thereby improving the quality and reliability of the resulting 5D video stream.
  • the 5D imaging system is utilized for non-invasive medical diagnostics, allowing healthcare professionals to visualize and analyze surface temperature variations and blood perfusion in the human body, aiding in the early detection of inflammation, infection, or other abnormalities.
  • the 5D imaging system is utilized for monitoring treatment response in cancer patients by not only providing comprehensive and non-invasive assessment of tumor characteristics, such as size, shape, and vascularization, but also evaluating various physiological parameters and biomarkers, including body temperature, blood perfusion, inflammation rates and metabolic activity by integrating data from the LIR, NIR, and 3D sensors with additional diagnostic information.
  • the 8D imaging system includes an RGB camera that includes a LIR sensor for capturing long-range infrared images, a NIR sensor for capturing near-infrared images, and a 3D sensor for obtaining depth information.
  • the RGB sensor captures visible-light images, and a processing unit is configured to process and align the data acquired from the LIR, NIR, 3D, and RGB sensors.
  • the 8D imaging system includes a communication interface transmitting the 8D video stream generated from the aligned and merged data of the LIR, NIR, 3D, and RGB sensors.
  • the 8D imaging system allows enhanced scene understanding and analysis by combining the complementary imaging modalities into a single, coherent, and integrated video stream.
  • the 8D imaging system is utilized for installation on unmanned aerial vehicles (UAVs), allowing for enhanced remote monitoring and data collection in a variety of environments and applications.
  • the UAV-mounted 8D imaging system is employed for environmental monitoring, wildlife observation, search and rescue operations, and infrastructure inspection, providing comprehensive and multi-dimensional data that combines LIR, NIR, RGB and 3D information to enable more informed decision-making, improved outcomes in each respective field, and improved visibility under varying conditions.
  • the thermal device includes an 8D thermal camera system constructed to use the dimensions of each type of data: 3 from color (RGB), 3 from the 3D structure, 1 from heat or LIR, and 1 from NIR.
  • the fusion of multiple imaging modalities of any one of claims 14 to 26 allows the system to perform well under different lighting conditions and in challenging environments, such as low-light, fog, or smoke, by leveraging the strengths of each imaging type.
  • Some embodiments of the present invention may provide a method for breast cancer screening by the TD without radiation and without body contact, the method may include the steps of: a. turning on the TD and waiting a few minutes for stabilization of the cameras' temperature and uniformity of the image; b. performing a non-uniformity correction (NUC) in the thermal cameras; c. inserting patient information (PI) into the PI station; d. initializing the TD for screening; e.
  • calibrating the TD assisted by the software, including: positioning each thermal sensor at its optimal distance from the screened subject and rotating the lens for optimal focus adjusted to the characteristics of each subject; f. displaying by the software a score for each sensor, ranging from 0-100%, assessing the current position and suggesting moving the cameras closer to or farther from the subject, wherein the score is based on the size of the region of interest (ROI) in the frame, with the aim of displaying the full ROI and minimizing the area outside it; g. displaying by the software a score for each sensor, ranging from 0-100%, assessing the current lens position and suggesting rotation of the lens for optimal focus; h. screening with the TD; when the calibration is complete and all cameras are recognized by the software to be in the optimal position and focus, the screening starts; i.
  • the calibration steps may be operated automatically by motors.
  • the shutter is closed twice, once to perform NUC and the second time, to ensure that the shutter is functioning by checking the noise distribution after NUC.
  • all images are registered, a process of aligning or matching images that establishes a relationship between two images of the same scene, wherein the first image in the sequence is the reference image and the remaining images are transformable images.
  • the process of image registration aligns or matches images of different modalities, including but not limited to LIR, NIR, and 3D images.
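  • A minimal sketch of such sequence registration, assuming OpenCV's ECC algorithm with a Euclidean motion model (the patent does not prescribe a particular registration method): each later frame in a sensor's sequence is warped onto the first, reference frame.

```python
# Registration sketch: align each frame of a (thermal or NIR) sequence to the
# reference frame via ECC. Motion model and parameters are assumptions.
import cv2
import numpy as np

def register_to_reference(reference: np.ndarray, moving: np.ndarray) -> np.ndarray:
    """Warp `moving` onto `reference`; both are single-channel float32 images."""
    warp = np.eye(2, 3, dtype=np.float32)                     # initial guess
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 200, 1e-6)
    _, warp = cv2.findTransformECC(reference, moving, warp,
                                   cv2.MOTION_EUCLIDEAN, criteria, None, 5)
    h, w = reference.shape
    return cv2.warpAffine(moving, warp, (w, h),
                          flags=cv2.INTER_LINEAR + cv2.WARP_INVERSE_MAP)

# Usage: frames[0] is the reference image, the rest are transformable images.
# aligned = [frames[0]] + [register_to_reference(frames[0], f) for f in frames[1:]]
```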
  • the step of preparation for the screening includes 3 phases in a row: an acclimatization phase, in which the sensors capture the body settling and reaching a steady state with the room temperature; a cooling phase, in which the fans mounted on the TD automatically switch on, blowing air toward the chest area and cooling down the chest temperature by several degrees Celsius, thus creating contrast to highlight some of the blood vessels; and a stabilization phase, in which the fans automatically switch off and the sensors capture the recovery of body temperature from the cool state to the normal state, for further analysis of the recovery rates of different body areas and tissue types.
  • the method for compressing and encoding the 5D video stream generated by the imaging system, employs specialized algorithms and data structures designed to exploit redundancies and correlations within and between the different imaging modalities, resulting in a compressed video stream that retains essential information while reducing storage and transmission bandwidth requirements.
  • Some embodiments of the present invention may provide a system for determining an existence or absence of a physiological abnormality in a subject’s body region, which may include: one or more imaging devices configured to generate data representing the subject’s body region over a screening time, each of the one or more imaging devices includes an infrared sensor and a distance sensor; and a computing device configured to: based on the data, generate a digital model of the subject’s body region, the digital model including a plurality of time-sequential datasets representing a surface and spectral characteristics of the subject’s body region over the screening time; based on at least a portion of the digital model, determine at least one of spatial characteristics and temporal characteristics indicative of a presence or absence of data values in the digital model representing the physiological abnormality; and based on at least one of the spatial characteristics and temporal characteristics, determine at least one of a probability of and a biomarker indicative of the existence or absence of the physiological abnormality in the subject’s body region.
  • the computing device in order to determine the probability, is configured to: based on the plurality of time-sequential datasets, determine a plurality of spatial features vectors, each including a plurality of spatial features indicative of a presence or absence of data values representing the physiological abnormality in a corresponding dataset of the plurality of time-sequential datasets; based on the plurality of spatial features vectors, determine a temporal features vector including a plurality of temporal features indicative of a temporal dependency of the spatial features of the plurality of spatial features vectors over the screening time; and based on the temporal features vector, determine the probability.
  • the computing device in order to determine a spatial features vector of the plurality of spatial features vectors, is configured to provide a corresponding dataset of the plurality of time-sequential datasets as an input to a first machine learning model.
  • the spatial features vector is a one-dimensional embedded features vector generated by the first machine learning model through processing of the corresponding dataset.
  • the computing device in order to determine the temporal features vector, is configured to provide the plurality of spatial features vectors as an input to a second machine learning model.
  • the temporal features vector is a hidden state vector generated by the second machine learning model at a final timestep.
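  • The following PyTorch sketch illustrates the two-stage idea described above under assumed layer sizes: a first (CNN) model embeds each per-timestep dataset into a one-dimensional spatial features vector, a second (recurrent) model consumes the sequence of those vectors, and its hidden state at the final timestep is mapped to an abnormality probability. The CNN/LSTM choice and dimensions are illustrative assumptions, not the patent's specified architecture.

```python
# Two-stage probability sketch: spatial encoder per timestep + temporal LSTM.
import torch
import torch.nn as nn

class SpatialEncoder(nn.Module):
    """First model: per-timestep dataset -> 1-D embedded spatial features vector."""
    def __init__(self, in_channels: int = 2, embed_dim: int = 128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, embed_dim))

    def forward(self, x):            # x: (B, C, H, W)
        return self.features(x)      # (B, embed_dim)

class TemporalHead(nn.Module):
    """Second model: sequence of spatial vectors -> probability."""
    def __init__(self, embed_dim: int = 128, hidden: int = 64):
        super().__init__()
        self.rnn = nn.LSTM(embed_dim, hidden, batch_first=True)
        self.classifier = nn.Linear(hidden, 1)

    def forward(self, seq):                    # seq: (B, T, embed_dim)
        _, (h_n, _) = self.rnn(seq)            # hidden state at final timestep
        return torch.sigmoid(self.classifier(h_n[-1]))   # (B, 1)

# Example: 20 timesteps of 2-channel (thermal + depth) 128x128 datasets.
encoder, head = SpatialEncoder(), TemporalHead()
frames = torch.randn(1, 20, 2, 128, 128)
spatial = torch.stack([encoder(frames[:, t]) for t in range(frames.shape[1])], dim=1)
probability = head(spatial)        # probability of the physiological abnormality
```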
  • the computing device is configured to detect data values representing the physiological abnormalities in the digital model.
  • the computing device in order to detect the data values representing the physiological abnormalities in the digital model, is configured to provide a dataset of the plurality of time-sequential datasets as an input to a third machine learning model.
  • the computing device is configured to: combine an embedded features map generated by the third machine learning model through the processing of the dataset with the temporal features vector into a combined embedded features map; and configure the third machine learning model to continue processing of the combined embedded features map to detect the data values representing the physiological abnormality in the dataset.
  • the computing device is configured to: based on the subject's personal information, calculate a personal information embedded vector; and combine the personal information embedded vector into the combined embedded features map prior to configuring the third machine learning model to continue processing of the combined embedded features map.
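  • A minimal sketch of this fusion step, with illustrative shapes: the temporal features vector and a personal-information embedded vector are broadcast spatially and concatenated with the detection model's embedded features map, and the model's remaining layers then continue from the combined map. All sizes and the decoder layers are assumptions.

```python
# Feature-fusion sketch: combine an intermediate features map with the
# temporal and personal-information vectors before further processing.
import torch
import torch.nn as nn

feat_map = torch.randn(1, 32, 16, 16)    # embedded features map (third model)
temporal_vec = torch.randn(1, 64)        # temporal features vector
personal_vec = torch.randn(1, 8)         # personal information embedded vector

def broadcast(vec, h, w):
    """Tile a (B, C) vector over an (h, w) spatial grid -> (B, C, h, w)."""
    return vec[:, :, None, None].expand(-1, -1, h, w)

combined = torch.cat(
    [feat_map, broadcast(temporal_vec, 16, 16), broadcast(personal_vec, 16, 16)],
    dim=1)                                           # (1, 104, 16, 16)

# Remaining layers of the detection model continue from the combined map,
# e.g. producing a per-pixel abnormality mask (illustrative decoder).
decoder = nn.Sequential(
    nn.Conv2d(104, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 1, 1), nn.Sigmoid())
abnormality_mask = decoder(combined)                 # (1, 1, 16, 16)
```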
  • the computing device, in order to determine the biomarker, is configured to: select a dataset of the plurality of time-sequential datasets; based on the dataset, detect a first sub dataset representing a first subregion and a second sub dataset representing a second subregion of the subject's body region; based on the first sub dataset and the second sub dataset, calculate a first set of spatial features and a second set of spatial features, respectively; and based on the first set of spatial features and the second set of spatial features, calculate the biomarker indicative of the existence or absence of the physiological abnormality in the subject's body region.
  • each of the first set of spatial features and the second set of spatial features includes at least one of an embedded features vector and a computer vision features vector calculated based on the respective first sub dataset or the second sub dataset.
  • the computing device in order to calculate the embedded features vector of each of the first set of spatial features and the second set of spatial features, is configured to provide the respective first sub dataset or the second sub dataset as an input to a machine learning model.
  • the embedded features vector of each of the first set of spatial features and the second set of spatial features is a one-dimensional embedded features vector generated by the machine learning model through processing of the respective first sub dataset or the second sub dataset.
  • the computer vision features vector of each of the first set of spatial features and the second set of spatial features includes at least one of mean, variance, skewness, contrast, homogeneity and entropy of data values of the respective first sub dataset or the second sub dataset.
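  • The listed computer vision features can be computed directly from a sub dataset; the sketch below shows one way to obtain mean, variance, skewness, entropy, contrast and homogeneity for a single thermal crop. The 8-bit quantisation, histogram binning and GLCM configuration are illustrative assumptions.

```python
# Computer-vision features sketch for one sub dataset (2-D temperature crop).
import numpy as np
from scipy.stats import skew, entropy
from skimage.feature import graycomatrix, graycoprops

def cv_features(sub: np.ndarray) -> dict:
    values = sub.ravel()
    # First-order statistics on the raw temperature values.
    hist, _ = np.histogram(values, bins=64, density=True)
    feats = {
        "mean": float(values.mean()),
        "variance": float(values.var()),
        "skewness": float(skew(values)),
        "entropy": float(entropy(hist + 1e-12)),
    }
    # Texture features from a grey-level co-occurrence matrix (GLCM).
    quantised = (255 * (sub - sub.min()) / (np.ptp(sub) + 1e-12)).astype(np.uint8)
    glcm = graycomatrix(quantised, distances=[1], angles=[0],
                        levels=256, symmetric=True, normed=True)
    feats["contrast"] = float(graycoprops(glcm, "contrast")[0, 0])
    feats["homogeneity"] = float(graycoprops(glcm, "homogeneity")[0, 0])
    return feats
```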
  • the computing device, in order to calculate the biomarker, is configured to provide the first set of spatial features and the second set of spatial features as an input to a machine learning model.
  • the biomarker is a thermal asymmetry index (TAI) biomarker indicative of a thermal asymmetry between the first subregion and the second subregion of the subject's body region, and wherein in order to calculate the TAI biomarker, the computing device is configured to: based on the dataset, calculate a mirrored dataset, the mirrored dataset being a mirrored representation of the dataset; register the dataset and the mirrored dataset with respect to each other; based on the dataset and the mirrored dataset, calculate a difference dataset representing a difference between the dataset and the mirrored dataset; and based on the difference dataset, detect the first sub dataset and the second sub dataset; wherein the dataset corresponds to an end of an acclimatization phase and before a cooling phase of a screening procedure.
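  • A minimal sketch of the mirrored-difference construction behind the TAI biomarker, assuming an ECC translation registration and a crude mean-absolute-difference summary; in the described pipeline the left/right sub datasets are detected from the difference map and their spatial features feed a model, so this summary is an illustrative simplification.

```python
# Mirrored-difference sketch for the TAI biomarker (acclimatization-end frame).
import cv2
import numpy as np

def thermal_asymmetry_sketch(frame: np.ndarray) -> float:
    """frame: 2-D float32 thermal image at the end of the acclimatization phase."""
    mirrored = np.fliplr(frame).copy()                    # mirrored dataset
    # Register the mirrored image onto the original (translation model, assumed).
    warp = np.eye(2, 3, dtype=np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 100, 1e-5)
    _, warp = cv2.findTransformECC(frame, mirrored, warp,
                                   cv2.MOTION_TRANSLATION, criteria, None, 5)
    h, w = frame.shape
    mirrored = cv2.warpAffine(mirrored, warp, (w, h),
                              flags=cv2.INTER_LINEAR + cv2.WARP_INVERSE_MAP)
    difference = np.abs(frame - mirrored)                 # difference dataset
    # Crude summary: mean absolute left-right temperature difference.
    return float(difference.mean())
```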
  • the biomarker is a thermal entropy score (TES) biomarker indicative of irregular thermal patterns and metabolic activity in the subject's body region.
  • the computing device, in order to calculate the TES biomarker, is configured to: based on the first sub dataset and the second sub dataset, detect first data values representing hotspots in the first subregion and second data values representing hotspots in the second subregion of the subject's body region, respectively; based on the first data values and the second data values, calculate a first hotspots features vector and a second hotspots features vector, respectively; and include the first hotspots features vector and the second hotspots features vector in the first set of spatial features and the second set of spatial features, respectively; wherein the dataset corresponds to an end of an acclimatization phase and before a cooling phase of a screening procedure.
  • each of the first hotspot features vector and the second hotspot features vector includes at least one of a number of hotspots, a mean size of hotspots, a mean measure of deviation of hotspots’ shape from their best- fit ellipse shape and a mean temperature difference between the hotspots and the hotspots’ surrounding in the respective first subregion or the second subregion.
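  • The hotspot features above can be sketched as follows, taking hotspots as connected regions above a percentile threshold (an assumption; the patent does not fix the detection rule) and reporting the count, mean size, mean deviation from a best-fit ellipse, and mean temperature contrast to the surroundings.

```python
# Hotspot-features sketch for one subregion of the thermal frame.
import numpy as np
from skimage.measure import label, regionprops

def hotspot_features(sub: np.ndarray, percentile: float = 95.0) -> dict:
    """sub: 2-D array of temperatures for one breast subregion."""
    mask = sub > np.percentile(sub, percentile)           # assumed hotspot rule
    regions = regionprops(label(mask), intensity_image=sub)
    if not regions:
        return {"count": 0, "mean_size": 0.0,
                "mean_ellipse_deviation": 0.0, "mean_delta_t": 0.0}
    background = sub[~mask].mean()                        # hotspot surroundings
    sizes, deviations, deltas = [], [], []
    for r in regions:
        sizes.append(r.area)
        ellipse_area = np.pi * (r.major_axis_length / 2) * (r.minor_axis_length / 2)
        deviations.append(abs(r.area - ellipse_area) / max(ellipse_area, 1e-6))
        deltas.append(r.mean_intensity - background)
    return {"count": len(regions),
            "mean_size": float(np.mean(sizes)),
            "mean_ellipse_deviation": float(np.mean(deviations)),
            "mean_delta_t": float(np.mean(deltas))}
```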
  • the biomarker is a dynamic anomaly score (DAS) biomarker indicative of a physiological response of the subject's body region to an induced thermal stress.
  • the computing device is configured to: select a second dataset of the plurality of time-sequential datasets; based on the dataset and the second dataset, calculate a difference dataset, the difference dataset representing a difference between the dataset and the second dataset; and based on the difference dataset, detect the first sub dataset and the second sub dataset; wherein the dataset corresponds to an end of an acclimatization phase and before a cooling phase of a screening procedure, and the second dataset corresponds to an end of a post-thermal-stress stabilization phase of the screening procedure.
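  • A minimal sketch of the dynamic difference map underlying the DAS biomarker, comparing the acclimatization-end frame with the post-stress stabilization-end frame; the simple left/right split and the summary statistics are illustrative assumptions.

```python
# Dynamic-difference sketch for the DAS biomarker.
import numpy as np

def dynamic_difference(acclim_end: np.ndarray, stabil_end: np.ndarray) -> dict:
    """Both inputs: registered 2-D thermal frames of the chest region."""
    difference = stabil_end - acclim_end                  # difference dataset
    h, w = difference.shape
    left, right = difference[:, : w // 2], difference[:, w // 2:]  # sub datasets
    return {
        "mean_recovery_left": float(left.mean()),
        "mean_recovery_right": float(right.mean()),
        "recovery_asymmetry": float(abs(left.mean() - right.mean())),
    }
```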
  • the biomarker is a vascular activity indicator (VAI) biomarker indicative of irregular vessels network patterns in the subject's body region.
  • the computing device in order to calculate the VAI biomarker, is configured to: based on the dataset, calculate a vascular map dataset representing vessels in the subject’s body region; and based on the vascular map dataset: detect the first sub dataset and the second sub dataset; and calculate the first set of spatial features and the second set of spatial features; wherein the dataset corresponds to an end of an acclimatization phase and before a cooling phase of a screening procedure.
  • each of the first set of spatial features and the second set of spatial features includes at least one of a total length, a perimeter, a number of intersection points, a fractal dimension and a median tortuosity of vessels in the respective first subregion or the second subregion.
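  • The vessel-network features above can be approximated from a binary vascular map as sketched below; the segmentation itself, the branch-point rule and the box-counting fractal estimate are assumptions, and median tortuosity is omitted for brevity.

```python
# Vessel-network features sketch from a binary vascular map of one subregion.
import numpy as np
from scipy import ndimage
from skimage.morphology import skeletonize

def vessel_features(vessel_mask: np.ndarray) -> dict:
    """vessel_mask: 2-D boolean vascular map for one subregion."""
    skeleton = skeletonize(vessel_mask)
    total_length = int(skeleton.sum())                    # centreline pixels
    # Intersection (branch) points: skeleton pixels with >2 skeleton neighbours.
    neighbours = ndimage.convolve(skeleton.astype(int), np.ones((3, 3)),
                                  mode="constant") - skeleton
    intersections = int(np.logical_and(skeleton, neighbours > 2).sum())

    # Box-counting estimate of the fractal dimension of the skeleton.
    sizes, counts = [], []
    n = min(skeleton.shape)
    for box in [2, 4, 8, 16, 32]:
        if box >= n:
            break
        trimmed = skeleton[: n - n % box, : n - n % box]
        blocks = trimmed.reshape(trimmed.shape[0] // box, box,
                                 trimmed.shape[1] // box, box)
        counts.append(max(int(blocks.any(axis=(1, 3)).sum()), 1))
        sizes.append(box)
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return {"total_length": total_length,
            "intersections": intersections,
            "fractal_dimension": float(slope)}
```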
  • the computing device is configured to: based on the first set of spatial features, calculate a first biomarker indicative of the existence or absence of the physiological abnormality in the first subregion of the subject’s body region; and based on the second set of spatial features, calculate a second biomarker indicative of the existence or absence of the physiological abnormality in the second subregion of the subject’s body region.
  • the one or more imaging devices includes a plurality of imaging devices configured to generate the data representing the subject’s body region from a plurality of points of view.
  • the data includes a plurality of time-sequential sensor datasets.
  • the computing device is configured to: register the plurality of time-sequential sensor datasets in a reference coordinate system; based on the registration, combine the plurality of time-sequential sensor datasets into a three-dimensional digital model, the three-dimensional digital model including a plurality of time-sequential datasets each including a plurality of data values representing in three dimensions in the reference coordinate system the surface and the thermal characteristics of the subject’s body; and modify the data values of the plurality of time-sequential datasets of the three-dimensional digital model to represent the surface and the thermal characteristics of the subject’s body in two dimensions in the reference coordinate system.
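  • A minimal sketch of this fusion, assuming pinhole intrinsics and known camera-to-reference extrinsics (neither is specified here): per-camera depth frames with co-registered temperatures are back-projected to 3D points, merged in the reference coordinate system, and flattened to a 2D temperature map by orthographic projection.

```python
# 3D-model fusion and 2D flattening sketch (pinhole model, assumed calibration).
import numpy as np

def depth_to_points(depth, temp, fx, fy, cx, cy):
    """Back-project a depth map (metres) and per-pixel temperatures to 3D points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    points = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return points, temp.reshape(-1)

def to_reference(points, R, t):
    """Rigid camera-to-reference transform (R: 3x3 rotation, t: 3-vector)."""
    return points @ R.T + t

def project_to_2d(points, temps, grid=(256, 256)):
    """Orthographic projection onto the reference XY plane, keeping the
    temperature of the nearest (smallest depth) point per grid cell."""
    xy = points[:, :2]
    lo, hi = xy.min(axis=0), xy.max(axis=0)
    idx = ((xy - lo) / (hi - lo + 1e-9) * (np.array(grid) - 1)).astype(int)
    depth_map = np.full(grid, np.inf)
    temp_map = np.full(grid, np.nan)
    for (i, j), z, T in zip(idx, points[:, 2], temps):
        if z < depth_map[i, j]:
            depth_map[i, j], temp_map[i, j] = z, T
    return temp_map

# Merge all cameras' points using their (assumed) extrinsics, then flatten:
# all_pts = np.vstack([to_reference(p_k, R_k, t_k) for each camera k])
# temp_2d = project_to_2d(all_pts, all_temps)
```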
  • the physiological abnormality is breast cancer and the subject’s body region is subject’s breast.
  • Some embodiments of the present invention may provide a method of determining an existence or absence of a physiological abnormality in a subject's body region, the method may include, using a computing device operating a processor: receiving a plurality of time-sequential datasets, each including a plurality of data values representing a surface and spectral characteristics of the subject's body region at a certain timestep in a screening time; based on the plurality of time-sequential datasets, calculating a plurality of spatial features vectors, each including a plurality of spatial features indicative of a presence or absence of data values representing the physiological abnormality in a corresponding dataset of the plurality of time-sequential datasets; based on the plurality of spatial features vectors, calculating a temporal features vector including a plurality of temporal features indicative of a temporal dependency of the spatial features of the plurality of spatial features vectors over the screening time; and based on the temporal features vector, calculating a probability of the existence or absence of the physiological abnormality in the subject's body region.
  • the method, in order to calculate a spatial features vector of the plurality of spatial features vectors, includes providing a corresponding dataset of the plurality of time-sequential datasets as an input to a first machine learning model.
  • the spatial features vector is a one-dimensional embedded features vector generated by the first machine learning model through processing of the corresponding dataset.
  • the method, in order to calculate the temporal features vector, includes providing the plurality of spatial features vectors as an input to a second machine learning model.
  • the temporal features vector is a hidden state vector generated by the second machine learning model at a final timestep.
  • the method includes detecting data values representing the physiological abnormalities in a dataset of the plurality of time-sequential datasets.
  • the method, in order to detect the data values representing the physiological abnormalities in the dataset, includes providing the dataset as an input to a third machine learning model.
  • the method, in order to detect the data values representing the physiological abnormalities in the dataset, includes: combining an embedded features map generated by the third machine learning model through the processing of the dataset with the temporal features vector into a combined embedded features map; and configuring the third machine learning model to continue processing of the combined embedded features map to detect the data values representing the physiological abnormality in the dataset.
  • the method, in order to detect the data values representing the physiological abnormalities in the dataset, includes: based on the subject's personal information, calculating a personal information embedded vector; and combining the personal information embedded vector into the combined embedded features map prior to configuring the third machine learning model to continue processing of the combined embedded features map.
  • the physiological abnormality is breast cancer and the subject’s body region is subject’s breast.
  • Some embodiments of the present invention may provide a method of calculating a biomarker indicative of an existence or absence of a physiological abnormality in a subject's body region, the method may include, using a computing device operating a processor: receiving a plurality of time-sequential datasets, each including a plurality of data values representing a surface and spectral characteristics of the subject's body region at a certain timestep in a screening time; selecting a dataset of the plurality of time-sequential datasets; based on the dataset, detecting: a first sub dataset including data values of the plurality of data values of the dataset representing a first subregion of the subject's body region; and a second sub dataset including data values of the plurality of data values of the dataset representing a second subregion of the subject's body region; based on the first sub dataset, calculating a first set of spatial features of the first subregion of the subject's body region; based on the second sub dataset, calculating a second set of spatial features of the second subregion of the subject's body region; and based on the first set of spatial features and the second set of spatial features, calculating the biomarker indicative of the existence or absence of the physiological abnormality in the subject's body region.
  • the method in order to calculate the embedded features vector of each of the first set of spatial features and the second set of spatial features, includes providing the respective first sub dataset or the second sub dataset as an input to a machine learning model.
  • the embedded features vector of each of the first set of spatial features and the second set of spatial features is a one-dimensional embedded features vector generated by the machine learning model through processing of the respective first sub dataset or the second sub dataset.
  • the computer vision features vector of each of the first set of spatial features and the second set of spatial features includes at least one of mean, variance, skewness, contrast, homogeneity and entropy of data values of the respective first sub dataset or the second sub dataset.
  • the method, in order to calculate the biomarker, includes providing the first set of spatial features and the second set of spatial features as an input to a machine learning model.
  • the method includes: based on the first set of spatial features, calculating a first biomarker indicative of the existence or absence of the physiological abnormality in the first subregion of the subject’s body region; and based on the second set of spatial features, calculating a second biomarker indicative of the existence or absence of the physiological abnormality in the second subregion of the subject’s body region.
  • the biomarker is a thermal asymmetry index (TAI) biomarker indicative of a thermal asymmetry between the first subregion and the second subregion of the subject's body region.
  • the method includes: based on the dataset, calculating a mirrored dataset, the mirrored dataset being a mirrored representation of the dataset; registering the dataset and the mirrored dataset with respect to each other; based on the dataset and the mirrored dataset, calculating a difference dataset, the difference dataset including a plurality of data values representing a difference between the dataset and the mirrored dataset; and based on the difference dataset, detecting the first sub dataset and the second sub dataset; wherein the dataset corresponds to an end of acclimatization phase and before a cooling phase of a screening procedure.
  • the biomarker is a thermal entropy score (TES) biomarker indicative of irregular thermal patterns and metabolic activity in the subject's body region.
  • the method includes: based on the first sub dataset, detecting first data values representing hotspots in the first subregion of the subject’s body; based on the second sub dataset, detecting second data values representing hotspots in the second subregion of the subject’s body; based on the first data values, calculating a first hotspots features vector; based on the second data values, calculating a second hotspots features vector; and including the first hotspots features vector and second hotspots features vector in the first set of spatial features and the second set of spatial features, respectively; wherein the dataset corresponds to an end of acclimatization phase and before a cooling phase of a screening procedure.
  • each of the first hotspot features vector and the second hotspot features vector includes at least one of a number of hotspots, a mean size of hotspots, a mean measure of deviation of hotspots’ shape from their best- fit ellipse shape and a mean temperature difference between the hotspots and the hotspots’ surrounding in the respective first subregion or the second subregion.
  • the biomarker is a dynamic anomaly score (DAS) biomarker indicative of a physiological response of the subject's body region to an induced thermal stress.
  • the method includes: selecting a second dataset of the plurality of time-sequential datasets; based on the dataset and the second dataset, calculating a difference dataset, the difference dataset including a plurality of data values representing a difference between the dataset and the second dataset; and based on the difference dataset, detecting the first sub dataset and the second sub dataset; wherein the dataset corresponds to an end of an acclimatization phase and before a cooling phase of a screening procedure, and the second dataset corresponds to an end of a post-thermal-stress stabilization phase of the screening procedure.
  • the biomarker is a vascular activity indicator (VAI) biomarker indicative of irregular vessels network patterns in the subject's body region.
  • the method includes: based on the dataset, calculating a vascular map dataset representing vessels in the subject’s body region; and based on the vascular map dataset: detecting the first sub dataset and the second sub dataset; and calculating the first set of spatial features and the second set of spatial features; wherein the dataset corresponds to an end of an acclimatization phase and before a cooling phase of a screening procedure.
  • each of the first set of spatial features and the second set of spatial features includes at least one of a total length, a perimeter, a number of intersection points, a fractal dimension and a median tortuosity of vessels in the respective first subregion or the second subregion.
  • the physiological abnormality is breast cancer and the subject’s body region is subject’s breast.
  • FIG. 1 is an illustration of a screening station with a chair, where the screened subject sits, according to some embodiments of the invention.
  • Fig. 2 is a block diagram of a screening thermal device, according to some embodiments of the invention.
  • Fig. 3 is an illustration of the thermal device, according to some embodiments of the invention.
  • Fig. 3A is a closeup of a thermal camera including a sensor, according to some embodiments of the invention.
  • Fig. 3B is an exploded-view of a 3D camera-sensor, according to some embodiments of the invention.
  • FIG. 4 shows schematic illustrations of distributions of cancer locations in breast regions.
  • FIG. 5 shows schematic illustrations of a breast with ptosis.
  • Fig. 6 shows schematic illustrations of image quality for different areas of the breast and for different camera angles, according to some embodiments of the invention.
  • Fig. 7 shows schematic illustrations of an un-distortion correction, according to some embodiments of the invention.
  • FIG. 8 is a diagram of a subject information station and screening system initialization flow, according to some embodiments of the invention.
  • FIG. 9 is a flowchart of a thermal screening general flow, according to some embodiments of the invention.
  • FIG. 10 shows the software platform interface for the operation and screening of the TD using the TM VO system, according to some embodiments of the invention.
  • Fig. 11 shows sensor positioning, according to some embodiments of the invention.
  • Fig. 11A shows a sensor focus by rotating the lens, according to some embodiments of the invention.
  • Fig. 12 shows an Artificial Intelligence (AI) model architecture, according to some embodiments of the invention.
  • Fig. 13B shows a camera far from subject, horizontal rail, head is visible, according to some embodiments of the invention.
  • Fig. 13C shows a camera close to subject, rail under angle, head is not visible, according to some embodiments of the invention.
  • FIG. 14 is a block diagram of a system for determining an existence or absence of a physiological abnormality in a subject’s body region, according to some embodiments of the invention.
  • FIG. 15A shows a computing device for generating a digital model representing the subject's body region based on data from imaging devices of the system, according to some embodiments of the invention.
  • Fig. 15B shows a block diagram of a 2D digital model of the subject's body region and illustrations of a dataset of the plurality of time-sequential datasets of the 2D digital model, according to some embodiments of the invention.
  • Fig. 16 shows a computing device for calculating a probability of the existence or absence of the physiological abnormality and/or detecting data values of the digital model representing the physiological abnormality in the subject's body region, according to some embodiments of the invention.
  • Fig. 17A shows a computing device for calculating a thermal asymmetry index (TAI) biomarker indicative of the existence or absence of the physiological abnormality in the subject's body region, according to some embodiments of the invention.
  • Fig. 17B is an illustration of difference dataset for calculation of TAI biomarker, wherein difference dataset represents the subject’s breast region, according to some embodiments of the invention.
  • Fig. 18 shows a computing device for calculating a thermal entropy score (TES) biomarker indicative of the existence or absence of the physiological abnormality in the subject's body region, according to some embodiments of the invention.
  • FIG. 19B is an illustration of vessels map dataset for calculation of VAI biomarker, wherein vessels map dataset represents the subject’s breast region, according to some embodiments of the invention.
  • Fig. 20B is an illustration of difference dataset for calculation of DAS biomarker, wherein difference dataset 814 represents the subject’s breast region, according to some embodiments of the invention.
  • Figs. 21A, 21B, 21C, 21D and 21E are illustrations of results of five validation cases of calculated TAI, TES, VAI and DAS biomarkers, according to some embodiments of the invention.
  • Fig. 22 is a flowchart of a method of generating the digital model representing the subject’s body region, according to some embodiments of the invention.
  • Fig. 23 is a flowchart of a method of determining an existence or absence of a physiological abnormality in a subject’s body region, according to some embodiments of the invention.
  • Fig. 24 is a flowchart of a method of calculating a biomarker indicative of an existence or absence of a physiological abnormality in a subject's body region, according to some embodiments of the invention.
  • Fig. 25 is a flowchart of a method of calculating the thermal asymmetry index (TAI) biomarker indicative of the existence or absence of the physiological abnormality in the subject’s body region, according to some embodiments of the invention.
  • Fig. 26 is a flowchart of a method of calculating the thermal entropy score (TES) biomarker indicative of the existence or absence of the physiological abnormality in the subject’s body region, according to some embodiments of the invention.
  • Fig. 27 is a flowchart of a method of calculating a vascular activity indicator (VAI) biomarker indicative of the existence or absence of the physiological abnormality in the subject’s body region, according to some embodiments of the invention.
  • Embodiments of the present invention may provide a thermal system for breast cancer screening without radiation and without body contact, which may serve the diagnosis of any disease that potentially affects metabolic and vascular functionality within the body (e.g., inflammation).
  • the thermal system and method, herein described, is designed, in some embodiments, for breast cancer screening without ionizing radiation and without body contact.
  • the main advantages of the system are:
  • a robust multi-modal imaging system is employed leveraging an array of cameras and sensors to capture a comprehensive suite of data types, such as Near-Infrared (NIR), thermal or Long Wave Infrared (LWIR), three-dimensional (3D), and RGB imagery, all from various perspectives.
  • This system is built around an assembly of five thermal cameras that primarily capture high-resolution LWIR data.
  • integrated 3D Time-of-Flight cameras contribute detailed 3D images and NIR data.
  • optional RGB cameras can be utilized, which are seamlessly integrated with the 3D cameras, providing invaluable information from the visual spectrum.
  • This comprehensive and integrated approach offers precise depth assessment of the breast and potential areas of concern while capturing thermal and NIR imaging data.
  • This setup, using multiple cameras covering the entire area from multiple angles, allows optimal thermal screening without repositioning the subject during the screening while offering a comprehensive view of the breast's thermal, 3D, and NIR characteristics.
  • the cameras move away from or toward the subject to cover a specific Region of Interest (ROI) based on the subject's characteristics. This allows high-resolution skin coverage; e.g., if the subject is slim, the cameras move forward, and if the subject is large, the cameras move backward to maintain an optimal distance.
  • the thermal system is designed to overcome some of the problems encountered in the prior art. For example, minimizing noise in the image; obtaining accurate 3D measurements and optimal coverage of the ROI from multiple angles during the entire dynamic screening, and ensuring a high resolution of the skin thermal image, meaning that the number of pixels per square cm of skin is high.
  • One of the methods for scanning the subject would be to move one or multiple cameras around the subject to record the skin temperature distribution from different angles. This approach is beneficial as it allows the use of a smaller number of cameras and enables the understanding of the 3D structure of the object from the collected multi-angle data.
  • the camera movement may be performed in two ways.
  • the first way is to move the camera continuously around the subject back and forth.
  • the main drawback of this method is the motion blur effect which reduces the quality of the images collected. For this reason, a setup with fixed cameras is more beneficial.
  • the second way is to overcome the above effect by moving the camera and making several stops around the subject. For example, instead of using 5 fixed cameras, it is possible to move one camera from one position to another and wait at each position for 1 second to collect unblurred data, and then move to the next stop. Assuming that it is possible to move the camera from one position to another in 1 second and spending 1 second at each of the five positions, it would require 9 seconds (5 sec + 4 sec) to move the camera from left to right. Then moving the camera back into its initial position requires another 4 seconds. So, one cycle would take 13 seconds.
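  • The timing above can be checked with a few lines of arithmetic:

```python
# Cycle-time check for the single moving camera with five stops.
stops, dwell_s, move_s = 5, 1, 1
one_way = stops * dwell_s + (stops - 1) * move_s   # 5 dwells + 4 moves = 9 s
cycle = one_way + (stops - 1) * move_s             # return without stops = 13 s
print(one_way, cycle)                              # 9 13
```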
  • the depth information that is provided by 3D structure measurements is used to define the borders of the breast, making it easier to differentiate more clearly between breast tissue and surrounding tissue. This can lead to more accurate segmentation results and a better understanding of the breast tissue overall.
  • the 3D structure aids in determining the depth of the tumor.
  • the curvature of the area where the spot is located is considered to play a crucial role in determining the depth of the tumor.
  • Emissivity of the skin is not uniform in all directions. With 3D information it is possible to determine the skin orientation and make necessary corrections to the measured thermal signal.
  • NIR imaging provides valuable information in addition to Long Wave Infrared (LWIR) imaging and aids in the detection of vessel structures which are crucial for breast cancer detection. There are several reasons why NIR images help to detect vessel structures and improve breast cancer detection:
  • NIR imaging takes advantage of the optical window in the biological tissue where light absorption is relatively low, generally in the wavelength range of 650 to 900 nm (known as the first NIR window or NIR-I). This allows NIR light to penetrate deeper than visible light, making it possible to visualize structures beneath the skin.
  • the specific absorption characteristics of different tissues within this range make it easier to distinguish between them and create a contrast effect.
  • the two primary absorbers of NIR light in tissue are oxygenated and deoxygenated hemoglobin in the blood, and water.
  • the NIR wavelength range is less prone to scattering than visible light, resulting in clearer images of the breast tissue and vessels. This improves image quality and leads to more accurate detection and localization of abnormalities, such as cancerous tumors.
  • Multispectral analysis: combining NIR and LWIR imaging allows for a more comprehensive analysis of the breast tissue.
  • the LWIR imaging provides information on temperature variations, which may indicate an increased metabolic activity in cancerous tissue, while NIR imaging reveals the presence of abnormal vessel structures.
  • the combination of these two imaging modalities improves the specificity and sensitivity of breast cancer detection.
  • NIR imaging alongside 3D structure measurements and thermal imaging significantly enhances the detection of vessel structures and the overall accuracy of breast cancer detection.
  • the frontal camera 103, including sensor 203, captures high-resolution images of the frontal side of the breast, but the left, right, and lower sides are not visible for the cancer location distribution (see Fig. 3).
  • in TD 100 there are at least two cameras 102 & 104 (each including a sensor 203) positioned facing the right and left breasts at a 45° angle below the main frontal camera 103 (see Fig. 3).
  • the system includes a Thermal Device (TD) 100 that comprises a frontal camera 103 including a 3D thermal depth sensor 203, two cameras 102 & 104, each including a depth sensor 203, positioned at a 45° angle from below and a 45° angle from the side, two cameras 101 & 105, each including a depth sensor 203, positioned at 80°-90° angles from the sides on the same level as the frontal camera 103, and fans 106.
  • the TD comprises a distinct combination of sensors 101-105, capable of acquiring data ranging from 4-dimensional to 8-dimensional.
  • the 8-dimension comes from adding up the dimensions of each type of data: 3 from color (RGB), 3 from the 3D structure, 1 from heat or LWIR, and 1 from NIR.
  • a 5D thermal camera system comprising: a) a long-wave infrared (LWIR) sensor configured to detect and capture long-wave infrared radiation emitted by objects within a scene. b) a 3D depth sensor configured to measure the distance between the camera and objects within the scene based on the time it takes for emitted light pulses to return to the sensor after reflecting off said objects. c) a near-infrared (NIR) sensor configured to detect and capture near-infrared radiation reflected by objects within the scene. d) a video streaming module configured to process and transmit captured data from the LWIR, 3D, and NIR sensors in real-time, enabling live video streaming.
  • the 5D imaging system further comprises an integrated machine learning module, wherein said module utilizes artificial intelligence algorithms for real-time object detection, classification, and tracking, enhancing the system's ability to recognize and analyze complex scenes, and facilitating various computer vision applications such as autonomous navigation, security and surveillance, and advanced human- computer interaction.
  • the 5D imaging system further comprises an adaptive fusion mechanism, wherein said mechanism dynamically adjusts the weights assigned to each imaging modality based on scene characteristics, ambient conditions, or specific application requirements, optimizing the output video stream for improved clarity, accuracy, and contextual information.
  • the 5D imaging system further comprises a modular design, allowing for the interchangeability and upgradability of individual sensor components or the addition of new imaging modalities, thereby enabling customization and adaptation of the system to meet specific use-case demands or to accommodate advances in sensor technology.
  • the processing unit of 5D imaging system is further configured to perform real-time image enhancement techniques, such as noise reduction, contrast stretching, and edge sharpening, on the LWIR, NIR, and 3D sensor data prior to alignment and merging, thereby improving the quality and reliability of the resulting 5D video stream.
  • the 5D imaging system is utilized for non-invasive medical diagnostics, allowing healthcare professionals to visualize and analyze surface temperature variations and blood perfusion in the human body, aiding in the detection of inflammation, infection, or other abnormalities.
  • the 5D imaging system is utilized for monitoring treatment response in cancer patients by not only providing comprehensive and non-invasive assessment of tumor characteristics, such as size, shape, and vascularization, but also evaluating various physiological parameters and biomarkers, including body temperature, blood perfusion, inflammation rates and metabolic activity by integrating data from the LWIR, NIR, and 3D sensors with additional diagnostic information.
  • This comprehensive monitoring approach enables healthcare professionals to track and evaluate the effectiveness of therapeutic interventions, monitor patients' overall health status, and adjust treatment plans accordingly for optimized patient outcomes.
  • the method of compressing and encoding the 5D video stream generated by the imaging system employs specialized algorithms and data structures designed to exploit redundancies and correlations within and between the different imaging modalities, resulting in a compressed video stream that retains essential information while reducing storage and transmission bandwidth requirements.
  • a communication interface transmits the 8D video stream generated from the aligned and merged data of the LWIR, NIR, 3D, and RGB sensors.
  • the 8D imaging system allows for enhanced scene understanding and analysis by combining the complementary imaging modalities into a single, coherent, and integrated video stream.
  • This system can be utilized in various applications, for example adapted for installation on unmanned aerial vehicles (UAVs), such as drones, allowing for enhanced remote monitoring and data collection in a variety of environments and applications.
  • a camera comprises an RGB sensor 201, capturing high-resolution images with accurate color reproduction and thereby representing the scene authentically.
  • the camera further incorporates a Time-of-Flight (ToF) depth module, consisting of a Near-Infrared (NIR) emitter (202), sensor (203), an optics system (204), and a computation unit (205).
  • the NIR emitter transmits light, which, upon reflection from objects in the scene, is captured by the NIR sensor.
  • Optics system 204 focuses this reflected NIR light onto the sensor.
  • Computation unit 205 processes the time taken for this light to travel back and forth, thereby calculating distance measurements. This data enables the creation of a detailed 3D representation of the scene, while simultaneously producing a NIR image.
  • Thermal cameras (101-105) can produce far-infrared images. They detect infrared radiation to generate a 'heat map' of the scene, revealing temperature distribution and anomalies.
  • Signal Processing Electronics (208) amplifies the signal, reduces noise, and converts the signal to digital format for image creation.
  • Signal Processing Module (209) includes a computational module enabling simultaneous processing and merging of data streams from all sensors. This module synchronizes data, resulting in comprehensive, multi-dimensional scene representation and turns the output from all detectors into a form usable by external devices.
  • the camera's high-precision components are housed within a durable body 210, safeguarding against external elements, facilitating user-friendly operation, and ensuring accessibility for all user proficiency levels.
  • the 3D sensors 203 within cameras 101-105 are used.
  • the skin is scanned with high resolution over the entire breast.
  • TD 100 has a degree of freedom allowing it to adjust to the subject, based on her individual characteristics. This allows for a fixed thermal acquisition, thereby removing any bias from the technician who performs the screening.
  • TD 100 is easily operated by a certified technician/nurse during the screening process.
  • Thermal sensors capture synchronized thermal videos of subject’s chest area over time to analyze the thermal recovery rates of different areas and tissue types.
  • Sensors 203 enhance the accuracy of thermal signals processed by adding information about the depth of the signal and helping to reconstruct the 3D model of the thermal map.
  • the blood vessel modeling is enriched by incorporating near-infrared signal acquisition, which is aligned with thermal and depth signals.
  • the screening process includes three consecutive phases:
  • the first minutes are for acclimatization, during which the Sensors capture how the body settles and reaches a steady state with the room temperature.
  • next is the cooling phase, during which fans 106 mounted on TD 100 are automatically switched on.
  • Fans 106 blow air towards the body's chest area to cool it down by several degrees Celsius. This creates contrast that highlights some of the blood vessels.
  • the stabilization phase starts.
  • the Sensors capture the recovery of the body temperature from cool state to normal state to further analyze the recovery rates of different body areas and tissue types.
  • the captured signals, including the thermal, depth, and NIR streams, are automatically uploaded to the system for further Artificial Intelligence (AI) analysis to find metabolic abnormalities related to breast cancer.
  • VFOV is the Vertical Field of View of the thermal camera.
  • HFOV is the Horizontal Field of View.
  • Sensors 203, and rails 107 are positioned on TD 100 in the following way:
  • the optical axis of the top Sensors is horizontal.
  • the optical axis of the bottom Sensors is at 45°.
  • This setup also guarantees that the woman's head is never visible from any possible position of the cameras. It is the only setup that achieved such a result with only one degree of freedom per camera.
  • Figs 13C-D demonstrate this effect for one top camera in comparison to situation in Figs. 13A-B when the rail is horizontal.
  • the camera may be adapted to have the upper boundary of its FOV at the neck of the screened subject. If the next screened subject is larger, the cameras move back to see the ROI. The problem is that the head becomes visible, and since the head is outside the ROI, it is not desirable to spend pixels of the image on the head. Privacy concerns are another reason for aiming the ROI so that it does not include the head.
  • the system guarantees that the head of the screened subject is never visible from any position of the camera, preserving privacy and ensuring optimal ROI capturing for different body sizes with minimum number of movements of camera.
  • the lens might suffer from a distortion effect (the red and blue images in Fig. 7), while the goal is to correct the distortion and obtain the green image, where all the lines are straight. This eliminates lens artifacts and keeps the image of each camera uniform (see https://purveyoroflight.com/blog/correcting-lens-distortion-in-adobe-lightroom).
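• By way of a non-limiting illustration, such a distortion correction could be sketched in Python with OpenCV as follows, assuming the intrinsic matrix and distortion coefficients of each lens have already been estimated during a prior calibration; the numeric values shown are placeholders, not parameters of the described cameras:

```python
import cv2
import numpy as np

# Hypothetical intrinsics and distortion coefficients (k1, k2, p1, p2, k3);
# in practice these would come from a prior calibration of each lens.
camera_matrix = np.array([[400.0, 0.0, 160.0],
                          [0.0, 400.0, 120.0],
                          [0.0,   0.0,   1.0]])
dist_coeffs = np.array([-0.25, 0.08, 0.0, 0.0, 0.0])

def undistort(image):
    """Remap the image so that straight lines in the scene remain straight."""
    return cv2.undistort(image, camera_matrix, dist_coeffs)
```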
  • When turning on uncooled TD 100, it takes several minutes for the sensor's temperature to stabilize (step 3). Once stabilized, the measurement drift is slow, and the non-uniformity of the image is stable for a long time. Moreover, if the environment changes during screening, it might affect stabilization, so the change needs to be measured and compensated in real time. Therefore, a stabilization measurement is required.
  • the stabilization measurement is achieved as follows: a. Predefined ROI, which is a known temperature marker with emissivity > 0.95 that was added to the device's chair, is found in the image. b. The median digital level over X sec is measured. c. If the median is less than the stabilization threshold, then the camera is stable; otherwise a 1-minute wait is required and then the process is repeated. d. If there is a spike during the measurement (something moved in front of the camera), the measurement must be restarted or ignored.
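• A minimal sketch of this stabilization check as a control loop is given below; the frame source, the marker-ROI coordinates, the frame rate and all numeric thresholds are hypothetical placeholders rather than values from the described device:

```python
import time
import numpy as np

STABILIZATION_THRESHOLD = 5.0   # assumed, in digital levels
SPIKE_THRESHOLD = 50.0          # assumed, flags motion in front of the camera
WINDOW_SEC = 10                 # the "X sec" measurement window (assumed)
FPS = 9                         # assumed thermal frame rate

def measure_marker(grab_frame, roi):
    """Collect the temperature-marker ROI over WINDOW_SEC and return the samples."""
    y0, y1, x0, x1 = roi
    samples = []
    for _ in range(WINDOW_SEC * FPS):
        frame = grab_frame()                    # hypothetical frame source
        samples.append(float(np.median(frame[y0:y1, x0:x1])))
        time.sleep(1.0 / FPS)
    return np.asarray(samples)

def wait_until_stable(grab_frame, roi):
    while True:
        samples = measure_marker(grab_frame, roi)
        if np.ptp(samples) > SPIKE_THRESHOLD:   # step d: spike -> redo the measurement
            continue
        # step c, taken literally: compare the median to the stabilization threshold
        if np.median(samples) < STABILIZATION_THRESHOLD:
            return
        time.sleep(60)                          # otherwise wait 1 minute and repeat
```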
  • NUC is Non-Uniformity Correction.
  • the slope is the gain of each pixel: digital level per temperature difference.
  • the offset is the digital level at a specific temperature.
  • the calibration gain and offset parameters of each camera are fed to the recording mechanism, to make sure the recording data of all cameras is equally normalized.
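• As a non-limiting illustration, the per-pixel gain/offset normalization could be sketched as follows; the two-point estimation from two uniform-temperature views and the variable names are assumptions for illustration, not the device's actual calibration pipeline:

```python
import numpy as np

def calibrate_two_point(frame_cold, frame_hot, t_cold, t_hot):
    """Estimate per-pixel gain (slope) and offset from two uniform-temperature
    views, e.g., a blackbody at two known temperatures (assumed procedure)."""
    gain = (frame_hot.astype(np.float64) - frame_cold) / (t_hot - t_cold)
    offset = frame_cold - gain * t_cold          # digital level at T = 0 (assumed convention)
    return gain, offset

def normalize_frame(raw_frame, gain_map, offset_map):
    """Convert raw digital levels to a scale comparable across cameras:
    T ~ (DL - offset) / gain."""
    gain_map = np.where(gain_map == 0, np.finfo(float).eps, gain_map)
    return (raw_frame.astype(np.float64) - offset_map) / gain_map
```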
  • Step 1: inserting subject information (PI) into the PI station.
  • eCRF is an electronic Case Report Form, referring to the subject information station 109.
  • Step 2: initializing TD 100 for the subject screening.
  • Step 3: system calibration.
  • Step 4: screening with the Thermal Device.
  • Step 5: saving the results, together with the PI, in the TM cloud (cloud database).
  • the screening device needs calibration to get optimal thermal acquisition.
  • the technician adjusts the height of the chair to be sure that the head of the subject is positioned on the same predefined height from the floor.
  • Before starting the screening, the technician performs a calibration procedure. This makes sure the cameras are in the right position and properly focused, which is needed to achieve the optimal position of the ROI in the field of view and optimal focus of the image.
  • the calibration of the device consists of positioning each Sensor to its optimal distance from the screened subject and rotating the lens for optimal focus. It is done for optimal thermal screening acquisition, adjusted to the characteristics of each subject. The calibration is assisted by the software.
  • the user adjusts the focus of Sensors 203 of cameras 101-105 by rotating the lens of each camera in both directions.
  • the software guides the user to the best position of the lens for sharpness of the ROI.
  • the software signals a green light when the image is optimally sharp (see Fig. 11a).
  • the calibration steps may be operated automatically by motors.
  • Shutter 210 is a mechanical device that covers the camera's sensor, controls the duration of exposure to incoming radiation, and helps to produce accurate temperature readings. During calibration, shutter 210 is closed twice: once to perform NUC, and a second time to ensure that shutter 210 is functioning, by checking the noise distribution after NUC.
  • The operation and screening of TD is performed by the TM VO system.
  • the software platform has main central window and at least 5 small windows on the side.
  • the small windows show the screening process from each Sensor on the screening device, meaning different angle views of the chest.
  • the main window shows the enlarged selected angle view.
  • Registration is the process of aligning or "matching" images. In this case, registration is used to establish a relationship between two images of the same scene, for achieving the best possible overlap.
  • the first image in the sequence is considered the reference image, while the remaining images are considered as transformable images.
  • the registration is performed in two stages.
  • a rigid body registration is applied based on intensities with geometric transformation consisting of translation, rotation, and scale.
  • the second stage uses non-rigid 2D registration with Residual Complexity (RC) (see Liu, H., Zhang, J., Yang, K., Hu, X. and Stiefelhagen, R., 2022. CMX: Cross-Modal Fusion for RGB-X Semantic Segmentation with Transformers. arXiv preprint arXiv:2203.04838).
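• A minimal sketch of the first, rigid registration stage is shown below using OpenCV's ECC maximization; the non-rigid Residual Complexity stage is not shown, the affine motion model is used here as an approximation of translation, rotation and scale, and single-channel float32 inputs are an assumption for illustration:

```python
import cv2
import numpy as np

def rigid_register(reference, moving, iterations=200, eps=1e-6):
    """Intensity-based alignment of 'moving' to 'reference' (first stage only)."""
    warp = np.eye(2, 3, dtype=np.float32)                   # initial transform
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, iterations, eps)
    # MOTION_AFFINE covers translation, rotation and scale (plus shear)
    _, warp = cv2.findTransformECC(reference, moving, warp,
                                   cv2.MOTION_AFFINE, criteria, None, 5)
    h, w = reference.shape
    aligned = cv2.warpAffine(moving, warp, (w, h),
                             flags=cv2.INTER_LINEAR + cv2.WARP_INVERSE_MAP)
    return aligned, warp
```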
  • Each frame contains spatial information, and the sequence of those frames contains temporal information.
  • a hybrid architecture is used consisting of convolutions (for spatial processing) as well as recurrent layers (for temporal processing).
  • the images of a video, along with the corresponding NIR images and 3D structure information, are fed to a CNN model to extract high-level features.
  • features are extracted from the vessel map derived from the thermal and NIR images. After concatenating the features from all 5 angles they are fed to an RNN layer and the output of the RNN layer is connected to a fully connected layer to get the classification output.
  • a ResNet18 pre-trained on ImageNet (~11 million parameters) is used as the base CNN model, together with an RNN model with hidden size 100 and two layers.
  • the hidden state from the RNN model is then concatenated with features extracted from risk factors, and together they are used to predict malignancy.
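• As a non-limiting illustration, this hybrid architecture could be sketched in PyTorch as follows; the tensor layout, the 3-channel per-frame input, the number of risk-factor features and the use of a plain RNN layer are assumptions for illustration, not the exact implementation:

```python
import torch
import torch.nn as nn
from torchvision import models

class HybridClassifier(nn.Module):
    def __init__(self, n_angles=5, hidden_size=100, n_risk_factors=16):
        super().__init__()
        backbone = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
        backbone.fc = nn.Identity()            # keep the 512-d per-frame embedding
        self.cnn = backbone
        self.rnn = nn.RNN(input_size=512 * n_angles, hidden_size=hidden_size,
                          num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden_size + n_risk_factors, 1)

    def forward(self, frames, risk_factors):
        # frames: (batch, time, angles, 3, H, W); risk_factors: (batch, n_risk_factors)
        b, t, a, c, h, w = frames.shape
        feats = self.cnn(frames.reshape(b * t * a, c, h, w)).reshape(b, t, a * 512)
        _, hidden = self.rnn(feats)            # hidden: (num_layers, batch, hidden_size)
        fused = torch.cat([hidden[-1], risk_factors], dim=1)
        return torch.sigmoid(self.head(fused)) # probability of malignancy
```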
  • Embodiments of the present invention may improve determination of an existence or absence of a physiological abnormality (e.g., thermally detectable physiological abnormality) in a subject’s body region.
  • Some embodiments of the present invention may provide a system.
  • the system may include one or more imaging devices.
  • Each of the one or more imaging devices may include at least an infrared (IR) sensor (e.g., thermal sensor) and a distance sensor.
  • the one or more imaging devices may generate data representing the subject’s body region over a screening time.
  • the system may include a computing device. Based on the data from the one or more imaging devices, the computing device may generate a digital model of the subject’s body region.
  • the digital model may include a plurality of time-sequential datasets representing a surface and IR (e.g., thermal) characteristics of the subject’s body region over the screening time. Based on at least a portion of the digital model, the computing device may determine (e.g., calculate) spatial features (e.g., spatial characteristics) and/or temporal features (e.g., temporal characteristics) indicative of a presence or absence of data values in the digital model representing the physiological abnormality. Based on the spatial features and/or temporal features, the computing device may determine (e.g., calculate) a probability of and/or one or more biomarkers indicative of the existence or absence of the physiological abnormality in the subject’s body region.
  • FIG. 14 is a block diagram of a system 200 for determining an existence or absence of a physiological abnormality (such as breast cancer) in a subject’s body region (such as subject’s breast), according to some embodiments of the invention.
  • imaging devices 210 may be positioned like thermal cameras 101-105 of thermal device 100 (e.g., as described above with respect to Figs. 1, 3, and 13A-13D) to ensure all-encompassing coverage of the breast from multiple angles, including the lower quadrants and axillary regions of the breast usually not visible from frontal and side views (e.g., as described hereinabove).
  • Each of imaging devices 210 may include an infrared (e.g., thermal) sensor 211.
  • IR (e.g., thermal) sensor 211 may be sensitive to light in a wavelength range of 3 μm to 14 μm.
  • IR (e.g., thermal) sensor 211 may acquire (e.g., capture) time-sequential IR (e.g., thermal) datasets (e.g., IR (e.g., thermal) images) of the subject’s body region.
  • Each of imaging devices 210 may include a distance sensor 212.
  • Distance sensor 212 may acquire time-sequential multipoint distance datasets (e.g., time-sequential point clouds) each being indicative of a distance to a surface of the subject’s body region.
  • each of imaging devices 210 includes a near infrared (NIR) sensor 213.
  • NIR sensor 213 may be sensitive to light in a wavelength range of 700 to 1400 nm (e.g., 800 to 1000 nm, e.g., 940 nm).
  • NIR sensor 213 may acquire time-sequential NIR datasets (e.g., NIR images) of the subject’s body region.
  • each of imaging devices 210 includes an RGB sensor 214.
  • RGB sensor 214 may be sensitive to visible light in the wavelength range of 400 to 700 nm.
  • RGB sensor 214 may acquire time-sequential RGB datasets (e.g., RGB images) of the subject’s body region.
  • each of imaging devices 210 includes IR (e.g., thermal) sensor 211 and distance sensor 212 (e.g., without NIR sensor 213 and RGB sensor 214). In some embodiments, each of imaging devices 210 includes IR (e.g., thermal) sensor 211, distance sensor 212 and NIR sensor 213 (e.g., without RGB sensor 214). In some embodiments, each of imaging devices 210 includes IR (e.g., thermal) sensor 211, distance sensor 212, NIR sensor 213 and RGB sensor 214.
  • While IR (e.g., thermal) sensors 211, distance sensors 212, NIR sensors 213 and RGB sensors 214 are shown as part of imaging devices 210, in various embodiments, IR (e.g., thermal) sensors 211, distance sensors 212, NIR sensors 213 and/or RGB sensors 214 may be provided as standalone sensors.
  • the data acquired by imaging devices 210 may represent the surface of the subject’s body region and spectral characteristics of the subject’s body region.
  • the spectral characteristics may include IR (e.g., thermal) characteristics of the subject’s body region.
  • the IR (e.g., thermal) characteristics may include temperature distribution on the surface of the subject’s body region.
  • thermally detectable physiological characteristics of the subject’s body region may be detected and analyzed, for example to determine the existence or absence of physiological abnormalities in the subject’s body region. For example, areas with inflammation or infection may exhibit elevated temperatures (e.g., in conditions such as arthritis, localized infections, or injuries) in IR (e.g., thermal) images.
  • regions with good blood circulation may appear warmer, while areas with poor circulation may appear cooler in IR (e.g., thermal) images.
  • certain types of tumors can show distinct IR (e.g., thermal) patterns due to their higher metabolic rates and increased blood flow compared to surrounding tissues.
  • the spectral characteristics may include NIR characteristics of the subject’s body region.
  • the NIR characteristics may include blood distribution in regions (e.g., vessels) immediately below the surface of the subject's body region.
  • System 200 may include a cooling device 220.
  • Cooling device 220 may include one or more fans (e.g., such as fans 106 of thermal device 100 described hereinabove). In operation, cooling device 220 may cool the subject’s body region. Cooling device 220 may expose the subject’s body region to thermal stress. Asymmetric recovery of the subject’s body region from the thermal stress may be indicative of physiological abnormalities in the subject’s body region.
  • System 200 may include a computing device 230 (e.g., such as computing device 900 described hereinbelow).
  • Computing device 230 may control operation of components of system 200.
  • Computing device 230 may control operation of imaging devices 210 and cooling device 220 to perform a screening procedure (e.g., such as breast screening) according to a screening protocol.
  • the screening procedure may include an acclimatization phase (e.g., stage).
  • the acclimatization stage may last, for example, for one minute.
  • the acclimatization phase may allow the subject’s body to reach a thermal equilibrium with the ambient temperature.
  • computing device 230 may perform Non-Uniformity Correction (NUC) calibration of IR sensors 211 of imaging devices 210.
  • NUC calibration may ensure accurate imaging.
  • the intrinsic and extrinsic calibrations, such as lens distortion correction, and pose estimation may be performed during manufacturing process of imaging devices 210 (e.g., using algorithms like Perspective-n-Point (PnP), Random Sample Consensus (RANSAC) and/or any other suitable algorithm).
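• A minimal sketch of such a pose estimation with PnP and RANSAC using OpenCV is given below; the 3D/2D point correspondences and camera intrinsics are placeholder values, not factory calibration data of imaging devices 210:

```python
import cv2
import numpy as np

# Placeholder correspondences between known 3D calibration-target points and
# their detected 2D image locations (assumed to come from a prior detection step).
object_points = np.random.rand(20, 3).astype(np.float32)
image_points = np.random.rand(20, 2).astype(np.float32)
camera_matrix = np.array([[400.0, 0.0, 160.0],
                          [0.0, 400.0, 120.0],
                          [0.0,   0.0,   1.0]])
dist_coeffs = np.zeros(5)

ok, rvec, tvec, inliers = cv2.solvePnPRansac(
    object_points, image_points, camera_matrix, dist_coeffs)
if ok:
    rotation, _ = cv2.Rodrigues(rvec)   # camera pose: rotation matrix plus translation tvec
```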
  • computing device 230 may control imaging devices 210 to start acquisition (e.g., simultaneous acquisition) of data related to the subject’s body region.
  • computing device 230 repeats the NUC calibration of IR (e.g., thermal) sensors 211 of imaging devices 210 one or more times during the screening procedure.
  • the screening procedure may include a cooling phase.
  • the cooling phase may last, for example, for two minutes.
  • computing device 230 may turn on cooling device 220 to cool the subject’s body region (e.g., by an airflow) for a specified cooling time (e.g., two minutes) to expose the subject’s body region to the thermal stress.
  • the screening procedure may include a stabilization (e.g., recovery) phase.
  • the stabilization phase may last, for example, for four minutes.
  • the stabilization phase may allow the subject’s body to recover from the cooling applied during the cooling phase.
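• As a non-limiting illustration, the three-phase screening protocol could be orchestrated as sketched below; the imaging_devices, cooling_device and recorder objects and their methods are hypothetical placeholders, while the durations follow the examples given above:

```python
import time

ACCLIMATIZATION_SEC = 60    # acclimatization phase (example duration)
COOLING_SEC = 120           # cooling phase (example duration)
STABILIZATION_SEC = 240     # stabilization/recovery phase (example duration)

def run_screening(imaging_devices, cooling_device, recorder):
    recorder.start(imaging_devices)      # simultaneous acquisition from all sensors
    time.sleep(ACCLIMATIZATION_SEC)      # body reaches equilibrium with the room

    cooling_device.on()                  # thermal stress via airflow
    time.sleep(COOLING_SEC)
    cooling_device.off()

    time.sleep(STABILIZATION_SEC)        # record thermal recovery
    recorder.stop()
```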
  • computing device 230 may generate a digital model 231 of the subject’s body region (e.g., as described below with respect to Figs. 15A and 15B).
  • Digital model 231 may include a plurality of time-sequential datasets (e.g., a video) representing the surface of the subject’s body region, the spectral (e.g., IR and optionally NIR) characteristics of the subject’s body region and/or a variation of the spectral characteristics of the subject’s body region over the screening time.
  • computing device 230 may calculate a probability 232 of the existence of the physiological abnormality such as breast cancer in the subject’s body region (e.g., as described below with respect to Fig. 16).
  • computing device 230 may detect 233 data values in digital model 231 representing the physiological abnormality such as breast cancer in the subject’s body region (e.g., as described below with respect to Fig. 16).
  • computing device 230 may calculate one or more biomarkers 234 indicative of the existence of the physiological abnormality such as breast cancer in the subject’s body region (e.g., as described below with respect to Figs. 17A- 17B, 18, 19A-19B and 20A-20B).
  • System 200 may include a display 240.
  • Computing device 230 may display, on display 240, digital model 231, detection 233 of data values of digital model 231 representing the physiological abnormality such as breast cancer in the subject’s body region (e.g., using bounding boxes, colors or in any other suitable way), probability 232 and/or values of one or more biomarkers 234 indicative of the existence or absence of the physiological abnormality such as breast cancer in the subject’s body region.
  • Digital model 231 and calculated probability 232, detection 233 and/or one or more biomarkers 234 may be used to visualize various physiological phenomena and/or give visual explanation of biomarkers. These calculated parameters may guide further diagnostic and clinical decision- making, such as detecting abnormalities (e.g., tumors, inflammation), monitoring disease progression, planning personalized treatments (e.g., surgical planning, radiotherapy targeting), assessing post-treatment recovery, adapting treatment plans, and supporting preventive screening and research.
  • computing device 230 may generate an alert notification, e.g. if one or more of the calculated parameters exceeds a threshold.
  • Computing device 230 may be in communication with (e.g., integrated with) a hospital information system 80.
  • Computing device 230 may transmit digital model 231, probability 232, detection 233 and/or one or more biomarkers 234 calculated for the subject and/or the alert notification to hospital information system 80.
  • hospital information system 80 may update subject’s records, trigger alerts and/or transmit notifications concerning the calculated probability 232, detection 233 and/or one or more biomarkers 234 to medical staff. This may ensure timely follow-up and intervention in the subject’s condition by the medical staff.
  • Computing device 230 may be in communication with (e.g., integrated with) an electronic health record system 82.
  • Computing device 230 may transmit to electronic health record system 82 digital model 231, probability 232, detection 233 and/or one or more biomarkers 234 calculated for the subject.
  • electronic health record system 82 may dynamically update treatment plans and/or track subject’s treatment progress. This may aid in the adjustment of ongoing treatments based on real-time data generated by computing device 230.
  • computing device 230 may identify high risk subjects for preventive screening.
  • Computing device 230 may be in communication with (e.g., integrated with) a health management system 83.
  • Computing device 230 may transmit to health management system 83 digital model 231, probability 232, detection 233 and/or one or more biomarkers 234 calculated for the subject.
  • health management system 83 may schedule preventive screenings and follow-ups for at-risk subjects, thereby, for example, improving early detection and preventive care.
  • Computing device 230 may be in communication with (e.g., integrated with) a telemedicine system 85.
  • Computing device 230 may transmit to telemedicine system 85 digital model 231, probability 232, detection 233 and/or one or more biomarkers 234 calculated for the subject.
  • Telemedicine system 85 may allow specialists to review the received data remotely, provide expert opinions and/or second opinions without the need for the subject’s appointment.
  • Computing device 230 may be in communication with (e.g., integrated with) a medical insurance and billing system 86.
  • Computing device 230 may transmit to medical insurance and billing system 86 digital model 231, probability 232, detection 233 and/or one or more biomarkers 234 calculated for the subject. Based on the received data, medical insurance and billing system 86 may streamline the approval and reimbursement process, reducing administrative burden and improving efficiency.
  • Computing device 230 may be in communication with (e.g., integrated with) a hospital management system 88.
  • Computing device 230 may transmit to hospital management system 88 probability 232, detection 233 and/or one or more biomarkers 234 calculated for the subject.
  • hospital management system 88 may forecast patient needs and optimize resource allocation, predict demand for resources such as hospital beds, surgical units, and staff, enabling better planning and resource management.
  • Computing device 230 may be in communication with (e.g., integrated with) a clinical decision support system 90.
  • Computing device 230 may transmit to clinical decision support system 90 digital model 231, probability 232, detection 233 and/or one or more biomarkers 234 calculated for the subject.
  • clinical decision support system 90 may enhance diagnostic accuracy and treatment decisions, offering evidence-based recommendations and alerts based on the latest subject’s data.
  • Computing device 230 may be in communication with (e.g., integrated with) a medical training and simulation system 91.
  • Computing device 230 may transmit to medical training and simulation system 91 digital model 231, probability 232, detection 233 and/or one or more biomarkers 234 calculated for the subject.
  • medical training and simulation system 91 may provide realistic scenarios for training healthcare professionals, improving their diagnostic and treatment skills.
  • FIG. 15A shows a computing device 301 for generating a digital model (such as digital model 231) representing the subject’s body region based on data from imaging devices 210 of system 200, according to some embodiments of the invention.
  • Computing device 301 may be computing device 230 of system 200 and/or any other suitable computing device.
  • Computing device 301 may include a processor (such as processor 905 described hereinbelow) that may perform the operations described hereinbelow.
  • Computing device 301 may receive a plurality of time-sequential sensor datasets 310.
  • Each of sensor datasets 310 may include a plurality of data values representing the subject’s body region such as subject’s breast at a certain timestep in the screening procedure.
  • the plurality of time-sequential sensor datasets 310 may be received from sensors of imaging devices 210 of system 200.
  • For example, the plurality of time-sequential sensor datasets 310 may include a plurality of time-sequential IR (e.g., thermal) datasets (e.g., IR images), a plurality of time-sequential distance datasets (e.g., point clouds), a plurality of time-sequential NIR datasets (e.g., NIR images) and/or a plurality of time-sequential RGB datasets (e.g., RGB images).
  • Computing device 301 may register (e.g., as indicated in operation 315 in Fig. 15A) the plurality of time-sequential sensor datasets 310 in a reference coordinate system.
  • the registration may include integration of data from sensors (e.g., such as IR (e.g., thermal) sensors 211, distance sensors 212, NIR sensors 213 and/or RGB sensors 214) of imaging devices 210 to create the representation of the subject’s body region.
  • the registration may be at least partly based on calibration data of sensors of imaging devices 210.
  • the registration may include spatial alignment in the reference coordinate system of corresponding datasets of the plurality of time-sequential IR datasets, the plurality of time- sequential distance datasets, the plurality of time-sequential NIR datasets and/or the plurality of time-sequential RGB datasets received from the sensors of imaging devices 210.
  • Fast Global Registration algorithm and/or any other suitable algorithm may be used to spatially align data values (e.g., cloud points) of the plurality of time-sequential sensor datasets.
  • Based on the registered datasets, computing device 301 may generate a 3D digital model 320 including a plurality of time-sequential datasets 322. Each of datasets 322 of 3D digital model 320 may include a plurality of data values representing in 3D the surface and spectral (e.g., IR (e.g., thermal) and/or NIR) characteristics of the subject’s body region at a certain timestep in the screening time.
  • computing device 301 may form a continuous 3D surface representing the subject’s body region such as the subject’s breast (e.g., by applying Poisson Surface Reconstruction algorithm and/or any other suitable algorithm to 3D digital model 320).
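• A minimal sketch of aligning two per-camera point clouds with a Fast Global Registration algorithm and reconstructing a continuous surface with Poisson Surface Reconstruction, using the Open3D library, is given below; the voxel size and all algorithm parameters are placeholder assumptions rather than the system's actual settings:

```python
import open3d as o3d

def align_and_reconstruct(source, target, voxel=0.005):
    """Align 'source' onto 'target' (Open3D point clouds) and mesh the result."""
    def fpfh(pcd):
        pcd.estimate_normals(
            o3d.geometry.KDTreeSearchParamHybrid(radius=voxel * 2, max_nn=30))
        return o3d.pipelines.registration.compute_fpfh_feature(
            pcd, o3d.geometry.KDTreeSearchParamHybrid(radius=voxel * 5, max_nn=100))

    result = o3d.pipelines.registration.registration_fgr_based_on_feature_matching(
        source, target, fpfh(source), fpfh(target),
        o3d.pipelines.registration.FastGlobalRegistrationOption(
            maximum_correspondence_distance=voxel * 1.5))
    source.transform(result.transformation)      # bring source into target's frame

    merged = source + target
    mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(merged, depth=9)
    return mesh
```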
  • computing device 301 may generate a 2D digital model 330.
  • 2D digital model 330 may include a plurality of time-sequential datasets (e.g., images) 332 (e.g., video) representing in 2D the surface of the subject’s body region, the spectral (e.g., IR (e.g., thermal)and/or NIR) characteristics of the subject’s body region and the variation of the spectral characteristics of the subject’s body region over the screening time.
  • data values of the plurality of time-sequential datasets 322 of 3D digital model 320 may be modified to represent the surface of the subject’s body region and the spectral (e.g., IR and/or NIR) characteristics of the subject’s body region in 2D to generate 2D digital model 330.
  • the modification may include parametrization such as harmonic parametrization, As-Rigid-As- Possible (ARAP) parametrization in 2D and/or any other suitable parametrization method to ensure minimal distortion in final 2D datasets (e.g., images).
  • Each of datasets 332 of 2D digital model 330 may include a plurality of data values representing in 2D the surface and spectral (e.g., IR (e.g., thermal) and/or NIR) characteristics of the subject’s body region at a certain timestep in the screening time.
  • 2D digital model 330 and/or the 3D digital model 320 may be used, e.g. by computing device 230 of system 200, to calculate the probability of and/or biomarkers indicative of the existence or absence of the physiological abnormality in the subject’s body region and/or detect data values representing the physiological abnormality such as breast cancer in the subject’s body region.
  • computing device 230 of system 200 may calculate the probability of and/or biomarkers indicative of the existence or absence of the physiological abnormality in the subject’s body region and/or detect data values representing the physiological abnormality such as breast cancer in the subject’s body region.
  • utilizing datasets 332 of the 2D digital model 330 may require significantly smaller training datasets for training machine learning models as compared to using datasets 322 of 3D digital model 320.
  • Each of datasets 332 may include a plurality of data values (e.g., pixels) representing in 2D the surface of the subject’s body region and the spectral (e.g., IR (e.g., thermal) and/or NIR) characteristics of the subject’s body region such as the subject’s breast at a certain timestep in the screening time.
  • each of datasets 332 of 2D digital model 330 includes an IR image (e.g., thermal image) 332a representing the surface and IR characteristics (e.g., temperature values) of the subject’s body region and a NIR image 332b representing the surface and NIR characteristics (e.g., regions rich in blood, such as vessels) of the subject’s body region such as the subject’s breast.
  • each of datasets 332 of 2D digital model 330 may include IR (e.g., thermal) image 332a only.
  • each of datasets 332 of 2D digital model 330 may include an RGB image (as described hereinabove).
  • Images 332a, 332b of all datasets 332 of 2D digital model 330 may be spatially registered in the reference coordinate system (e.g., as described hereinabove).
  • machine learning techniques may be combined to calculate spatial features (e.g., spatial characteristics) and/or temporal features (e.g., characteristics) of the physiological abnormality (e.g., as described below with respect to Fig. 16).
  • FIG. 16 shows a computing device 401 for calculating a probability of the existence or absence of the physiological abnormality and/or detecting data values of the digital model representing the physiological abnormality in the subject’s body region, according to some embodiments of the invention.
  • Computing device 401 may be computing device 230 of system 200 or any other suitable computing device.
  • Computing device 401 may include a processor (such as processor 905 described hereinbelow) that may perform the operations described hereinbelow.
  • the calculation of the probability of the existence of the physiological abnormality and the detection of the physiological abnormality such as breast cancer in the subject’s body region may be performed based on a digital model 410 of the subject’s body region, such as 2D digital model 330 described above with respect to Figs. 15A-15B.
  • Digital model 410 may include a plurality of time-sequential datasets 412 (e.g., a video).
  • Datasets 412 may represent the surface of the subject’s body region, the spectral (e.g., IR (e.g., thermal) and/or NIR) characteristics of the subject’s body region and the variation of the spectral characteristics of the subject’s body region over the screening time.
  • For example, each of datasets 412 may include an IR image (e.g., thermal image) whose data values (e.g., pixels) represent the surface and IR characteristics of the subject’s body region (e.g., such as IR image 332a described above with respect to Fig. 15B).
  • Each of datasets 412 may include a plurality of data values representing the surface of the subject’s body region and the spectral (e.g., IR (e.g., thermal) and/or NIR) characteristics of the subject’s body region at a certain timestep in the screening time.
  • computing device 401 may calculate a plurality of spatial features vectors 422.
  • Each of spatial features vectors 422 may include a plurality of values (e.g., spatial features) indicative of a presence or absence of data values representing the physiological abnormality in a corresponding dataset of datasets 412 of digital model 410.
  • computing device 401 may sequentially, one at a time, provide (e.g., feed) time-sequential datasets 412 as an input to a first machine learning model 420.
  • First machine learning model 420 may include a neural network, such as a convolutional neural network (CNN).
  • first machine learning model 420 may calculate a first spatial features vector 422a based on a first dataset 412a, a second spatial features vector 422b based on a second dataset 412b, and the n-th spatial features vector 422n based on the n-th dataset 412n (e.g., as schematically shown in Fig. 16).
  • each of spatial features vectors 422 may represent a certain timestep in the screening time.
  • Each of spatial features vectors 422 may be an embedded features vector (e.g., a one-dimensional (1D) embedded features vector) generated by first machine learning model 420 through the processing of the corresponding dataset of datasets 412 (e.g., as described hereinbelow).
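• As a non-limiting illustration, turning each dataset 412 into an embedded spatial-features vector with a CNN backbone (the role of first machine learning model 420) could be sketched as follows; the choice of ResNet18, the 3-channel input and the 512-dimensional embedding are assumptions for illustration:

```python
import torch
import torch.nn as nn
from torchvision import models

cnn = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
cnn.fc = nn.Identity()            # drop the classifier head; keep the 512-d embedding
cnn.eval()

@torch.no_grad()
def spatial_features(frames):
    """frames: (time, 3, H, W) tensor of registered 2D datasets.
    Returns (time, 512) spatial-features vectors, one per timestep."""
    return torch.stack([cnn(frame.unsqueeze(0)).squeeze(0) for frame in frames])
```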
  • first machine learning model 420 may include variations such as Residual Networks (ResNets), Dense Convolutional Networks (DenseNets), Inception Networks (GoogLeNet), U-Nets, VGG Networks, Capsule Networks (CapsNets), as well as attention-based networks or hybrid models combining CNNs with other techniques such as Graph Convolutional Networks (GCNs) or Transformer-based models adapted for image analysis.
  • Temporal features vector 432 may include a plurality of values (e.g., temporal features) indicative of a variation or a temporal dependency of the values (e.g., spatial features) of spatial features vectors 422 over the screening time. Such variation or temporal dependency may be indicative of the presence or absence of the physiological abnormality such as breast cancer.
  • computing device 401 may sequentially, one at a time, provide spatial features vectors 422 as an input to a second machine learning model 430.
  • Second machine learning model 430 may be a neural network, such as a Long Short-Term Memory (LSTM) network.
  • Temporal features vector 432 may be a final hidden state vector generated by an LSTM layer of second machine learning model 430 at the final timestep. Based on temporal features vector 432, second machine learning model 430 may calculate a probability 434 of the existence of the physiological abnormality, such as breast cancer, in the subject’s body region. Second machine learning model 430 may provide probability 434 as an output. Other examples of second machine learning model 430 may include a Gated Recurrent Unit (GRU) network, a Transformer network, a Recurrent Neural Network (RNN), a Temporal Convolutional Network (TCN), or an Echo State Network (ESN).
  • Temporal features vector 432 may be a final hidden state vector generated by LSTM, GRU, RNN, or ESN models at the final timestep, or a context vector produced by attention mechanisms in Transformer or TCN models.
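• A minimal sketch of such a temporal model in PyTorch is given below; the feature dimension and hidden size are assumptions for illustration:

```python
import torch
import torch.nn as nn

class TemporalModel(nn.Module):
    """Consumes the sequence of spatial-features vectors and maps its final
    hidden state to an abnormality probability (role of model 430)."""
    def __init__(self, feature_dim=512, hidden_size=100):
        super().__init__()
        self.lstm = nn.LSTM(feature_dim, hidden_size, batch_first=True)
        self.dense = nn.Linear(hidden_size, 1)

    def forward(self, spatial_vectors):
        # spatial_vectors: (batch, time, feature_dim)
        _, (h_n, _) = self.lstm(spatial_vectors)
        temporal_features = h_n[-1]                    # final hidden state vector (432)
        probability = torch.sigmoid(self.dense(temporal_features))
        return probability, temporal_features
```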
  • computing device 401 may detect data values in digital model 410 representing the physiological abnormality and/or calculate the probability that the detected values represent the physiological abnormality such as breast cancer in the subject’s body region.
  • computing device 401 may provide one of datasets 412 of digital model 410, for example last dataset 412n, as an input to a third machine learning model 440.
  • Third machine learning model 440 may be a neural network, such as a Mask R-CNN.
  • Third machine learning model 440 may process the input dataset (e.g., dataset 412n) through layers of third machine learning model 440 until an embedded features map 441 is generated.
  • Computing device 401 may then combine embedded features map 441 with temporal features vector 432 (e.g., the output from second machine learning model 430) and optionally with a personal information embedded vector 452 (e.g., generated from subject’s personal information 450) to provide a combined embedded features map 442.
  • Combined embedded features map 442 may be then processed through remaining layers of third machine learning model 440 to detect 443 data values representing the physiological abnormality such as breast cancer in the input dataset (e.g., last dataset 412n) and/or calculate a probability 444 that the detected values represent the physiological abnormality such as breast cancer.
  • Detection 443 may be in the form of one or more bounding boxes, with each box having the probability of abnormality (or some biomarker value) and data value (e.g., pixel) level segmentation of the abnormal area.
  • Third machine learning model 440 may provide detection 443 and probability value 444 as an output 445.
  • Computing device 401 may generate personal information embedded vector 452 based on subject’s personal information 450 using encoding, one-hot encoding, natural language processing techniques and/or any other suitable technique.
  • Other examples of third machine learning model 440 may include Faster R-CNN, YOLO or RetinaNet, U-Net and FCN.
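• As a non-limiting illustration, the combination of embedded features map 441 with temporal features vector 432 and optional personal information embedded vector 452 could be sketched as a channel-wise concatenation after broadcasting, as below; the shapes are assumptions for illustration and the detection heads themselves are not shown:

```python
import torch

def fuse(features_map, temporal_vec, personal_vec=None):
    """features_map: (batch, C, H, W); temporal_vec: (batch, T);
    personal_vec: (batch, P) or None. Returns the combined map (batch, C+T[+P], H, W)."""
    b, _, h, w = features_map.shape
    extras = [temporal_vec] if personal_vec is None else [temporal_vec, personal_vec]
    tiled = [v.unsqueeze(-1).unsqueeze(-1).expand(b, v.shape[1], h, w) for v in extras]
    return torch.cat([features_map, *tiled], dim=1)   # combined embedded features map (442)
```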
  • the example of Fig. 16 may combine machine learning techniques to leverage spatial characteristics (e.g., calculated by first machine learning model 420) and temporal characteristics (e.g., calculated by second machine learning model 430) of physiological abnormalities for enhanced detection and probability calculations (e.g., by third machine learning model 440) of the existence of the physiological abnormalities such as breast cancer in the subject’s body region.
  • Training of first machine learning model 420, second machine learning model 430 and third machine learning model 440 may be performed by a computing device (e.g., other than computing device 401). Training of first machine learning model 420, second machine learning model 430 and third machine learning model 440 may be performed in stages. Initially, each of first machine learning model 420, second machine learning model 430 and third machine learning model 440 may be trained separately to perform its specific task. During this stage, the parameters (e.g., weights) of the previously trained model may be kept constant (e.g., frozen). Subsequently, the full combination of first machine learning model 420, second machine learning model 430 and third machine learning model 440 may be fine-tuned together using a smaller learning rate to optimize their joint performance.
  • First machine learning model 420 may be trained to calculate a probability of a presence or absence of data values representing the physiological abnormality such as breast cancer in an input dataset.
  • First machine learning model 420 may include an input layer, a plurality of convolutional and pooling layers, one or more fully connected layers, and an output layer.
  • the input layer may receive an input dataset (e.g., such as datasets 412 of digital model 410).
  • the convolutional layers may apply convolution operations to the input dataset using learnable filters or kernels to capture different local patterns in the input dataset (e.g., such as edges, textures, or shapes) and generate feature maps that may highlight the presence of these patterns at different spatial locations in the input dataset.
  • the pooling layers may down sample the feature maps generated by the convolutional layers, reducing spatial dimensions of the feature maps while retaining the most important information. For example, max pooling may take the maximum value from each region of each of the feature maps effectively reducing its size.
  • While propagating through the plurality of convolutional and pooling layers, the feature maps may be flattened into an embedded feature vector (e.g., a 1D embedded feature vector) which may include the spatial information of the feature maps in a one-dimensional representation.
  • the embedded feature vector may be then fed into the one or more fully connected layers which may perform classification.
  • the final of the one or more fully connected layers may be the output layer, which may output an indication of the presence or the absence of data values representing the physiological abnormality such as breast cancer in the input dataset, or the probability thereof.
  • Although first machine learning model 420 is originally trained to predict the probability of the presence or absence of data values representing the physiological abnormality, such as breast cancer, it does not output these probabilities in the system of Fig. 16. Instead, the embedded feature vectors generated by first machine learning model 420 through the processing of each of datasets 412 of digital model 410 are used as the corresponding spatial features vectors of spatial features vectors 422 (e.g., as described hereinabove).
  • Training data for training first machine learning model 420 may include a plurality of training input datasets (e.g., such as datasets 412) each labeled with a correct output indicating the presence or the absence of data values representing the physiological abnormality in the respective training input dataset.
  • each of the training input datasets may include an IR (e.g., thermal) image such as IR image 332a and optionally an NIR image such as NIR image 332b described above with respect to Fig. 15B, labeled with the correct output indicating the presence or the absence of pixels representing the physiological abnormality (e.g., such as bounding boxes, segmentation masks, or true/false labels for specific regions) in the respective training input dataset.
  • weights of first machine learning model 420 may be set randomly.
  • a machine learning model pretrained to perform classification tasks on separate datasets such as ImageNet, may have its weights used as a starting point for training the first machine learning model 420.
  • the training input datasets may be then sequentially, one at a time, fed into first machine learning model 420 which may calculate the predicted output.
  • the calculated predicted output may be compared to the labeled correct output and a loss may be calculated using a loss function.
  • the training input dataset may be then backpropagated through first machine learning model 420 to calculate gradients of the loss using an optimization algorithm and update the weights of first machine learning model 420.
  • the training process may be repeated a plurality of times, each time with a different training input dataset of the plurality of training input datasets, for example until first machine learning model 420 converges (e.g., until the loss stabilizes and/or the validation performance stops improving).
  • the training data may include validation datasets that may be used for validation performance of first machine learning model 420.
  • the training data may include a test dataset for the final evaluation of first machine learning model 420.
  • Second machine learning model 430 may be trained to calculate temporal dependencies of spatial features of spatial features vectors 422 that may represent the presence or absence of the physiological abnormality such as breast cancer.
  • Second machine learning model 430 may include an input layer, an LSTM layer and a dense layer.
  • the input layer may receive a sequence of spatial features vectors 422 (e.g., generated by first machine learning model 420), wherein each of spatial features vectors 422 may represent a timestep in the sequence.
  • the LSTM layer may process spatial features vectors 422 one at a time, updating an internal hidden state vector and a cell state vector of the LSTM layer at each timestep.
  • the final hidden state vector after processing the entire sequence (e.g., such as temporal features vector 432 described hereinabove) may include a plurality of values (e.g., temporal features) representing the summary representation of the sequence of spatial features vectors 422.
  • the final hidden state vector may be then fed into the dense layer.
  • the dense layer may map, e.g., using a sigmoid activation function, the final hidden state vector to the probability of the presence or the absence of spatial features representing the physiological abnormality such as breast cancer in spatial features vectors 422.
  • Training data for training second machine learning model 430 may include a plurality of training datasets each including a sequence of spatial features vectors (such as spatial features vectors 422) labeled with a correct output indicating the presence or the absence of spatial features representing the physiological abnormality such as breast cancer in the respective sequence.
  • training datasets may be fed into second machine learning model 430 (e.g., LSTM).
  • Second machine learning model 430 may process sequentially, one at a time, the spatial features vectors of each training dataset and calculate the predicted output. The calculated predicted output may be compared to the labeled correct output and a loss may be calculated using a loss function.
  • the training dataset may be then backpropagated through second machine learning model 430 to calculate gradients of the loss using an optimization algorithm and update the weights of second machine learning model 430.
  • the training process may be repeated a plurality of times, each time with a different training dataset of the plurality of training datasets, for example until second machine learning model 430 converges (e.g., until the loss stabilizes and/or the validation performance stops improving).
  • the training data may include validation datasets that may be used for validation performance of second machine learning model 430.
  • the training data may include a test dataset for the final evaluation of second machine learning model 430.
  • Third machine learning model 440 may be trained to detect data values representing the physiological abnormality such as breast cancer in an input dataset (e.g., such as last dataset 412n of digital model 410) and/or to calculate the probability that the detected values represent the physiological abnormality such as breast cancer.
  • Third machine learning model 440 may include an input layer which may receive the input dataset, a backbone network (e.g., such as Residual Network (ResNet)) for feature extraction from the input dataset, a region proposal network (RPN) for generating region proposals, Region of Interest (ROI) align layers for aligning the region proposals to a uniform size, and detection heads layers for classification, bounding box regression, and mask prediction.
  • an embedded features map 441 may be generated.
  • the embedded features map 441 may be combined with temporal features vector 432 and optionally with personal information embedded vector 452 to provide combined embedded features map 442.
  • the combined embedded features map 442 may be fed into the RPN layers, then into the ROI align layers, and subsequently processed through the detection head layers (classification, bounding box regression, and mask prediction) to output the final detection results, including bounding boxes, probabilities, and segmentation masks of data values representing the abnormal regions in the input dataset (e.g., such as last dataset 412n of digital model 410).
  • Training data for training third machine learning model 440 may include a plurality of training input datasets (e.g., annotated images), each labeled with bounding boxes, class labels, and segmentation masks indicating data values (e.g., pixels) in the respective training dataset representing regions of physiological abnormalities such as breast cancer.
  • each of the training input datasets may include an IR (e.g., thermal) image such as IR image 332a and optionally an NIR image such as NIR image 332b described above with respect to Fig. 15B, labeled with bounding boxes, class labels, and segmentation masks indicating pixels in the respective training dataset representing regions of physiological abnormalities.
  • the training input datasets may be sequentially, one at a time, fed into third machine learning model 440, which may extract features, propose regions, and generate predictions for bounding boxes, class labels, and masks.
  • the predicted outputs may be compared to the correct output (e.g., the ground truth annotations described above) to calculate a loss using a multi-task loss function.
  • the training dataset may then be backpropagated through third machine learning model 440 to calculate gradients of the loss using an optimization algorithm and update the weights of third machine learning model 440.
  • the training process may be repeated a plurality of times, each time with a different training input dataset of the plurality of training input datasets, for example until third machine learning model 440 converges (e.g., until the loss stabilizes and/or the validation performance stops improving).
  • the training data may include validation datasets that may be used for validation performance of third machine learning model 440.
  • the training data may include a test dataset for the final evaluation of third machine learning model 440.
  • This integrated system may then be fine-tuned to optimize the overall performance, ensuring that the spatial features, temporal dependencies, and region-specific information are cohesively utilized for accurate detection and probability calculation of physiological abnormalities such as breast cancer.
  • This combined model may be retrained using a smaller learning rate to fine-tune the joint performance of the integrated architecture, ensuring that all components work together seamlessly.
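• A minimal sketch of this staged training strategy (training with previously trained components frozen, then joint fine-tuning at a smaller learning rate) is given below; the model objects, data loader, loss computation and optimizer choice are hypothetical placeholders:

```python
import torch

def freeze(model, frozen=True):
    """Freeze or unfreeze all parameters of a model."""
    for p in model.parameters():
        p.requires_grad = not frozen

def staged_training(spatial_model, temporal_model, detection_model,
                    train_step, loader, base_lr=1e-3, finetune_lr=1e-5):
    # Stage: train the detection model while the other two stay frozen
    freeze(spatial_model)
    freeze(temporal_model)
    opt = torch.optim.Adam(detection_model.parameters(), lr=base_lr)
    for batch in loader:
        train_step(batch, opt)          # hypothetical per-batch loss + backprop + step

    # Joint fine-tuning of the full pipeline with a smaller learning rate
    freeze(spatial_model, False)
    freeze(temporal_model, False)
    params = (list(spatial_model.parameters()) + list(temporal_model.parameters())
              + list(detection_model.parameters()))
    opt = torch.optim.Adam(params, lr=finetune_lr)
    for batch in loader:
        train_step(batch, opt)
```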
  • At least one dataset of the plurality of time-sequential datasets of the digital model may be selected. Based on the at least one dataset or a derivative of the at least one dataset, a first sub dataset representing a first subregion such as the right breast and a second sub dataset representing a second subregion such as the left breast of the subject’s body region may be detected. Based on the first sub dataset and the second sub dataset, a first set of spatial features and a second set of spatial features, respectively, may be calculated.
  • the biomarker indicative of the existence or absence of the physiological abnormality in the subject’s body region may be calculated.
  • Comparing the first subregion and the second subregion of the subject’s body region may reduce false positive determinations by ensuring that detected abnormalities are not just variations of normal anatomy but are indeed significant differences indicative of potential physiological abnormalities such as breast cancer. For example, computing the mean temperature on an asymmetric dataset (e.g., image) without separating the dataset into the first subregion and the second subregion may yield a result close to zero, even if there is high asymmetry in the dataset (e.g., image).
  • biomarkers indicative of the existence or absence of the physiological abnormality in the subject’s body region may include a thermal asymmetry index (TAI), a thermal entropy score (TES), a vascular activity indicator (VAI) and a dynamics anomaly indicator (DAS), as described below with respect to Figs. 17A- 17B, 18, 19A-19B, 20A-20B and 21A-21E.
  • FIG. 17A shows a computing device 501 for calculating a thermal asymmetry index (TAI) biomarker 532 indicative of the existence or absence of the physiological abnormality (such as breast cancer) in the subject’s body region (such as the breast), according to some embodiments of the invention.
  • Computing device 501 may be computing device 230 of system 200 or any other suitable computing device.
  • Computing device 501 may include a processor (such as processor 905 described hereinbelow) that may perform operations described hereinbelow.
  • Thermal asymmetry index (TAI) biomarker 532 may be calculated based on a dataset 510 such as one of datasets 332 of 2D digital model 330.
  • dataset 510 may be a dataset of datasets 332 of 2D digital model 330 acquired at the end of acclimatization phase and before the cooling phase of the screening procedure (e.g., as described above with respect to Fig. 14).
  • Dataset 510 may include a plurality of data values representing the surface and spectral characteristics of the subject’s body region such as the subject’s breast.
• dataset 510 may include only an IR (e.g., thermal) dataset or image (such as IR dataset 332a described above with respect to Fig. 15B) representing the IR (e.g., thermal) characteristics of the subject’s body region.
• dataset 510 may also include an NIR dataset and/or an RGB dataset as described hereinabove.
  • computing device 501 may calculate a mirrored dataset 512.
  • Mirrored dataset 512 may provide a mirrored representation of dataset 510. For example, data values that appear in the first column in dataset 510 appear in the last column in mirrored dataset 512, data values that appear in the second column in dataset 510 appear one column before the last column in mirrored dataset and so on.
  • Computing device 501 may register dataset 510 and mirrored dataset 512 with respect to each other.
  • the registration may include spatial alignment of dataset 510 and mirrored dataset 512 with respect to each other.
  • the registration may be performed using, for example, a non-rigid registration, a b-spline non-linear alignment and/or any other suitable alignment or registration technique.
  • the registration may be at least partly based on segmentation data indicating the location of the right and left breasts and the location of the right and left nipples.
  • computing device 501 may calculate a difference dataset 514.
  • Difference dataset 514 may include a plurality of data values representing a difference between the dataset 510 and mirrored dataset 512.
  • mirrored dataset 512 may be subtracted from dataset 510.
• hotspots and/or vessels common to both right and left breasts may be effectively removed, focusing difference dataset 514 solely on the discrepancies between the right and left breasts and highlighting asymmetry between the right and left breasts; a minimal illustrative sketch of this computation is given below.
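• By way of non-limiting illustration only, the following Python sketch shows how such a mirrored-difference computation could be performed with numpy; the registration step is only indicated by a comment, since the exact non-rigid alignment method is implementation-specific.

```python
# Illustrative sketch only: compute a difference dataset by mirroring a
# registered thermal frame and subtracting it from the original, so that
# structures common to both sides largely cancel out.
import numpy as np

def mirrored_difference(ir_frame: np.ndarray) -> np.ndarray:
    """ir_frame: 2D array of temperature values covering both subregions."""
    mirrored = np.fliplr(ir_frame)   # last column becomes the first, and so on
    # A non-rigid registration of `mirrored` onto `ir_frame` (e.g., b-spline
    # alignment guided by segmentation of the breasts and nipples) would be
    # applied here; it is omitted from this sketch.
    return ir_frame - mirrored       # symmetric hotspots/vessels largely cancel
```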
  • difference dataset 514 calculated for the subject’s breast is described below with respect to Fig. 17B.
  • computing device 501 may detect a first sub dataset 514a and a second sub dataset 514b.
  • First sub dataset 514a may include a first subset of the data values of difference dataset 514 that represent a first subregion of the subject’s body region (e.g., the right breast in the example of the subject’s breast).
  • Second sub dataset 514b may include a subset of the data values of difference dataset 514 that represent a second subregion of the subject’s body region (e.g., the left breast in the example of the subject’s breast).
• computing device 501 may calculate a first set of spatial features 522 based on first sub dataset 514a and a second set of spatial features 524 based on second sub dataset 514b.
  • first set of spatial features 522 may include a first embedded features vector 522a and/or a first computer vision features vector 522b.
  • Second set of spatial features 524 may include a second embedded features vector 524a and/or a second computer vision features vector 524b.
  • computing device 501 may provide the respective first sub dataset 514a or second sub dataset 514b as an input to a machine learning model 520.
• Machine learning model 520 may be a CNN (e.g., such as a Residual Network (ResNet)).
  • Machine learning model 520 may be trained, for example like first machine learning model 420 as described above with respect to Fig. 16, to calculate a probability of a presence or absence of data values representing the physiological abnormality such as breast cancer in an input dataset (e.g., such as first or second sub dataset 514a, 514b).
  • Training data for training machine learning model 520 may include a plurality of training input datasets (e.g., such as dataset 510) each labeled with a correct output indicating the presence or the absence of data values representing the physiological abnormality in the respective training input dataset.
  • Each of first embedded features vector 522a and second embedded features vector 524a may include a plurality of values (e.g., spatial features) indicative of the presence or the absence of data values representing the physiological abnormality such as breast cancer in the respective first sub dataset 514a or second sub dataset 514b.
• Each of first embedded features vector 522a and second embedded features vector 524a may be an embedded features vector (e.g., a 1D embedded features vector such as spatial features vectors 422 described above with respect to Fig. 16) generated by machine learning model 520 through the processing of the respective first sub dataset 514a or second sub dataset 514b (e.g., as described hereinbelow).
  • Each of first computer vision features vector 522b and second computer vision features vector 524b may include a plurality of values such as mean, variance, skewness, contrast, homogeneity, entropy and/or any other suitable parameter of data values of the respective first sub dataset 514a and second sub dataset 514b.
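• By way of non-limiting illustration only, the following Python sketch shows how such a computer vision features vector could be assembled for one sub dataset using numpy, scipy and scikit-image; the grey-level quantization and the co-occurrence-matrix parameters are assumptions of this sketch.

```python
# Illustrative sketch only: mean, variance, skewness, GLCM contrast,
# GLCM homogeneity and entropy for one sub dataset of temperature values.
import numpy as np
from scipy.stats import skew
from skimage.feature import graycomatrix, graycoprops
from skimage.measure import shannon_entropy

def cv_features(sub: np.ndarray) -> np.ndarray:
    """sub: 2D array of finite temperature values for one subregion."""
    lo, hi = float(sub.min()), float(sub.max())
    q = ((sub - lo) / (hi - lo + 1e-9) * 255).astype(np.uint8)  # 8-bit grey levels
    glcm = graycomatrix(q, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return np.array([
        sub.mean(),                              # mean
        sub.var(),                               # variance
        skew(sub, axis=None),                    # skewness
        graycoprops(glcm, "contrast")[0, 0],     # contrast
        graycoprops(glcm, "homogeneity")[0, 0],  # homogeneity
        shannon_entropy(q),                      # entropy
    ])
```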
  • computing device 501 may calculate TAI biomarker 532.
  • computing device 501 may provide first set of spatial features 522 and second set of spatial features 524 as an input to a machine learning model 530.
  • Machine learning model 530 may be a Support Vector Machine (SVM) model.
  • machine learning model 530 (e.g., SVM) may calculate TAI biomarker 532 and provide TAI biomarker 532 as an output.
  • Other examples of machine learning model 530 may include a Random Forest, Gradient Boosting Machine (GBM), Artificial Neural Network (ANN), K-Nearest Neighbors (KNN), or Logistic Regression model.
  • TAI biomarker 532 may be indicative of the asymmetry (e.g., deviations in symmetry) between the first and second subregions of the subject’s body region such as the right and left breasts, which may suggest the existence of underlying physiological abnormality such as breast cancer.
  • TAI biomarker 532 may be indicative of the existence or absence of the physiological abnormality such as breast cancer in the subject’s body region such as the subject’s breast.
  • Machine learning model 530 may be trained to calculate TAI biomarker 532 based on first set of spatial features 522 (e.g., representing the first subregion of the subject’s body region such as the right breast) and second set of spatial features 524 (e.g., representing the second subregion of the subject’s body region such as the left breast).
  • machine learning model 530 may calculate TAI biomarker 532 and provide TAI biomarker 532 as an output.
  • Training data for training machine learning model 530 may include a plurality of training datasets each labeled with an indication of the presence or absence of the physiological abnormality.
• each of the training datasets may include a first set of spatial features (e.g., such as first set of spatial features 522) representing the first subregion (e.g., such as the right breast), a second set of spatial features (e.g., such as second set of spatial features 524) representing the second subregion (e.g., such as the left breast) of the subject’s body region, and a label indicating the presence or absence of the physiological abnormality.
• machine learning model 530 may use the pairs of sets of spatial features and their corresponding labels to learn the optimal hyperplane for classification.
  • the training process may include solving, by machine learning model 530 (e.g., SVM), an optimization problem to find the hyperplane that maximizes the margin between classes. This process may be performed once using the entire training data or may be repeated with different subsets of the training data for validation purposes. The training process may continue until the optimization criteria are met, ensuring that the model accurately separates the classes.
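• By way of non-limiting illustration only, the following Python sketch shows how an SVM could be trained on concatenated first/second sets of spatial features using scikit-learn; the synthetic data, kernel choice and use of the predicted class probability as the biomarker value are assumptions of this sketch.

```python
# Illustrative sketch only: train an SVM on paired left/right feature sets and
# use the predicted class probability as a TAI-like score.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
first_features = rng.normal(size=(40, 8))    # synthetic stand-in, first subregion
second_features = rng.normal(size=(40, 8))   # synthetic stand-in, second subregion
labels = rng.integers(0, 2, size=40)         # 1 = abnormality present, 0 = absent

X = np.concatenate([first_features, second_features], axis=1)
clf = SVC(kernel="rbf", probability=True)    # probability=True yields a score in [0, 1]
clf.fit(X, labels)

tai_like_score = clf.predict_proba(X[:1])[0, 1]   # probability of the "abnormal" class
```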
  • Fig. 17B is an illustration of difference dataset 514 for calculation of TAI biomarker 532, wherein difference dataset 514 represents the subject’s breast region, according to some embodiments of the invention.
  • difference dataset 514 includes a difference IR image representing the surface and the difference IR characteristics of the subject’s breast.
• In difference dataset 514, hotspots and/or vessels common to both right and left breasts may be effectively removed, focusing difference dataset 514 solely on the discrepancies between the right and left breasts and highlighting asymmetry between the right and left breasts.
• areas that appear hotter in difference dataset 514 may indicate corresponding cooler regions on the opposite breast, and vice versa, areas that appear cooler in difference dataset 514 may indicate corresponding hotter regions on the opposite breast.
  • Fig. 18 shows a computing device 601 for calculating a thermal entropy score (TES) biomarker 632 indicative of the existence or absence of the physiological abnormality (such as breast cancer) in the subject’s body region (such as breast), according to some embodiments of the invention.
  • Computing device 601 may be computing device 230 of system 200 or any other suitable computing device.
  • Computing device 601 may include a processor (such as processor 905 described hereinbelow) that may perform operations described hereinbelow.
  • Thermal entropy score (TES) biomarker 632 may be calculated based on a dataset 610 such as one of datasets 332 of 2D digital model 330.
  • dataset 610 may be a dataset of datasets 332 of 2D digital model 330 acquired at the end of acclimatization phase and before the cooling phase of the screening procedure (e.g., as described above with respect to Fig. 14).
  • Dataset 610 may include data values representing the surface and spectral characteristics of the subject’s body region such as the subject’s breast.
• dataset 610 may include only an IR dataset (e.g., thermal dataset) or IR image (such as IR dataset 332a described above with respect to Fig. 15B) representing the IR characteristics of the subject’s body region.
• dataset 610 may also include an NIR dataset and/or an RGB dataset as described hereinabove.
  • computing device 601 may detect a first sub dataset 612 and a second sub dataset 614.
  • First sub dataset 612 may include a first subset of data values of dataset 610 that represent a first subregion of the subject’s body region (e.g., the right breast in the example of the subject’s breast).
  • Second sub dataset 614 may include a second subset of data values of dataset 610 that represent a second subregion of the subject’s body region (e.g., the left breast in the example of the subject’s breast).
• Based on first sub dataset 612, computing device 601 may detect data values 612a representing hotspots in the first subregion of the subject’s body region such as the right breast. Based on second sub dataset 614, computing device 601 may detect data values 614a representing hotspots in the second subregion of the subject’s body region such as the left breast. Data values 612a, 614a representing the hotspots may be detected based on temperature information inherently represented by the data values of dataset 610 (which may be an IR dataset or IR image as described hereinabove).
• computing device 601 may calculate a first mean temperature of the first subregion based on the data values of first sub dataset 612 and identify the data values whose temperature is greater than the first mean temperature by more than a threshold as data values 612a representing the hotspots in the first subregion.
• computing device 601 may calculate a second mean temperature of the second subregion based on the data values of second sub dataset 614 and identify the data values whose temperature is greater than the second mean temperature by more than the threshold as data values 614a representing the hotspots in the second subregion; a minimal illustrative sketch of this rule is given below.
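• By way of non-limiting illustration only, the following Python sketch shows such a mean-plus-threshold hotspot rule with numpy; the threshold value is an assumption of this sketch.

```python
# Illustrative sketch only: mark data values hotter than the subregion mean
# by more than a fixed threshold as hotspot pixels.
import numpy as np

def hotspot_mask(sub: np.ndarray, threshold: float = 1.0) -> np.ndarray:
    """sub: 2D array of temperatures; returns a boolean hotspot mask."""
    return sub > (sub.mean() + threshold)
```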
• computing device 601 may calculate a first set of spatial features 622 based on first sub dataset 612 and a second set of spatial features 624 based on second sub dataset 614.
  • first set of spatial features 622 may include a first embedded features vector 622a, a first computer vision features vector 622b and/or a first hotspots features vector 622c.
• Second set of spatial features 624 may include a second embedded features vector 624a, a second computer vision features vector 624b and/or a second hotspots features vector 624c.
  • computing device 601 may provide the respective first sub dataset 612 or second sub dataset 614 as an input to a machine learning model 620 (e.g., such as machine learning model 520 described above with respect to Fig. 17A).
  • Each of first computer vision features vector 622b and second computer vision features vector 624b may include a plurality of values such as mean, variance, skewness, contrast, homogeneity, entropy and/or any other suitable parameter of the data values of the respective first sub dataset 612 and second sub dataset 614. These values may be calculated by computing device 601 based on first sub dataset 612 and second sub dataset 614.
• Each of first hotspots features vector 622c and second hotspots features vector 624c may include a plurality of values (e.g., features) such as a number of hotspots, a mean size of hotspots, a mean measure of deviation of the hotspots’ shape from their best-fit ellipse shape, and a mean temperature difference between the hotspots and the hotspots’ surroundings.
• These values or features may be calculated by computing device 601 based on data values 612a and data values 614a representing the hotspots of the first subregion and the second subregion of the subject’s body, respectively; an illustrative sketch of such a computation is given below.
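• By way of non-limiting illustration only, the following Python sketch shows how such hotspot features could be derived from a hotspot mask using scikit-image connected components; the ellipse-deviation measure and the surrounding-ring construction are assumptions of this sketch.

```python
# Illustrative sketch only: number of hotspots, mean hotspot size, mean deviation
# from a best-fit ellipse, and mean temperature contrast against the surroundings.
import numpy as np
from scipy.ndimage import binary_dilation
from skimage.measure import label, regionprops

def hotspot_features(sub: np.ndarray, mask: np.ndarray) -> np.ndarray:
    props = regionprops(label(mask))
    if not props:
        return np.zeros(4)
    sizes = [p.area for p in props]
    ellipse_dev = [
        abs(p.area - np.pi * (p.major_axis_length / 2) * (p.minor_axis_length / 2))
        / max(p.area, 1)
        for p in props
    ]
    ring = binary_dilation(mask, iterations=3) & ~mask   # thin surrounding ring
    contrast = sub[mask].mean() - (sub[ring].mean() if ring.any() else sub.mean())
    return np.array([len(props), np.mean(sizes), np.mean(ellipse_dev), contrast])
```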
  • computing device 601 may calculate TES biomarker 632.
  • computing device 601 may provide first set of spatial features 622 and second set of spatial features 624 as an input to a machine learning model 630 (e.g., SVM model such as machine learning model 530 described above with respect to Fig. 17A).
  • machine learning model 630 may calculate TES biomarker 632 and provide TES biomarker 632 as an output.
  • Machine learning model 630 may operate and may be trained such as machine learning model 530 described above with respect to Fig. 17A.
  • Training data for machine learning model 630 may include a plurality of training datasets each labeled with an indication of the presence or absence of the physiological abnormality.
• each of the training datasets may include a first set of spatial features (e.g., such as first set of spatial features 622) representing the first subregion (e.g., such as the right breast), a second set of spatial features (e.g., such as second set of spatial features 624) representing the second subregion (e.g., such as the left breast) of the subject’s body region, and a label indicating the presence or absence of the physiological abnormality.
  • TES biomarker 632 may be indicative of irregular thermal patterns which may indicate abnormal tissue behavior or metabolic activity in the subject’s body region, which in turn may suggest the existence of underlying physiological abnormality such as breast cancer.
• Fig. 19A shows a computing device 701 for calculating a vascular activity indicator (VAI) biomarker 732 indicative of the existence or absence of the physiological abnormality (such as breast cancer) in the subject’s body region (such as breast), according to some embodiments of the invention.
  • Computing device 701 may be computing device 230 of system 200 or any other suitable computing device.
  • Computing device 701 may include a processor (such as processor 905 described hereinbelow) that may perform operations described hereinbelow.
  • Vascular activity indicator (VAI) biomarker 732 may be calculated based on a dataset 710 such as one of datasets 332 of 2D digital model 330.
  • dataset 710 may be a dataset of datasets 332 of 2D digital model 330 acquired at the end of acclimatization phase and before the cooling phase of the screening procedure (e.g., as described above with respect to Fig. 14).
  • Dataset 710 may include a plurality of data values representing the surface and spectral characteristics of the subject’s body region such as the subject’s breast.
• dataset 710 may include an IR (e.g., thermal) dataset (such as IR dataset or image 332a) representing the IR characteristics and an NIR dataset (such as NIR dataset or image 332b) representing NIR characteristics of the subject’s body region.
• dataset 710 may include only an IR dataset or only an NIR dataset.
• computing device 701 may detect data values of dataset 710 representing vessels in the subject’s body region such as the subject’s breast. For example, a Frangi filtering algorithm may be applied to dataset 710 to detect the data values representing the vessels (an illustrative sketch is given below). Based on the detected data values, computing device 701 may generate a vessels map dataset 712.
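• By way of non-limiting illustration only, the following Python sketch shows how a vessels map could be generated with the Frangi vesselness filter available in scikit-image; the binarization threshold and the assumption that vessels appear warmer (brighter) than their surroundings are assumptions of this sketch.

```python
# Illustrative sketch only: Frangi vesselness filtering followed by a simple
# threshold to obtain a binary vessels map.
import numpy as np
from skimage.filters import frangi

def vessels_map(frame: np.ndarray, threshold: float = 0.05) -> np.ndarray:
    """frame: 2D IR/NIR array; returns a boolean vessels map."""
    vesselness = frangi(frame, black_ridges=False)  # bright, elongated structures
    return vesselness > threshold
```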
  • computing device 701 may detect a first sub dataset 712a and a second sub dataset 712b.
  • First sub dataset 712a may include a first subset of data values of vessels map dataset 712 that represent a first subregion of the subject’s body region (e.g., the right breast in the example of the subject’s breast).
  • Second sub dataset 712b may include a second subset of data values of vessels map dataset 712 that represent a second subregion of the subject’s body region (e.g., the left breast in the example of the subject’s breast).
• Based on first sub dataset 712a of vessels map dataset 712 (e.g., representing the first subregion such as the right breast), computing device 701 may calculate a first set of spatial features 722.
• Based on second sub dataset 712b of vessels map dataset 712 (e.g., representing the second subregion such as the left breast), computing device 701 may calculate a second set of spatial features 724.
  • First set of spatial features 722 may include a plurality of values (e.g., features) related to the vessels in the first subregion of the subject’s body region such as the right breast.
  • Second set of spatial features 724 may include a plurality of values (e.g., features) related to the vessels in the second subregion of the subject’s body region such as the left breast.
  • Each of first set of spatial features 722 and second set of spatial features 724 may include values (e.g., features) such as total length, perimeter, number of intersection points, fractal dimension, median tortuosity and/or any other suitable feature.
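• By way of non-limiting illustration only, the following Python sketch shows how a few of these vessel-network features (skeleton length, perimeter, branch points) could be computed from a binary vessels map; fractal dimension and tortuosity would require additional steps not shown, and the neighbour-counting rule is an assumption of this sketch.

```python
# Illustrative sketch only: total skeleton length, perimeter and number of
# branch (intersection) points of a binary vessels map.
import numpy as np
from scipy.ndimage import convolve
from skimage.measure import perimeter
from skimage.morphology import skeletonize

def vessel_features(vessel_mask: np.ndarray) -> np.ndarray:
    skeleton = skeletonize(vessel_mask)
    total_length = int(skeleton.sum())            # skeleton pixels as a length proxy
    outline = perimeter(vessel_mask)
    neighbours = convolve(skeleton.astype(int), np.ones((3, 3)), mode="constant")
    branch_points = int(((neighbours - 1 >= 3) & skeleton).sum())  # >= 3 neighbours
    return np.array([total_length, outline, branch_points])
```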
  • computing device 701 may calculate VAI biomarker 732.
  • computing device 701 may provide first set of spatial features 722 and second set of spatial features 724 as an input to a machine learning model 730 (e.g., SVM model such as machine learning model 530 described above with respect to Fig. 17A).
  • machine learning model 730 may calculate VAI biomarker 732 and provide VAI biomarker 732 as an output.
  • Machine learning model 730 may operate and may be trained such as machine learning model 530 described above with respect to Fig. 17A.
  • Training data for machine learning model 730 may include a plurality of training datasets each labeled with an indication of the presence or absence of the physiological abnormality.
• each of the training datasets may include a first set of spatial features (e.g., such as first set of spatial features 722) representing the first subregion (e.g., such as the right breast), a second set of spatial features (e.g., such as second set of spatial features 724) representing the second subregion (e.g., such as the left breast) of the subject’s body region, and a label indicating the presence or absence of the physiological abnormality.
• VAI biomarker 732 may be indicative of irregular vessel network patterns which may suggest the existence of underlying physiological abnormality such as breast cancer. The higher the value of VAI biomarker 732, the greater may be the measure of irregular vessel network patterns in the subject’s body region. Accordingly, VAI biomarker 732 may be indicative of the existence or absence of the physiological abnormality such as breast cancer in the subject’s body region such as the subject’s breast.
  • FIG. 19B is an illustration of vessels map dataset 712 for calculation of VAI biomarker 732, wherein vessels map dataset 712 represents the subject’s breast region, according to some embodiments of the invention.
  • Vessels map dataset 712 may be calculated based on dataset 710 such as one of datasets 332 of 2D digital model 330 described above with respect to Fig. 15B. Based on vessels map dataset 712, features such as total length, perimeter, number of intersection points, fractal dimension, median tortuosity and/or any other suitable feature related to the vascular network of the subject’s body region such as subject’s breast may be calculated.
• Fig. 20A shows a computing device 801 for calculating a dynamic anomaly score (DAS) biomarker 832 indicative of the existence or absence of the physiological abnormality (such as breast cancer) in the subject’s body region (such as breast), according to some embodiments of the invention.
  • Computing device 801 may be computing device 230 of system 200 or any other suitable computing device.
  • Computing device 801 may include a processor (such as processor 905 described hereinbelow) that may perform operations described hereinbelow.
• Dynamic anomaly score (DAS) biomarker 832 may be calculated based on a digital model 810 including a plurality of time-sequential datasets representing the subject’s body region over the screening time (e.g., such as 2D digital model 330 described above with respect to Figs. 15A-15B).
  • Computing device 801 may identify a first dataset 812a in digital model 810 that corresponds to the end of cooling phase of the screening procedure. For example, first dataset 812a may be identified based on the timestamp associated with first dataset 812a or by identifying a dataset in digital model 810 that has the lowest mean temperature values (inherently represented by data values of thermal datasets). Computing device 801 may identify a second dataset 812b in digital model 810 that corresponds to the end of stabilization phase of the screening procedure. For example, second dataset 812b may be identified based on the timestamp associated with second dataset 812b or by identifying a dataset in digital model 810 that has the highest mean temperature values (inherently represented by data values of thermal datasets).
• Computing device 801 may register (e.g., spatially align) first dataset 812a and second dataset 812b with respect to each other. Based on first dataset 812a and second dataset 812b, computing device 801 may calculate a difference (e.g., temperature recovery gradient (TRG)) dataset 814. Difference dataset 814 may include a plurality of data values representing a difference between second dataset 812b and first dataset 812a. For example, in order to determine difference dataset 814, first dataset 812a may be subtracted from second dataset 812b. Difference dataset 814 may represent the difference between the final and initial temperature points, providing a direct measure of the thermal recovery rate of the subject’s body region from the thermal stress; a minimal illustrative sketch of this computation is given below.
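• By way of non-limiting illustration only, the following Python sketch shows such a temperature recovery gradient (TRG) computation with numpy, selecting the coolest and warmest frames by mean temperature; spatial registration of the frames is assumed to have been performed beforehand.

```python
# Illustrative sketch only: subtract the frame with the lowest mean temperature
# (end of cooling) from the frame with the highest mean temperature (end of
# stabilization) to obtain a per-pixel thermal recovery map.
import numpy as np

def trg_dataset(frames: np.ndarray) -> np.ndarray:
    """frames: (time, H, W) array of registered thermal frames."""
    means = frames.reshape(frames.shape[0], -1).mean(axis=1)
    first = frames[means.argmin()]    # end of the cooling phase
    second = frames[means.argmax()]   # end of the stabilization phase
    return second - first
```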
• first and second datasets 812a, 812b may include only an IR (e.g., thermal) dataset or image (such as IR dataset 332a described above with respect to Fig. 15B) representing the IR characteristics of the subject’s body region.
• first and second datasets 812a, 812b may also include an NIR dataset and/or an RGB dataset as described hereinabove.
  • computing device 801 may detect a first sub dataset 814a and a second sub dataset 814b.
  • First sub dataset 814a may include a first subset of data values of difference dataset 814 that represent a first subregion of the subject’s body region (e.g., the right breast in the example of the subject’s breast).
  • Second sub dataset 814b may include a subset of data values of difference dataset 814 that represent a second subregion of the subject’s body region (e.g., the left breast in the example of the subject’s breast).
• computing device 801 may calculate a first set of spatial features 822 based on first sub dataset 814a and a second set of spatial features 824 based on second sub dataset 814b.
  • first set of spatial features 822 may include a first embedded features vector 822a and/or a first computer vision features vector 822b.
  • Second set of spatial features 824 may include a second embedded features vector 824a and/or a second computer vision features vector 824b.
  • computing device 801 may provide the respective first sub dataset 814a or second sub dataset 814b as an input to a machine learning model 820 (e.g., such as machine learning model 520 described above with respect to Fig. 17A).
  • Each of first computer vision features vector 822b and second computer vision features vector 824b may include a plurality of values such as mean, variance, skewness, contrast, homogeneity, entropy and/or any other suitable parameter of data values of the respective first sub dataset 814a and second sub dataset 814b.
  • computing device 801 may calculate DAS biomarker 832.
  • computing device 801 may provide first set of spatial features 822 and second set of spatial features 824 as an input to a machine learning model 830 (e.g., such as machine learning model 530 described above with respect to Fig. 17A).
  • machine learning model 830 may calculate DAS biomarker 832 and provide DAS biomarker 832 as an output.
  • Machine learning model 830 may operate and may be trained such as machine learning model 530 described above with respect to Fig. 17A.
  • Training data for machine learning model 830 may include a plurality of training datasets each labeled with an indication of the presence or absence of the physiological abnormality.
• each of the training datasets may include a first set of spatial features (e.g., such as first set of spatial features 822) representing the first subregion (e.g., such as the right breast), a second set of spatial features (e.g., such as second set of spatial features 824) representing the second subregion (e.g., such as the left breast) of the subject’s body region, and a label indicating the presence or absence of the physiological abnormality.
  • DAS biomarker 832 may be further calculated, e.g. by computing device 801, based on an additional dataset other than difference (TRG) dataset 814.
  • the additional dataset may include a variance dataset (e.g., indicative of the thermal fluctuation intensity throughout the recovery or stabilization phase and the uniformity of the recovery process), a skewness and kurtosis dataset (e.g., indicative of the asymmetry and peaks of the temperature evolution and the measure of even or uneven temperature rise in the stabilization phase) and/or an area under the curve (AUC) dataset (e.g., indicative of the total heat recovery over the observed period and of the measure of recovery rate).
• the additional dataset may be calculated based on all intermediate datasets of digital model 810 between first dataset 812a and second dataset 812b belonging to the stabilization (e.g., recovery) phase of the screening procedure.
  • the intermediate datasets may be stabilized (e.g., spatially aligned) with respect to first dataset 812a (e.g., to isolate thermal variations from physical movements, ensuring any observable changes are due to temperature fluctuations over time). This may allow each data value (e.g., pixel) within the intermediate datasets to be transformed into a discrete time series that delineates the thermal evolution of that specific point on the skin surface.
  • the additional dataset may be then processed in the same way as difference dataset 814 to determine at least one additional first set of spatial features and at least one additional second set of spatial features, which can be fed to machine learning model 830 together with first set of spatial features 822 and second set of spatial features 824 to calculate DAS biomarker 832.
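• By way of non-limiting illustration only, the following Python sketch shows how such per-pixel variance, skewness, kurtosis and AUC datasets could be computed from the stabilized intermediate frames using numpy and scipy, treating each pixel as a discrete time series as described above.

```python
# Illustrative sketch only: per-pixel statistics of each pixel's temperature
# time series over the stabilization (recovery) phase.
import numpy as np
from scipy.stats import kurtosis, skew

def temporal_statistics(frames: np.ndarray) -> dict:
    """frames: (time, H, W) array of spatially aligned stabilization-phase frames."""
    return {
        "variance": frames.var(axis=0),
        "skewness": skew(frames, axis=0),
        "kurtosis": kurtosis(frames, axis=0),
        "auc": np.trapz(frames, axis=0),   # total heat recovery per pixel
    }
```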
• Computing device 801 may calculate DAS biomarker 832 based on difference dataset 814 and at least one of or all of the additional datasets, including the variance dataset, the skewness and kurtosis dataset and/or the AUC dataset.
  • DAS biomarker 832 may be indicative of the dynamic thermal reaction and physiological response of the subject’s body region such as the subject’s breast to the induced thermal stress.
  • thermal disparity may be observed between the malignant regions and adjacent healthy tissue characterized by distinct patterns of differential dynamic thermal reaction to the induced thermal stress.
  • Fig. 20B is an illustration of difference dataset 814 for calculation of DAS biomarker 832, wherein difference dataset 814 represents the subject’s breast region, according to some embodiments of the invention.
  • Difference dataset 814 may represent the difference between the final and initial temperature points, providing a direct measure of the thermal recovery rate of the subject’s body region such as the subject’s breast from the thermal stress.
• each of TAI, TES, VAI and DAS biomarkers may be calculated for each of the first subregion (such as the right breast) and the second subregion (such as the left breast) of the subject’s body region (e.g., in addition to or instead of calculating the respective biomarker for the entire subject’s body region).
• the biomarker for each of the first subregion (such as the right breast) and the second subregion (such as the left breast) of the subject’s body region may be calculated in the same way as the respective biomarker would be calculated for the entire subject’s body region, for example by calculating and providing the first set of spatial features representing the first subregion and the second set of spatial features representing the second subregion as an input to a respective machine learning model (e.g., such as machine learning models 530, 630, 730 and 830 as described hereinabove). For example, two separate models for each biomarker may be trained.
• One model may be trained based on training data labeled as “video, left breast with abnormality/left breast without abnormality”, and the other model may be trained based on training data labeled as “video, right breast with abnormality/right breast without abnormality”. This approach may ensure that each model focuses on the asymmetry and abnormalities specific to each of the right breast and the left breast, potentially improving diagnostic accuracy by providing more detailed and localized biomarker values.
  • FIGs. 21A, 21B, 21C, 21D and 21E are illustrations of results of five validation cases of calculated TAI, TES, VAI and DAS biomarkers, according to some embodiments of the invention.
• Each of FIGs. 21A, 21B, 21C, 21D and 21E shows IR images, mammography images and ultrasound images for the respective validation case as described hereinbelow.
• In each of Figs. 21A, 21B, 21C, 21D and 21E, three IR images TI(1), TI(2) and TI(3) are shown.
  • TI(1) shows baseline thermographic images taken at the start of cooling phase of the screening procedure for the respective validation case.
  • TI(2) shows the thermal behavior of breast tissue after the controlled cooling stress test for the respective validation case.
  • TI(3) shows the thermal asymmetry within the breast tissue for the respective validation case.
• Case 1, shown in Fig. 21A, is a 54-year-old postmenopausal patient who presented with a clinically suspicious lump in her left breast.
  • a mammogram of the breasts in two planes was conducted at an external radiological center. On mammogram images, the palpation finding was most likely not represented due to its peripheral location.
  • supplemental tomosynthesis on the left breast in MLO view and ultrasound were performed at the breast unit.
• a 3.6 cm measuring spiculated lump was detected in the 11 o’clock area of the left breast in direct contact with the pectoralis major muscle.
  • the distance to the skin surface was 8 mm.
  • the distance to the nipple was > 5 cm.
  • An ultrasound guided core-cut biopsy showed an invasive hormone-receptor negative, Her2-positive breast cancer, with a Ki67 of 50%.
• a distinct hotspot precisely where the cancer was detected suggests notable heat retention and temperature variation post cooling stress test, diverging significantly from the surrounding tissue’s thermal behavior.
  • the calculated biomarkers for this case are TAI of 0.95, TES of 0.82, VAI of 0.96, DAS of 0.91, which reveal substantial asymmetry, unusual thermal patterns, increased vascular activity, and atypical responses to thermal stress, respectively, indicating presence of malignancy.
• Case 2, shown in Fig. 21B, is a 52-year-old perimenopausal patient who presented with a suspicious lump in her right breast.
  • a breast augmentation for cosmetic reasons was performed in the past.
  • no suspicious lesion could be detected except for suspicious lymph nodes.
  • Ultrasound, and digital breast tomosynthesis were performed as part of her routine diagnostic work-up.
  • a 2.0 cm measuring spiculated lump was detected in the 9 o’clock area of the left breast.
  • the distance to the skin surface was 6 mm, the distance to the implant was 2 mm, the distance to the nipple was not documented.
  • An ultrasound guided core-cut biopsy showed an invasive Luminal B-like breast cancer, with a Ki67 of 30%.
  • a sonographically suspicious axillary lymph node was also biopsied and showed a lymph node metastasis of the breast cancer.
  • Thermal recovery images post-cooling stress highlight an alarming rise in temperature and significant thermal fluctuation within the area impacted by the tumor, indicators of the cancer's aggressive metabolic activity.
• the calculated biomarkers for this case are TAI of 0.97, TES of 0.89, VAI of 0.93, DAS of 0.9, which underscore marked asymmetry, distinct thermal anomalies, enhanced vascular activity, and unusual thermal responses, respectively, indicating presence of malignancy.
• Case 3, shown in Fig. 21C, is a 52-year-old perimenopausal patient who presented with a suspicious lump in her right breast.
  • Ultrasound and digital breast tomosynthesis were performed as part of her routine diagnostic work-up.
  • a 1.4 cm measuring lump was detected in the 12 o’clock area of the right breast.
  • the distance to the skin surface was 7 mm.
  • the distance to the nipple was > 5 cm.
  • the lesion appears as a subtle architectural distortion in both modalities.
  • An ultrasound guided core-cut biopsy showed an invasive Luminal A-like breast cancer, with a Ki67 of 15%.
  • This patient's right breast shows a generalized increase in temperature, suggesting extensive malignant activity that complicates pinpointing the precise tumor location with original thermal asymmetry images alone.
  • Subsequent heat recovery imaging refines the diagnostic area to a spot just above the nipple, providing a clearer indication of malignancy.
  • This tumor is classified as pTlc, positioned at the right breast’s 12 o’clock location.
  • the calculated biomarkers for this case are TAI of 0.91, TES of 0.92, VAI of 0.89, DAS of 0.92, which reveal significant asymmetry, unusual thermal patterns, vascular activity changes, and distinctive thermal response behaviors, respectively, indicating presence of malignancy.
• Case 4, shown in Fig. 21D, is a 62-year-old postmenopausal patient who presented with a suspicious lump in her right breast.
  • Ultrasound, mammography, and digital breast tomosynthesis were performed as part of her routine diagnostic work-up.
  • a 1.8 cm measuring lump was detected in the 9 o’clock area of the right breast.
• the distance to the skin surface was 7 mm.
  • the distance to the nipple was > 5 cm.
  • An ultrasound guided core-cut biopsy showed an invasive triple negative breast cancer, with a Ki67 of 70%.
  • This area's raised temperature reflects the tumor's metabolic activity.
  • the calculated biomarkers in this case are TAI of 0.85, TES of 0.93, VAI of 0.92, DAS of 0.91, which reveal not just the abnormal thermal asymmetry but also the distinct textural and vascular changes, as well as the tissue's unique reaction to thermal stress, respectively, indicating presence of malignancy.
• Case 5, shown in Fig. 21E, is a 61-year-old postmenopausal patient who presented at the breast unit for her routine checkup without any history of breast cancer. Ultrasound, mammography, and digital breast tomosynthesis were performed as part of her routine diagnostic work-up and were evaluated as BIRADS 2 each. In the thermal image analysis, no abnormal areas were detected. This healthy case reveals negligible thermal anomalies.
• the calculated biomarkers for this case are TAI of 0.15, TES of 0.05, VAI of 0.10 and DAS of 0.07, which confirm the thermal stability and consistency characteristic of breast tissue free from pathological changes.
  • FIG. 22 is a flowchart of a method of generating the digital model representing the subject’s body region, according to some embodiments of the invention.
  • the method may be performed using a computing device such as computing device 230 of system 200 and/or any other suitable computing device.
  • a plurality of time-sequential sensor datasets may be received.
  • Each of the sensor datasets may include a plurality of data values representing the subject’s body region such as subject’s breast at a certain timestep in the screening time.
  • the plurality of time-sequential sensor datasets may be received from sensors of imaging devices (e.g., such as imaging devices 210 of system 200).
  • the plurality of time-sequential sensor datasets may be registered in a reference coordinate system.
  • the registration may include integration of data from sensors of the imaging devices to create the representation of the subject’s body region.
  • the registration may be at least partly based on calibration data of the sensors of the imaging devices (e.g., as described above with respect to Fig. 15A).
  • the registration may include spatial alignment of the plurality of time-sequential sensor datasets in the reference coordinate system (e.g., as described above with respect to Fig. 15A).
  • the plurality of time-sequential sensor datasets may be combined to provide a three-dimensional (3D) digital model (e.g., as described above with respect to Fig. 15A).
  • the 3D digital model may include a plurality of time-sequential datasets (e.g., video) representing in 3D the surface of the subject’s body region, the spectral (e.g., thermal and/or NIR) characteristics of the subject’s body region and the variation of the spectral characteristics of the subject’s body region over the screening time (e.g., such as 3D digital model 320 described above with respect to Fig. 15A).
  • a continuous 3D surface representing the subject’s body region such as the subject’s breast may be generated (e.g., as described above with respect to Fig. 15A).
  • computing device 301 may generate a 2D digital model (e.g., as described above with respect to Fig. 15A).
  • the 2D digital model may include a plurality of time-sequential datasets (e.g., images) (e.g., video) representing in 2D the surface of the subject’s body region, the spectral (e.g., thermal and/or NIR) characteristics of the subject’s body region and the variation of the spectral characteristics of the subject’s body region over the screening time (e.g., such as 2D digital model 330 described above with respect to Figs. 15A and 15B).
• data values of the plurality of time-sequential datasets of the 3D digital model may be modified to represent the surface of the subject’s body region and the spectral (e.g., IR and/or NIR) characteristics of the subject’s body region in 2D to generate the 2D digital model.
• the modification may include parametrization such as harmonic parametrization, As-Rigid-As-Possible (ARAP) parametrization in 2D and/or any other suitable parametrization method to ensure minimal distortion in the final 2D datasets (e.g., as described above with respect to Fig. 15A).
  • FIG. 23 is a flowchart of a method of determining an existence or absence of a physiological abnormality in a subject’s body region, according to some embodiments of the invention.
  • the method may be performed using a computing device such as computing device 230 of system 200 and/or any other suitable computing device.
  • a plurality of time-sequential datasets of a digital model may be received (e.g., such as datasets 412 of digital model 410 described above with respect to Fig. 16).
  • Each of the datasets may include a plurality of data values representing a surface and spectral characteristics of the subject’s body region at a certain timestep in a screening time.
  • a plurality of spatial features vectors may be calculated (e.g., such as spatial features embedded vectors 422 described above with respect to Fig. 16).
• Each of the spatial features vectors may include a plurality of spatial features indicative of a presence or absence of data values representing the physiological abnormality in a corresponding dataset of the plurality of time-sequential datasets.
  • a corresponding dataset of the plurality of time-sequential datasets may be provided as an input to a first machine learning model (e.g., such as first machine learning model 420 described above with respect to Fig. 16).
  • the spatial features vector may be a one-dimensional embedded features vector generated by the first machine learning model through processing of the corresponding dataset (e.g., as described above with respect to Fig. 16).
  • a temporal features vector may be calculated (e.g., temporal features vector 432 described above with respect to Fig. 16).
  • the temporal features vector may include a plurality of temporal features indicative of a temporal dependency of the spatial features of the plurality of spatial features vectors over the screening time (e.g., as described above with respect to Fig. 16).
  • the plurality of spatial features vectors may be provided as an input to a second machine learning model (e.g., second machine learning model 430 described above with respect to Fig. 16).
  • the temporal features may be a hidden state vector generated by the second machine learning model at a final timestep (e.g., as described above with respect to Fig. 16).
  • a probability of the existence or absence of the physiological abnormality in the subject’s body region may be calculated (e.g., as described above with respect to Fig. 16).
  • FIG. 24 is a flowchart of a method of calculating a biomarker indicative of an existence or absence of a physiological abnormality in a subject’s body region, according to some embodiments of the invention.
  • the method may be performed using a computing device such as computing device 230 of system 200 and/or any other suitable computing device.
  • a plurality of time-sequential datasets of a digital model may be received (e.g., such as datasets 332 of digital model 330 described above with respect to Figs. 15A-15B).
  • Each of the datasets may include a plurality of data values representing a surface and spectral characteristics of the subject’s body region at a certain timestep in a screening time.
  • a dataset of the plurality of time-sequential datasets may be selected.
  • the dataset may be selected based on its timestep label corresponding to a desired timestep in the screening time (e.g., as described above with respect to Figs. 17A, 18, 19A, 20A).
  • a first sub dataset may be detected (e.g., as described above with respect to Figs. 17A, 18, 19A, 20A).
  • the first sub dataset may include data values of the plurality of data values of the dataset representing a first subregion of the subject’s body region (e.g., as described above with respect to Figs. 17A, 18, 19A, 20A).
• a second sub dataset may be detected (e.g., as described above with respect to Figs. 17A, 18, 19A, 20A).
  • the second sub dataset may include data values of the plurality of data values of the dataset representing a second subregion of the subject’s body region (e.g., as described above with respect to Figs. 17A, 18, 19A, 20A).
• a first set of spatial features of the first subregion of the subject’s body region may be calculated (e.g., as described above with respect to Figs. 17A, 18, 19A, 20A).
• a second set of spatial features of the second subregion of the subject’s body region may be calculated (e.g., as described above with respect to Figs. 17A, 18, 19A, 20A).
• Each of the first set of spatial features and the second set of spatial features may include an embedded features vector and/or a computer vision features vector calculated based on the respective first sub dataset or the second sub dataset (e.g., as described above with respect to Figs. 17A, 18, 19A, 20A).
• the respective first sub dataset or the second sub dataset may be provided as an input to a machine learning model (e.g., such as machine learning model 520, 620, 820 described above with respect to Figs. 17A, 18, 19A, 20A).
• the embedded features vector of each of the first set of spatial features and the second set of spatial features may be a one-dimensional embedded features vector generated by the machine learning model through processing of the respective first sub dataset or the second sub dataset (e.g., as described above with respect to Figs. 17A, 18, 19A, 20A).
• the computer vision features vector of each of the first set of spatial features and the second set of spatial features may include at least one of mean, variance, skewness, contrast, homogeneity and entropy of data values of the respective first sub dataset or the second sub dataset (e.g., as described above with respect to Figs. 17A, 18, 19A, 20A).
  • the biomarker indicative of the existence or absence of the physiological abnormality in the subject’s body region may be calculated (e.g., as described above with respect to Figs. 17A, 18, 19A, 20A).
  • the first set of spatial features vector and the second set of spatial features vector may be provided as an input to a machine learning model (e.g., such as machine learning model 530, 630, 730, 830 as described above with respect to Figs. 17A, 18, 19A, 20A).
  • Comparing the first subregion and the second subregion of the subject’s body region may reduce false positive determinations by ensuring that detected abnormalities are not just variations of normal anatomy but are indeed significant differences indicative of potential physiological abnormalities such as breast cancer. For example, computing the mean temperature on an asymmetric dataset (e.g., image) without separating the dataset into the first subregion and the second subregion may yield a result close to zero, even if there is high asymmetry in the dataset (e.g., image).
• a first biomarker indicative of the existence or absence of the physiological abnormality in the first subregion of the subject’s body region may be calculated based on the first set of spatial features (e.g., by providing the first set of spatial features as an input to a machine learning model such as machine learning model 530, 630, 730, 830 as described above with respect to Figs. 17A, 18, 19A, 20A).
• a second biomarker indicative of the existence or absence of the physiological abnormality in the second subregion of the subject’s body region may be calculated based on the second set of spatial features (e.g., by providing the second set of spatial features as an input to a machine learning model such as machine learning model 530, 630, 730, 830 as described above with respect to Figs. 17A, 18, 19A, 20A).
• Fig. 25 is a flowchart of a method of calculating the thermal asymmetry index (TAI) biomarker indicative of the existence or absence of the physiological abnormality in the subject’s body region, according to some embodiments of the invention.
  • the method may be performed using a computing device such as computing device 230 of system 200 and/or any other suitable computing device.
  • the TAI biomarker (e.g., such as TAI biomarker 532) may be indicative of the asymmetry (e.g., deviations in symmetry) between the first and second subregions of the subject’s body region such as the right and left breasts, which may suggest the existence of underlying physiological abnormality such as breast cancer (e.g., as described above with respect to Fig. 17A).
  • a plurality of time-sequential datasets of a digital model may be received (e.g., such as datasets 332 of digital model 330 described above with respect to Figs. 15A-15B).
  • Each of the datasets may include a plurality of data values representing a surface and spectral characteristics of the subject’s body region at a certain timestep in a screening time.
  • a dataset (e.g., such as dataset 510 described above with respect to Fig. 17A) of the plurality of time-sequential datasets may be selected.
  • the selected dataset may be the dataset acquired at the end of acclimatization phase and before the cooling phase of the screening procedure (e.g., as described above with respect to Fig. 17A).
  • a mirrored dataset (e.g., mirrored dataset 512 described above with respect to Fig. 17A) may be calculated (e.g., as described above with respect to Fig. 17A).
  • the mirrored dataset may provide a mirrored representation of the dataset.
  • a difference dataset (e.g., such as difference dataset 514) may be calculated (e.g., as described above with respect to Fig. 17A).
  • Difference dataset 514 may include a plurality of data values representing a difference between the dataset and the mirrored dataset.
  • a first sub dataset (e.g., first sub dataset 514a) representing a first subregion of the subject’s body region may be calculated.
  • a second sub dataset (e.g., second sub dataset 514b) representing a second subregion of the subject’s body region may be calculated.
  • a first set of spatial features (e.g., such as first set of spatial features 522) may be calculated (e.g., as described above with respect to Fig. 17A).
  • a second set of spatial features (e.g., such as second set of spatial features 524) may be calculated (e.g., as described above with respect to Fig. 17A).
  • Each of the first set of spatial features and the second set of spatial features may include an embedded features vector (e.g., first and second embedded features vectors 522a, 524a, respectively) and/or a computer vision features vector (e.g., first and second computer vision vectors 522b, 524b, respectively) calculated based on the respective first sub dataset or the second sub dataset (e.g., as described above with respect to Fig. 17A).
• the respective first sub dataset or the second sub dataset may be provided as an input to a machine learning model (e.g., such as machine learning model 520 described above with respect to Fig. 17A).
  • the embedded features vector of each of the first set of spatial features and the second set of spatial features may be a one-dimensional embedded features vector generated by the machine learning model through processing of the respective first sub dataset or the second sub dataset (e.g., as described above with respect to Fig. 17A).
  • the computer vision features vector of each of the first set of spatial features and the second set of spatial features may include at least one of mean, variance, skewness, contrast, homogeneity and entropy of data values of the respective first data subset or the second data subset (e.g., as described above with respect to Fig. 17A).
  • the TAI biomarker (e.g., TAI biomarker 532) may be calculated (e.g., as described above with respect to Fig. 17A).
• the first set of spatial features vector and the second set of spatial features vector may be provided as an input to a machine learning model (e.g., such as machine learning model 530 described above with respect to Fig. 17A).
• a first TAI biomarker may be calculated for the first subregion of the subject’s body region based on the first set of spatial features (e.g., by providing the first set of spatial features as an input to a machine learning model such as machine learning model 530 as described above with respect to Fig. 17A).
• a second TAI biomarker may be calculated for the second subregion of the subject’s body region based on the second set of spatial features (e.g., by providing the second set of spatial features as an input to a machine learning model such as machine learning model 530 as described above with respect to Fig. 17A).
  • Fig. 26 is a flowchart of a method of calculating the thermal entropy score (TES) biomarker indicative of the existence or absence of the physiological abnormality in the subject’s body region, according to some embodiments of the invention.
  • the method may be performed using a computing device such as computing device 230 of system 200 and/or any other suitable computing device.
• The TES biomarker (e.g., such as TES biomarker 632) may be indicative of irregular thermal patterns which may indicate abnormal tissue behavior or metabolic activity in the subject’s body region, which in turn may suggest the existence of underlying physiological abnormality such as breast cancer (e.g., as described above with respect to Fig. 18).
  • a plurality of time-sequential datasets of a digital model may be received (e.g., such as datasets 332 of digital model 330 described above with respect to Figs. 15A-15B).
  • Each of the datasets may include a plurality of data values representing a surface and spectral characteristics of the subject’s body region at a certain timestep in a screening time.
  • a dataset (e.g., such as dataset 610 described above with respect to Fig. 18) of the plurality of time-sequential datasets may be selected.
  • the selected dataset may be the dataset acquired at the end of acclimatization phase and before the cooling phase of the screening procedure (e.g., as described above with respect to Fig. 18).
• a first sub dataset (e.g., such as first sub dataset 612) may be detected.
  • the first sub dataset may include a first subset of data values of the dataset that represent a first subregion of the subject’s body region.
  • a second sub dataset (e.g., such as second sub dataset 614) may be detected.
• the second sub dataset may include a second subset of data values of the dataset that represent a second subregion of the subject’s body region.
• data values (e.g., data values 612a) representing hotspots in the first subregion of the subject’s body region may be detected.
• data values (e.g., data values 614a) representing hotspots in the second subregion of the subject’s body region may be detected.
  • a first set of spatial features (e.g., such as first set of spatial features 622) may be calculated (e.g., as described above with respect to Fig. 18).
  • a second set of spatial features (e.g., such as second set of spatial features 624) may be calculated (e.g., as described above with respect to Fig. 18).
• Each of the first set of spatial features and the second set of spatial features may include an embedded features vector (e.g., first and second embedded features vectors 622a, 624a, respectively), a computer vision features vector (e.g., first and second computer vision features vectors 622b, 624b, respectively) and/or a hotspots features vector (e.g., such as first and second hotspots features vectors 622c, 624c, respectively) calculated based on the respective first sub dataset or the second sub dataset (e.g., as described above with respect to Fig. 18).
• the respective first sub dataset or the second sub dataset may be provided as an input to a machine learning model (e.g., such as machine learning model 620 described above with respect to Fig. 18).
  • the embedded features vector of each of the first set of spatial features and the second set of spatial features may be a one-dimensional embedded features vector generated by the machine learning model through processing of the respective first sub dataset or the second sub dataset (e.g., as described above with respect to Fig. 18).
  • the computer vision features vector of each of the first set of spatial features and the second set of spatial features may include at least one of mean, variance, skewness, contrast, homogeneity and entropy of data values of the respective first data subset or the second data subset (e.g., as described above with respect to Fig. 18).
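  • As a minimal sketch only, the statistical and texture features named above (mean, variance, skewness, contrast, homogeneity, entropy) could be computed from a two-dimensional sub dataset as follows; the grey-level quantization, the GLCM settings and the function name are assumptions rather than the patent's implementation.

        import numpy as np
        from scipy.stats import skew
        from skimage.feature import graycomatrix, graycoprops

        def computer_vision_features(sub_dataset: np.ndarray) -> np.ndarray:
            # First-order statistics of the temperature values.
            values = sub_dataset.ravel()
            mean, variance, skewness = values.mean(), values.var(), skew(values)
            # Texture features (contrast, homogeneity) from a grey-level co-occurrence matrix.
            levels = 64
            edges = np.linspace(values.min(), values.max(), levels)
            quantised = (np.digitize(sub_dataset, edges) - 1).clip(0, levels - 1).astype(np.uint8)
            glcm = graycomatrix(quantised, distances=[1], angles=[0], levels=levels, normed=True)
            contrast = graycoprops(glcm, "contrast")[0, 0]
            homogeneity = graycoprops(glcm, "homogeneity")[0, 0]
            # Shannon entropy of the grey-level histogram.
            hist, _ = np.histogram(quantised, bins=levels, range=(0, levels), density=True)
            entropy = -np.sum(hist[hist > 0] * np.log2(hist[hist > 0]))
            return np.array([mean, variance, skewness, contrast, homogeneity, entropy])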
  • the hotspots features vector of each of the first set of spatial features and the second set of spatial features may include a plurality of values (e.g., features) such as number of hotspots, mean size of hotspots, mean measure of deviation of hotspots’ shape from their best-fit ellipse shape, mean temperature difference between the hotspots and the hotspots’ surrounding (e.g., as described above with respect to Fig. 18). These values or features may be calculated based on the data values representing the hotspots of the first subregion and the second subregion of the subject’s body, respectively (e.g., as described above with respect to Fig. 18).
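  • As a non-authoritative sketch of how such hotspot features could be derived, the example below labels connected regions warmer than the subregion mean plus two standard deviations as hotspots; that threshold rule and the helper name are assumptions not stated in the patent.

        import numpy as np
        from skimage.measure import label, regionprops

        def hotspot_features(sub_dataset: np.ndarray, k: float = 2.0) -> np.ndarray:
            # Treat connected regions warmer than mean + k*std as hotspots (assumed rule).
            mask = sub_dataset > sub_dataset.mean() + k * sub_dataset.std()
            regions = regionprops(label(mask), intensity_image=sub_dataset)
            if not regions:
                return np.zeros(4)
            background = sub_dataset[~mask].mean() if (~mask).any() else sub_dataset.mean()
            sizes, ellipse_dev, temp_diff = [], [], []
            for r in regions:
                sizes.append(r.area)
                # Deviation of the hotspot's shape from its best-fit ellipse (area ratio).
                ellipse_area = np.pi * (r.axis_major_length / 2) * (r.axis_minor_length / 2)
                ellipse_dev.append(abs(r.area - ellipse_area) / max(ellipse_area, 1e-6))
                # Temperature difference between the hotspot and its surroundings.
                temp_diff.append(r.mean_intensity - background)
            # Number of hotspots, mean size, mean ellipse deviation, mean temperature difference.
            return np.array([len(regions), np.mean(sizes), np.mean(ellipse_dev), np.mean(temp_diff)])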
  • the TES biomarker (e.g., TES biomarker 632) may be calculated (e.g., as described above with respect to Fig. 18).
  • the first set of spatial features and the second set of spatial features may be provided as an input to a machine learning model (e.g., such as machine learning model 630 described above with respect to Fig. 18).
  • a first TES biomarker may be calculated for the first subregion of the subject’s body region based on the first set of spatial features (e.g., by providing the first set of spatial features as an input to a machine learning model such as machine learning model 630 as described above with respect to Fig. 18).
  • a second TES biomarker may be calculated for the second subregion of the subject’s body region based on the second set of spatial features (e.g., by providing the second set of spatial features as an input to a machine learning model such as machine learning model 630 as described above with respect to Fig. 18).
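  • For illustration only: machine learning model 630 is not characterized in this excerpt, so the sketch below assumes a scikit-learn gradient-boosting classifier as a stand-in and treats the predicted probability of the abnormal class as the TES biomarker. Passing only the first or only the second set of spatial features in an analogous way would correspond to the per-subregion first and second TES biomarkers described above.

        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier

        def tes_biomarker(model: GradientBoostingClassifier,
                          first_spatial_features: np.ndarray,
                          second_spatial_features: np.ndarray) -> float:
            # Concatenate the embedded, computer-vision and hotspot feature vectors of
            # both subregions into a single input vector for the (pre-trained) model.
            x = np.concatenate([first_spatial_features, second_spatial_features])[None, :]
            # Probability of the "abnormality present" class is used as the TES biomarker.
            return float(model.predict_proba(x)[0, 1])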
  • Fig. 27 is a flowchart of a method of calculating a vascular activity indicator (VAI) biomarker indicative of the existence or absence of the physiological abnormality in the subject’s body region, according to some embodiments of the invention.
  • the method may be performed using a computing device such as computing device 230 of system 200 and/or any other suitable computing device.
  • the VAI biomarker (e.g., such as VAI biomarker 732) may be indicative of irregular vessel network patterns which may suggest the existence of an underlying physiological abnormality such as breast cancer (e.g., as described above with respect to Fig. 19A).
  • a plurality of time-sequential datasets of a digital model may be received (e.g., such as datasets 332 of digital model 330 described above with respect to Figs. 15A-15B).
  • Each of the datasets may include a plurality of data values representing a surface and spectral characteristics of the subject’s body region at a certain timestep in a screening time.
  • a dataset (e.g., such as dataset 710 described above with respect to Fig. 19A) of the plurality of time-sequential datasets may be selected.
  • the selected dataset may be the dataset acquired at the end of acclimatization phase and before the cooling phase of the screening procedure (e.g., as described above with respect to Fig. 19A).
  • a vessels map dataset (e.g., such as a vessels map dataset 712) may be generated (e.g., as described above with respect to Fig. 19A).
  • data values of the dataset representing vessels in the subject’s body region such as the subject’s breast may be detected and the vessels map may be generated based on these data values (e.g., as described above with respect to Fig. 19A).
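  • The excerpt does not state how the vessel data values are detected; as one plausible sketch, a vesselness ("Frangi") filter followed by automatic thresholding could produce a binary vessels map, as shown below. The choice of filter, the bright-ridge assumption and the threshold are all assumptions.

        import numpy as np
        from skimage.filters import frangi, threshold_otsu

        def vessels_map(dataset: np.ndarray) -> np.ndarray:
            # Enhance elongated, vessel-like structures (vessels assumed to appear as
            # bright/warm ridges in the thermal or NIR data, hence black_ridges=False).
            vesselness = frangi(dataset.astype(float), black_ridges=False)
            # Keep the strongest responses as a binary vessels map dataset.
            return vesselness > threshold_otsu(vesselness)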
  • a first sub dataset (e.g., first sub dataset 712a) may be generated (e.g., as described above with respect to Fig. 19A).
  • the first sub dataset may include a first subset of data values of the vessels map dataset that represent a first subregion of the subject’s body region (e.g., as described above with respect to Fig. 19A).
  • a second sub dataset (e.g., second sub dataset 712b) may be generated (e.g., as described above with respect to Fig. 19A).
  • the second sub dataset may include a second subset of data values of the vessels map dataset that represent a second subregion of the subject’s body region (e.g., as described above with respect to Fig. 19A).
  • a first set of spatial features (e.g., such as first set of spatial features 722) may be calculated (e.g., as described above with respect to Fig. 19A).
  • a second set of spatial features (e.g., such as second set of spatial features 724) may be calculated (e.g., as described above with respect to Fig. 19A).
  • Each of the first set of spatial features and the second set of spatial features may include a plurality of values (e.g., features) related to the vessels in the respective first subregion or the second subregion of the subject’s body region.
  • Each of the first set of spatial features and the second set of spatial features may include values (e.g., features) such as total length, perimeter, number of intersection points, fractal dimension, median tortuosity and/or any other suitable feature.
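  • As a rough, non-authoritative sketch, several of these vessel features (total length, number of intersection points and fractal dimension) could be approximated from a binary vessels sub dataset via its skeleton, as shown below; the skeleton-based approximations and box sizes are assumptions.

        import numpy as np
        from scipy.ndimage import convolve
        from skimage.morphology import skeletonize

        def vessel_spatial_features(vessel_mask: np.ndarray) -> np.ndarray:
            skeleton = skeletonize(vessel_mask.astype(bool))
            # Total vessel length approximated by the number of skeleton pixels.
            total_length = int(skeleton.sum())
            # Intersection points: skeleton pixels with three or more skeleton neighbours.
            kernel = np.ones((3, 3))
            kernel[1, 1] = 0
            neighbours = convolve(skeleton.astype(int), kernel, mode="constant")
            n_intersections = int(np.sum(skeleton & (neighbours >= 3)))
            # Box-counting estimate of the fractal dimension of the vessel skeleton.
            sizes, counts = [2, 4, 8, 16, 32], []
            for s in sizes:
                h, w = (skeleton.shape[0] // s) * s, (skeleton.shape[1] // s) * s
                blocks = skeleton[:h, :w].reshape(h // s, s, w // s, s)
                counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))
            slope, _ = np.polyfit(np.log(sizes), np.log(np.maximum(counts, 1)), 1)
            fractal_dimension = -slope
            return np.array([total_length, n_intersections, fractal_dimension])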
  • the VAI biomarker (e.g., VAI biomarker 732) may be calculated (e.g., as described above with respect to Fig. 19A).
  • the first set of spatial features and the second set of spatial features may be provided as an input to a machine learning model (e.g., such as machine learning model 730 described above with respect to Fig. 19A).
  • a first VAI biomarker may be calculated for the first subregion of the subject’s body region based on the first set of spatial features (e.g., by providing the first set of spatial features as an input to a machine learning model such as machine learning model 730 as described above with respect to Fig. 19A).
  • a second VAI biomarker may be calculated for the second subregion of the subject’s body region based on the second set of spatial features (e.g., by providing the second set of spatial features as an input to a machine learning model such as machine learning model 730 as described above with respect to Fig. 19A).
  • Fig. 28 is a flowchart of a method of calculating a dynamic anomaly score (DAS) biomarker indicative of the existence or absence of the physiological abnormality in the subject’s body region, according to some embodiments of the invention.
  • the method may be performed using a computing device such as computing device 230 of system 200 and/or any other suitable computing device.
  • the DAS biomarker (e.g., DAS biomarker 832) may be indicative of the dynamic thermal reaction and physiological response of the subject’s body region such as the subject’s breast to the induced thermal stress (e.g., as described above with respect to Fig. 20A).
  • a plurality of time-sequential datasets of a digital model may be received (e.g., such as datasets 332 of digital model 330 described above with respect to Figs. 15A-15B).
  • Each of the datasets may include a plurality of data values representing a surface and spectral characteristics of the subject’s body region at a certain timestep in a screening time.
  • a first sub dataset (e.g., first sub dataset 814a) representing a first subregion of the subject’s body region may be calculated.
  • a second sub dataset (e.g., second sub dataset 814b) representing a second subregion of the subject’s body region may be calculated.
  • a first set of spatial features (e.g., such as first set of spatial features 822) may be calculated (e.g., as described above with respect to Fig. 20A).
  • a second set of spatial features (e.g., such as second set of spatial features 824) may be calculated (e.g., as described above with respect to Fig. 20A).
  • Each of the first set of spatial features and the second set of spatial features may include an embedded features vector (e.g., first and second embedded features vectors 822a, 824a, respectively) and/or a computer vision features vector (e.g., first and second computer vision vectors 822b, 824b, respectively) calculated based on the respective first sub dataset or the second sub dataset (e.g., as described above with respect to Fig. 20A).
  • the respective first sub dataset or the second sub dataset may be provided as an input to a machine learning model (e.g., such as machine learning model 820 described above with respect to Fig. 20A).
  • the embedded features vector of each of the first set of spatial features and the second set of spatial features may be a one-dimensional embedded features vector generated by the machine learning model through processing of the respective first sub dataset or the second sub dataset (e.g., as described above with respect to Fig. 20A).
  • the computer vision features vector of each of the first set of spatial features and the second set of spatial features may include at least one of mean, variance, skewness, contrast, homogeneity and entropy of data values of the respective first data subset or the second data subset (e.g., as described above with respect to Fig. 20A).
  • the DAS biomarker (e.g., DAS biomarker 832) may be calculated (e.g., as described above with respect to Fig. 20A).
  • the first set of spatial features and the second set of spatial features may be provided as an input to a machine learning model (e.g., such as machine learning model 830 described above with respect to Fig. 20A).
  • the DAS biomarker may be further calculated based on an additional dataset other than the difference (TRG) dataset (as described above with respect to Fig. 20A).
  • the additional dataset may include a variance dataset (e.g., indicative of the thermal fluctuation intensity throughout the recovery or stabilization phase and the uniformity of the recovery process), a skewness and kurtosis dataset (e.g., indicative of the asymmetry and peaks of the temperature evolution and the measure of even or uneven temperature rise in the stabilization phase) and/or an area under the curve (AUC) dataset (e.g., indicative of the total heat recovery over the observed period and of the measure of recovery rate).
  • the additional dataset may be calculated based on all intermediate datasets between the first dataset and the second dataset of the digital model belonging to the stabilization (e.g., recovery) phase of the screening procedure (e.g., based on all such datasets).
  • the intermediate datasets may be stabilized (e.g., spatially aligned) with respect to the first dataset (e.g., to isolate thermal variations from physical movements, ensuring any observable changes are due to temperature fluctuations over time). This may allow each data value (e.g., pixel) within the intermediate datasets to be transformed into a discrete time series that delineates the thermal evolution of that specific point on the skin surface.
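  • As a minimal sketch (array names, shapes and the use of SciPy are assumptions), the additional per-pixel datasets described above could be computed from a stack of spatially aligned intermediate frames as follows.

        import numpy as np
        from scipy.stats import kurtosis, skew

        def additional_temporal_datasets(aligned_frames: np.ndarray, timesteps: np.ndarray) -> dict:
            # aligned_frames: (T, H, W) stack of stabilized thermal frames from the
            # recovery phase, so every pixel is a discrete time series of length T.
            return {
                "variance": aligned_frames.var(axis=0),
                "skewness": skew(aligned_frames, axis=0),
                "kurtosis": kurtosis(aligned_frames, axis=0),
                # Total heat recovery over the observed period (trapezoidal AUC per pixel).
                "auc": np.trapz(aligned_frames, x=timesteps, axis=0),
            }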
  • the additional dataset may then be processed in the same way as the difference dataset to determine at least one additional first set of spatial features and at least one additional second set of spatial features, which can be fed to the machine learning model together with the first set of spatial features and the second set of spatial features to calculate the DAS biomarker.
  • the DAS biomarker may be calculated based on the difference dataset and at least one of, or all of, the additional datasets, including the variance dataset, the skewness and kurtosis dataset and/or the AUC dataset.
  • a first DAS biomarker may be calculated for the first subregion of the subject’s body region based on the first set of spatial features (e.g., by providing the first set of spatial features as an input to a machine learning model such as machine learning model 830 as described above with respect to Fig. 20A).
  • a second DAS biomarker may be calculated for the second subregion of the subject’s body region based on the second set of spatial features (e.g., by providing the second set of spatial features as an input to a machine learning model such as machine learning model 830 as described above with respect to Fig. 20A).
  • Embodiments of the present invention may provide a continuous, real-time monitoring of pathophysiological changes in the subject’s body region such as the subject’s breast.
  • This comprehensive temporal analysis, augmented by a multi-sensor approach, may translate continuous dynamics into the DAS biomarker.
  • a distinctive feature of our approach is the utilization of the digital model concept, which, in some embodiments, may synthesize data from multiple sensors into a cohesive, two-dimensional image of the subject’s body region. This technique not only mitigates the issues related to patient movement during screening but also may address the challenge of data redundancy, which arises when multiple cameras capture the same anatomical region.
  • the incorporation of NIR imaging into the multimodal strategy may enhance the visualization and analysis of vascular structures, surpassing the capabilities of traditional thermal imaging alone.
  • the digital model framework significantly improves the precision of asymmetry assessments by providing a detailed, unified view of the breast.
  • This integrated approach, coupled with the identification and quantification of specific digital biomarkers, may provide a screening risk framework with improved diagnostic accuracy and interpretability.
  • Fig. 30 is a block diagram of an exemplary computing device which may be used with embodiments of the present invention.
  • Computing device 1600 may include a controller or processor 1605 that may be, for example, a central processing unit processor (CPU), a chip or any suitable computing or computational device, an operating system 1615, a memory 1620, a storage 1630, input devices 1635 and output devices 1640.
  • Operating system 1615 may be or may include any code segment designed and/or configured to perform tasks involving coordination, scheduling, arbitration, supervising, controlling or otherwise managing operation of computing device 1600, for example, scheduling execution of programs.
  • Memory 1620 may be or may include, for example, a Random Access Memory (RAM), a read only memory (ROM), a Dynamic RAM (DRAM), a Synchronous DRAM (SD-RAM), a double data rate (DDR) memory chip, a Flash memory, a volatile memory, a non-volatile memory, a cache memory, a buffer, a short term memory unit, a long term memory unit, or other suitable memory units or storage units.
  • Memory 1620 may be or may include a plurality of, possibly different, memory units.
  • Memory 1620 may store for example, instructions to carry out a method (e.g., code 1625), and/or data such as user responses, interruptions, etc.
  • Executable code 1625 may be any executable code, e.g., an application, a program, a process, task or script. Executable code 1625 may be executed by controller 1605 possibly under control of operating system 1615. In some embodiments, more than one computing device 1600 or components of device 1600 may be used for multiple functions described herein. For the various modules and functions described herein, one or more computing devices 1600 or components of computing device 1600 may be used. Devices that include components similar or different to those included in computing device 1600 may be used, and may be connected to a network and used as a system. One or more processor(s) 1605 may be configured to carry out embodiments of the present invention by for example executing software or code.
  • Storage 1630 may be or may include, for example, a hard disk drive, a floppy disk drive, a Compact Disk (CD) drive, a CD-Recordable (CD-R) drive, a universal serial bus (USB) device or other suitable removable and/or fixed storage unit. In some embodiments, some of the components shown in Fig. 29 may be omitted.
  • Input devices 1635 may be or may include a mouse, a keyboard, a touch screen or pad or any suitable input device. It will be recognized that any suitable number of input devices may be operatively connected to computing device 1600 as shown by block 1635.
  • Output devices 1640 may include one or more displays, speakers and/or any other suitable output devices. It will be recognized that any suitable number of output devices may be operatively connected to computing device 1600 as shown by block 1640. Any applicable input/output (I/O) devices may be connected to computing device 1600, for example, a wired or wireless network interface card (NIC), a modem, printer or facsimile machine, a universal serial bus (USB) device or external hard drive may be included in input devices 1635 and/or output devices 1640.
  • Embodiments of the invention may include one or more article(s) (e.g., memory 1620 or storage 1630) such as a computer or processor non-transitory readable medium, or a computer or processor non-transitory storage medium, such as for example a memory, a disk drive, or a USB flash memory, encoding, including or storing instructions, e.g., computer-executable instructions, which, when executed by a processor or controller, carry out methods disclosed herein.
  • the terms “plurality” and “a plurality” as used herein can include, for example, “multiple” or “two or more”.
  • the terms “plurality” or “a plurality” can be used throughout the specification to describe two or more components, devices, elements, units, parameters, or the like.
  • the term set when used herein can include one or more items.
  • the method embodiments described herein are not constrained to a particular order or sequence. Additionally, some of the described method embodiments or elements thereof can occur or be performed simultaneously, at the same point in time, or concurrently.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Gynecology & Obstetrics (AREA)
  • Reproductive Health (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

A system for determining the existence or absence of a physiological abnormality in a body region of a subject may include: one or more imaging devices configured to generate data representing the subject’s body region over a screening time; and a computing device configured to: based on the data, generate a digital model of the subject’s body region, the digital model representing a surface and thermal characteristics of the subject’s body region over the screening time; based on the digital model, determine spatial features and/or temporal features indicative of a presence or absence of data values in the digital model representing the physiological abnormality; and, based on the spatial features and/or the temporal features, determine a probability and/or a biomarker indicative of the existence or absence of the physiological abnormality in the subject’s body region.
PCT/IB2024/055047 2023-05-24 2024-05-23 Système et procédé permettant de déterminer l'existence d'une anomalie physiologique dans un corps d'un sujet Pending WO2024241277A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IL303207A IL303207A (en) 2023-05-24 2023-05-24 A system and method for thermal breast cancer screening
IL303207 2023-05-24

Publications (1)

Publication Number Publication Date
WO2024241277A1 (fr)

Family

ID=93589057

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2024/055047 Pending WO2024241277A1 (fr) 2023-05-24 2024-05-23 Système et procédé permettant de déterminer l'existence d'une anomalie physiologique dans un corps d'un sujet

Country Status (2)

Country Link
IL (1) IL303207A (fr)
WO (1) WO2024241277A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070036402A1 (en) * 2005-07-22 2007-02-15 Cahill Nathan D Abnormality detection in medical images
US20170311835A1 (en) * 2010-04-08 2017-11-02 The Regents Of The University Of California Method and system for detection of biological rhythm disorders
US20160100790A1 (en) * 2014-10-08 2016-04-14 Revealix, Inc. Automated systems and methods for skin assessment and early detection of a latent pathogenic bio-signal anomaly
US20180028079A1 (en) * 2016-07-29 2018-02-01 Novadaq Technologies Inc. Methods and systems for characterizing tissue of a subject utilizing a machine learning
US20200302603A1 (en) * 2017-12-05 2020-09-24 Ventana Medical Systems, Inc. Method of computing tumor spatial and inter-marker heterogeneity

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
FAROOQ MUHAMMAD ALI; CORCORAN PETER: "Infrared Imaging for Human Thermography and Breast Tumor Classification using Thermal Images", 2020 31ST IRISH SIGNALS AND SYSTEMS CONFERENCE (ISSC), IEEE, 11 June 2020 (2020-06-11), pages 1 - 6, XP033818305, DOI: 10.1109/ISSC49989.2020.9180164 *
LAHIRI ET AL.: "Medical applications of infrared thermography: a review", INFRARED PHYSICS & TECHNOLOGY, vol. 55, no. 4, 2012, pages 221 - 235, XP055448144, Retrieved from the Internet <URL:https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7110787> [retrieved on 20240720], DOI: 10.1016/j.infrared.2012.03.007 *
ZERAATKAR MOJTABA, KHALILI KHALIL: "A Fast and Low-Cost Human Body 3D Scanner Using 100 Cameras", JOURNAL OF IMAGING, MDPI AG, vol. 6, no. 4, pages 21, XP093243384, ISSN: 2313-433X, DOI: 10.3390/jimaging6040021 *
ZHANG ET AL.: "Radiological images and machine learning: trends, perspectives, and prospects", COMPUTERS IN BIOLOGY AND MEDICINE, vol. 108, 2019, pages 354 - 370, XP085691877, Retrieved from the Internet <URL:https://www.sciencedirect.com/science/article/abs/pii/S0010482519300642> [retrieved on 20240720], DOI: 10.1016/j.compbiomed.2019.02.017 *

Also Published As

Publication number Publication date
IL303207A (en) 2024-12-01

Similar Documents

Publication Publication Date Title
AU2017292642B2 (en) System and method for automatic detection, localization, and semantic segmentation of anatomical objects
Masood et al. Computer-assisted decision support system in pulmonary cancer detection and stage classification on CT images
US12482248B2 (en) Multi-modal method for classifying thyroid nodule based on ultrasound and infrared thermal images
US11854200B2 (en) Skin abnormality monitoring systems and methods
US9754371B2 (en) Multi modality brain mapping system (MBMS) using artificial intelligence and pattern recognition
CN103542935B (zh) 用于实时的组织氧合测量的小型化多光谱成像器
US20150078642A1 (en) Method and system for non-invasive quantification of biologial sample physiology using a series of images
US8494227B2 (en) System and method for using three dimensional infrared imaging to identify individuals
CN112825619B (zh) 使用数字重建放射影像训练机器学习算法的方法及系统
US12138044B2 (en) Systems and methods for determining subject positioning and vital signs
CN113473896A (zh) 分析主体
CN107205663A (zh) 用于皮肤检测的设备、系统和方法
Gururajarao et al. Infrared thermography and soft computing for diabetic foot assessment
González et al. An approach for thyroid nodule analysis using thermographic images
Francis et al. Detection of breast abnormality using rotational thermography
US20220343497A1 (en) Burn severity identification and analysis through three-dimensional surface reconstruction from visible and infrared imagery
US20250025052A1 (en) Vision-Based Cardiorespiratory Monitoring
WO2024241277A1 (fr) Système et procédé permettant de déterminer l'existence d'une anomalie physiologique dans un corps d'un sujet
García-Torres et al. Advancing newborn care: Precise time of birth detection using ai-driven thermal imaging with adaptive normalization
US20240087115A1 (en) Machine learning enabled system for skin abnormality interventions
Hsu et al. Multi-modal fusion in thermal imaging and MRI for early cancer detection
Cao et al. A novel heart rate estimation framework with self-correcting face detection for Neonatal Intensive Care Unit
WO2022087132A1 (fr) Systèmes et procédés de surveillance d'anomalie de la peau
CN117731244B (zh) 一种基于红外热成像的脊柱侧弯风险预警系统
CN119418070B (zh) 基于足底表面数据的内部骨骼关键点预测方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24810582

Country of ref document: EP

Kind code of ref document: A1