WO2025076539A1 - Neural network-enabled system for estimating age-at-death using radiographic images - Google Patents
Neural network-enabled system for estimating age-at-death using radiographic images
- Publication number
- WO2025076539A1 WO2025076539A1 PCT/US2024/050257 US2024050257W WO2025076539A1 WO 2025076539 A1 WO2025076539 A1 WO 2025076539A1 US 2024050257 W US2024050257 W US 2024050257W WO 2025076539 A1 WO2025076539 A1 WO 2025076539A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- age
- neural network
- convolutional
- radiographic images
- predicted age
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/50—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
- A61B6/505—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications for diagnosis of bone
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5217—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data extracting a diagnostic or physiological parameter from medical diagnostic data
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/0464—Convolutional networks [CNN, ConvNet]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/048—Activation functions
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/70—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for mining of medical data, e.g. analysing previous cases of other patients
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Definitions
- Transition analysis, which seeks to obtain estimates of a trait's presence within a population independently of the age distribution of the reference sample, has offered some solutions to these problems and represents an improvement upon some methodologies.
- Milner and Boldsen's transition analysis estimates still bear witness to the difficulties of identifying age-at-death for individuals who were older than age 60 when they died (G. R. Milner & Boldsen, 2012).
- Convolutional neural networks are a class of artificial neural network that are often applied to analyze visual imagery (Lecun & Bengio, 1995).
- the utility of all neural networks, including convolutional neural networks, is their ability to approximate any function mapping an input to an output (e.g., predicting sex from a radiograph where the input is the image and the output is the sex) through a series of complex computations (Funahashi, 1989).
- convolutional neural networks consist of an input layer (that is used to feed in the input image), hidden layers (that perform computations that approximate the desired function), and an output layer (that will classify the image into the desired output classes).
- Convolutional models differ from other neural network architectures in that they apply their computations in a way that successfully captures spatial dependencies in an image (via so-called convolutions or kernels) while minimizing the number of parameters required for the model (Lecun & Bengio, 1995). These convolution kernels 'slide' across the image to generate feature maps (i.e., 'processed images') that are fed into subsequent layers; neural networks are often multiple layers deep.
- the model can extract the high-level features (e.g., edges, outlines) that can be used for prediction.
- Deep learning techniques, including convolutional neural networks, have been very successful in extracting biological age from medical data (Pyrkov et al., 2018).
- deep neural networks have successfully predicted sex and chronological age (to within 2.1 years) on the basis of healthy adult chest radiographs (Yang et al., 2021).
- the anatomical regions most important for the age prediction model were the spine, ribs, aortic arch, heart, and soft tissue of the thorax.
- the disclosed system predicts the biological age of skeletal remains using a convolutional neural network (trained, for example, using 693 radiographs from 136 adults interred in lead coffins in the eighteenth and nineteenth centuries in the crypt of London’s St. Bride’s Church). Additionally, to increase explainability and minimize the risk of overfitting, the disclosed system uses backpropagation to generate heatmaps that localize relevant regions of the skeletal remains that are class-discriminative (i.e., most important to the model in predicting the age-at-death).
- FIG. 1A is a block diagram of a neural network-enabled system for estimating age-at-death using radiographic images according to exemplary embodiments.
- FIG. 1B is a diagram of the example system of FIG. 1A in greater detail.
- FIG. 2 is a diagram of an architecture of the system of FIG. 1 according to exemplary embodiments.
- FIGS. 3A through 3E are heatmaps generated by the disclosed system using radiographic images of bones of a female individual, age-at-death of 63, including the humeri (FIG. 3A), the pelvis (FIG. 3B), the tibiae (FIG. 3C), the right femur (FIG. 3D), and the left femur (FIG. 3E).
- FIGS. 4A through 4E are heatmaps generated by the disclosed system using radiographic images of bones of a male individual, age-at-death of 75, including the humeri (FIG. 4A), the pelvis (FIG. 4B), the tibiae (FIG. 4C), the right femur (FIG. 4D), and the left femur (FIG. 4E).
- FIGS. 5A and 5B are heatmaps generated by the disclosed system using radiographic images of bones of a female individual, age-at-death of 54, including the right and left humeri (FIG. 5A) and the pelvis (FIG. 5B).
- FIGS. 6A through 6D are heatmaps generated by the disclosed system using radiographic images of bones of a male individual, age-at-death of 34, including the pelvis (FIG. 6A), the right femur (FIG. 6B), the left femur (FIG. 6C), and a tibia (FIG. 6D).
- FIG. 1A is a block diagram of a neural network-enabled system 100 for estimating age-at-death 150 using radiographic images 110 according to exemplary embodiments.
- the system 100 includes a neural network 140 trained using training data 180, a preprocessing module 120, and a heatmap generation module 170.
- the neural network 140 may be, for example, a convolutional neural network 140 constructed using Tensorflow v2.8.
- the neural network 140 takes in a radiographic image 110 of any bone as input and generates a predicted age 150 (e.g., an age range in ten-year increments) at which the individual died.
- Each input radiographic image 110 may be a 2D matrix with 3 channels encoding the color of the image.
- the neural network 140 includes an input layer 141, a series of 2D convolutional layers 142 and 2D max pooling layers 143, a flatten layer 145, a dense fully connected layer 146, a dropout layer 147, and an output layer 148.
- the input 141 to the first convolutional layer 142 may be a 480 x 640 x 3 image, where 480 x 640 corresponds to the standardized size of the radiographic image 110 output by the preprocessing module 120 (discussed below) and 3 denotes the number of (RGB color) channels.
- the first convolutional layer 142 may have 64 filters (or equivalently, kernels) of size 2 x 2 x 64, where 2 x 2 denotes the size of the filter and 64 denotes the number of channels for that filter, and a non-linear rectified linear unit (ReLU) activation function (with weights initialized using, e.g., an He uniform variance scaling initializer).
- the output of each filter may be a locally connected structure, convolved with the input radiographic image 110, to produce 64 feature maps, which may then be max pooled with the output of other filters from the convolutional layer 142 by the subsequent max pooling layer 143. These feature maps then serve as input for the subsequent layer (e.g., a subsequent convolutional layer 142 and max pooling layer 143).
- Embodiments may include any number of convolutional layers 142 and max pooling layers 143 (e.g., four convolutional layers 142 and four max pooling layers 143).
- the convolutional neural network 140 includes a flatten layer 145 that converts the multi-dimensional output from the final max pooling layer 143 into a one-dimensional vector for processing by a dense fully connected layer 146.
- the dense fully connected layer may include, for example, 512 nodes and a ReLU activation.
- the convolutional neural network 140 also includes a dropout layer 147 - a model regularizer that limits co-adaptation and improves generalizability by randomly zeroing input values - that randomly drops units and their connections (Hinton et al., 2012).
- the dropout probability of the dropout layer 147 may be set to 0.5.
- the output layer 148 may include a number of softmax output neurons 149 (e.g., seven softmax output neurons 149) corresponding to the estimated age 150 (e.g., less than 31 years of age, 31-40, 41-50, 51-60, 61-70, 71-80, and greater than 80 years old) of the individual depicted in the input radiographic image 110.
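The layer stack described above (input layer, repeated convolution/max-pooling stages with 64 filters of size 2 x 2, a flatten layer, a 512-node dense layer with ReLU, a 0.5 dropout layer, and a seven-way softmax output) can be sketched in Keras roughly as follows. This is an illustrative reconstruction, not the patent's exact Tensorflow v2.8 model: the 2 x 2 pooling window and the use of four identical conv/pool stages are assumptions consistent with, but not fully specified by, the text.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_age_cnn(input_shape=(480, 640, 3), num_classes=7):
    """Sketch of the described architecture: four conv/max-pool stages
    (64 filters of size 2 x 2, ReLU, He-uniform init), flatten, a
    512-node dense layer with ReLU, dropout of 0.5, and a softmax
    output over seven age-range classes."""
    model = models.Sequential()
    model.add(layers.Input(shape=input_shape))
    for _ in range(4):  # assumed: four convolutional + max-pooling pairs
        model.add(layers.Conv2D(64, (2, 2), activation="relu",
                                kernel_initializer="he_uniform"))
        model.add(layers.MaxPooling2D((2, 2)))  # assumed 2 x 2 pool window
    model.add(layers.Flatten())
    model.add(layers.Dense(512, activation="relu"))
    model.add(layers.Dropout(0.5))
    model.add(layers.Dense(num_classes, activation="softmax"))
    return model
```

Exact per-stage filter counts and padding choices in the disclosed system may differ from this sketch.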
- the convolutional neural network 140 is trained using training data 180 that includes radiographic images 110 of bones of individuals having known ages 150 at death (and may also include other information, such as the sex 114 of each individual).
- the convolutional neural network 140 may be trained using the radiographic images 110 of the individuals interred in the crypt of St. Bride’s Church described below.
- the disclosed system 100 may include a preprocessing module 120 that removes all texts/labels (e.g., left-right markers and radiograph labels) from the radiographic images 110.
- the preprocessing module 120 may include an optical character recognition module 122 (e.g., pre-trained keras-ocr models) that obtains bounding box coordinates of all text on the radiographs, and a masking module 124 that replaces the text using an inpainting algorithm (e.g., realized using OpenCV) to create a text-free radiographic image 110.
- the preprocessing module 120 may also include a standardization module 126 that standardizes the input size of the radiographic images 110 (e.g., to 480 x 640 pixels).
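As a rough illustration of the masking step, the sketch below fills OCR-reported bounding boxes with the image median. The actual system is described as using pre-trained keras-ocr models for detection and an OpenCV inpainting algorithm for reconstruction; this NumPy stand-in (`mask_text_regions` is a hypothetical helper) only shows the data flow.

```python
import numpy as np

def mask_text_regions(image, boxes, fill=None):
    """Crude stand-in for the OCR-detect-then-inpaint step: each
    bounding box (x0, y0, x1, y1) reported by an OCR model is filled
    with the image median (a real inpainting algorithm would instead
    reconstruct the region from its surroundings)."""
    out = image.copy()
    fill = np.median(image) if fill is None else fill
    for x0, y0, x1, y1 in boxes:
        out[y0:y1, x0:x1] = fill  # rows are y, columns are x
    return out
```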
- the loss function used to optimize the model may be defined using categorical crossentropy, designed to quantify the difference between two probability distributions for multi-class prediction tasks.
- Model weights may be updated using Adam, an algorithm for gradient-based optimization of stochastic objective functions (Kingma & Ba, 2014).
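The loss and its behavior can be expressed in a few lines of NumPy; this is the standard definition of categorical crossentropy, not code from the disclosure:

```python
import numpy as np

def categorical_crossentropy(y_true, y_pred, eps=1e-12):
    """Multi-class loss: for each sample, the negative log of the
    predicted probability assigned to the true (one-hot) class,
    averaged over the batch. eps guards against log(0)."""
    y_pred = np.clip(y_pred, eps, 1.0)
    return float(-np.mean(np.sum(y_true * np.log(y_pred), axis=-1)))
```

A perfect prediction yields a loss near zero, and a uniform prediction over C classes yields log(C).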
- the model may be trained for 50 cycles, exiting early based on a validation set (e.g., 10% of the training data 180) with a patience of 3.
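The early-exit behavior (stop once the validation loss has failed to improve for a "patience" of 3 consecutive epochs) can be illustrated with a small helper; this shows the generic early-stopping rule in plain Python rather than as a Keras callback:

```python
def early_stop_epoch(val_losses, patience=3):
    """Return the 1-based epoch at which training would stop, given a
    per-epoch validation-loss history: stop once the loss has failed
    to improve on its best value for `patience` consecutive epochs."""
    best, since_best = float("inf"), 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best, since_best = loss, 0
        else:
            since_best += 1
            if since_best >= patience:
                return epoch
    return len(val_losses)  # ran to completion without triggering
```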
- each convolutional layer 142 generates feature maps k by applying an activation function, and each max pooling layer 143 down-samples those feature maps k (forming what is referred to in FIG. 1B as down-sampled feature maps k').
- the flatten layer 145 takes the two-dimensional feature maps k' from the final max pooling layer 143 and flattens them into a one-dimensional vector V k .
- the fully connected layer 146 applies an activation function to the linear combination of the vector V_k, the weights W, and the biases b, producing the output F = A(W * V_k + b).
- the output layer 148 converts the output F of the fully connected layer 146 into a probability distribution over the classes c (i.e., the predicted ages 150).
- a softmax function may calculate the probability y_c = e^(F_c) / Σ_c' e^(F_c'), where y_c is the predicted probability for class c, F_c is the output of the fully connected layer 146 for class c, and Σ_c' e^(F_c') is the sum of the exponentials of the outputs for all classes (ensuring that the probabilities sum to 1).
- the predicted class is determined by selecting the class c with the highest probability, c = argmax_c y_c.
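The softmax-then-argmax step can be sketched as follows (`predict_age_class` is an illustrative helper; the max-shift is a standard numerical-stability trick not mentioned in the text):

```python
import numpy as np

def predict_age_class(logits):
    """Softmax over the fully connected layer's outputs F, then argmax:
    y_c = exp(F_c) / sum over c' of exp(F_c'); the predicted class is
    the one with the highest probability."""
    z = np.asarray(logits, dtype=float)
    e = np.exp(z - z.max())  # shift by max(F) to avoid overflow
    probs = e / e.sum()
    return probs, int(np.argmax(probs))
```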
- the gradient of the loss function with respect to the output F is used to update the network’s weights W.
- the heatmap generation module 160 is realized using Grad-CAM visualization, where the radiographic image I (referred to as radiographic image 110 with reference to FIG. 1A) is propagated through the convolutional neural network 140 to generate the feature maps k.
- the gradient 156 for the determined class c (i.e., the estimated age 150) is set to 1.
- the gradients 156 for all other classes d are set to 0.
- a backpropagation process 165 is then used to focus on class c and calculate how changes in the feature maps k influence the probability y c .
- the gradient for the probability y_c for class c is calculated with respect to the forward activation maps A^k of the final convolutional layer 142.
- those gradients are global average pooled (or global max pooled) over the width i and height j of the input image I to obtain the neuron importance weights α_k^c = (1/Z) Σ_i Σ_j ∂y_c / ∂A_ij^k (where Z is the total number of pixels in the input image I).
- the backpropagation 165 and pooling 166 processes amount to successive matrix multiplications of the weight matrices and the gradients with respect to the activation functions A of the neural network 140 all the way through to the final convolution layer 142 where the gradients are being propagated.
- each weight α_k^c can be understood to represent the “importance” of each feature map k for classifying the radiographic image I as belonging to the target class c.
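For a single target class, the pooling and weighting described above reduce to a global average pool of the gradients followed by a ReLU-ed weighted sum of the feature maps. A minimal NumPy sketch (`grad_cam_heatmap` is a hypothetical helper, not the patent's implementation):

```python
import numpy as np

def grad_cam_heatmap(activations, gradients):
    """Grad-CAM from the final convolutional layer: `activations` is
    A^k with shape (H, W, K) and `gradients` holds dy_c/dA for the
    target class c. Each alpha_k is the global average pool of the
    gradients of feature map k; the heatmap is the ReLU of the
    alpha-weighted sum of the feature maps."""
    alphas = gradients.mean(axis=(0, 1))             # (K,) importance weights
    cam = np.tensordot(activations, alphas, axes=([2], [0]))
    return np.maximum(cam, 0.0)                      # ReLU keeps positive evidence only
```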
- Figures 3A-3E and 4A-4E provide typical examples of heatmaps for a female and a male individual for whom all bones were available and Figures 5A-5B and 6A-6D provide examples of heatmaps for a female and a male for whom fewer bones were available.
- Brighter areas reflect increased “attention” by the model.
- predictions tend to be based on the acetabular surface and the sacroiliac joints.
- articular surfaces were more difficult to assess, as epiphyses were often severely damaged or absent. In these cases, the model directs its attention to the diaphyseal cortex.
- FIG. 2 is a diagram of a hardware environment 200 of the disclosed system 100 according to exemplary embodiments.
- the system 100 may be realized as a hardware computing system 240, including non-transitory memory 248 storing instructions and at least one hardware computer processing unit 242 executing those instructions to perform the functions described herein.
- the computing system 240 may include any computing device capable of performing those functions (for example, a server, a personal computing device, etc.)
- the system 100 receives radiographic images 110 of bones of individuals and outputs an estimated age 150 of each individual.
- the radiographic images 110 may be received from a remote computer 210 via a network 250 (e.g., a local area network, the Internet) or via any wired or wireless communication link.
- the estimated age 150 and heatmaps 190 may be output, for example, via a graphical user interface.
- the computing system 240 may also include non-transitory computer readable storage media 280 (or communication with external storage media 280 via a wired, wireless, or network connection).
- the disclosed system 100 is capable of producing accurate and reliable estimates across the entire human life span, is simple to use and applicable to most (if not all) archaeological contexts, and is capable of significantly improving existing age-estimation methods in a non-destructive way.
- the disclosed system 100 enables users to “understand the model” and ensure that the model is focusing on actual bone features (in particular those that reflect degenerative change and are already key components of traditional aging methods) rather than making a prediction using irrelevant features (e.g., text within the radiographic image 110).
- the disclosed system 100 uses features of bone radiographs 110 that cannot be captured using prior art methods. As shown in the heatmaps 190 generated by the system 100, the model is trained to focus on diaphyses, which provide continuous stretches of cortical surface. Cortical thickness measurements, obtained from both radiographs and CT, have also been used by prior art systems to assess age and overall health. However, cortical assessments on the basis of radiographs typically can only measure the thickness. Prior art methods cannot assess cortical density, as the disclosed model may well be doing, because the cortical density registered by digital plate radiography is dependent upon the parameters used for image acquisition, which may be adjusted by the radiographer in order to facilitate an image with greater resolution.
- the disclosed system 100 is capable of using features of a bone radiograph 110 that likely could not be captured by other methods of analysis.
- The figures are representative heatmap collections for female (Figures 3A-3E and 5A-5B) and male (Figures 4A-4E and 6A-6D) individuals for whom many bones (Figures 3A-3E and 4A-4E) and few bones (Figures 5A-5B and 6A-6D) were available for analysis.
- heatmap analysis 160 shows that the model is indeed focusing on regions of bone that are used by these traditional aging methods, including areas where degenerative change is expected, such as the acetabulum and the sacroiliac joints.
- the model is also focusing on diaphyses, which provide continuous stretches of cortical surface.
- MSE = (1/n) * Σ_i (y_i - f(x_i))^2, where n is the total number of observations, y_i is the response value of the ith observation, and f(x_i) is the predicted response value of the ith observation. The closer the model predictions are to the observations, the smaller the MSE will be.
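The MSE computation is a one-liner in NumPy:

```python
import numpy as np

def mean_squared_error(y, y_hat):
    """MSE = (1/n) * sum over i of (y_i - f(x_i))^2; smaller values
    mean the model's predictions are closer to the observed responses."""
    y, y_hat = np.asarray(y, dtype=float), np.asarray(y_hat, dtype=float)
    return float(np.mean((y - y_hat) ** 2))
```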
- Model 1 was trained on individuals 1-135 and tested on individual 136.
- Model 2 was trained on individuals 2-136 and tested on individual 1.
- Model 3 was trained on individuals 1 and 3-136 and tested on individual 2, and so on. Each iteration provides one data point showing how the model performs; all of these points were then examined together to assess general performance.
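The leave-one-out scheme described above can be sketched as a generator of train/test splits, one per individual (`leave_one_out_splits` is an illustrative helper):

```python
def leave_one_out_splits(n_individuals):
    """Leave-one-out cross-validation as described: model m trains on
    all individuals except individual m and is tested on that one,
    yielding one performance point per individual (e.g., 136 models
    for 136 individuals)."""
    indices = list(range(n_individuals))
    for held_out in indices:
        train = [i for i in indices if i != held_out]
        yield train, held_out
```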
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Biomedical Technology (AREA)
- Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Theoretical Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Software Systems (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Molecular Biology (AREA)
- Evolutionary Computation (AREA)
- Artificial Intelligence (AREA)
- Primary Health Care (AREA)
- Epidemiology (AREA)
- Mathematical Physics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Computing Systems (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Databases & Information Systems (AREA)
- Radiology & Medical Imaging (AREA)
- Computational Linguistics (AREA)
- Heart & Thoracic Surgery (AREA)
- Veterinary Medicine (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Optics & Photonics (AREA)
- Animal Behavior & Ethology (AREA)
- High Energy & Nuclear Physics (AREA)
- Surgery (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Dentistry (AREA)
- Orthopedic Medicine & Surgery (AREA)
- Physiology (AREA)
- Image Analysis (AREA)
Abstract
The present disclosure concerns a system that predicts the biological age of skeletal remains using a convolutional neural network (trained, for example, using 693 radiographs from 136 adults interred in lead coffins in the eighteenth and nineteenth centuries in the crypt of London's St. Bride's Church). Additionally, to increase explainability and minimize the risk of overfitting, the disclosed system uses backpropagation to generate heatmaps that localize relevant, class-discriminative regions of the skeletal remains (i.e., those most important to the model in predicting age-at-death).
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202363588131P | 2023-10-05 | 2023-10-05 | |
| US63/588,131 | 2023-10-05 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025076539A1 (fr) | 2025-04-10 |
Family
ID=95283850
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2024/050257 Pending WO2025076539A1 (fr) | Neural network-enabled system for estimating age-at-death using radiographic images | 2023-10-05 | 2024-10-07 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025076539A1 (fr) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20210015421A1 (en) * | 2019-07-16 | 2021-01-21 | 16 Bit Inc. | Systems and Methods for Approximating Bone Mineral Density and Fracture Risk using Single Energy X-Rays |
| US20210090732A1 (en) * | 2017-04-28 | 2021-03-25 | University Of Southern California | System and method for predicting survival time |
| US20210287805A1 (en) * | 2020-03-11 | 2021-09-16 | National Taiwan University | Systems and methods for prognosis prediction of acute myeloid leukemia patients |
| US20220039965A1 (en) * | 2020-08-06 | 2022-02-10 | Carlsmed, Inc. | Patient-specific artificial discs, implants and associated systems and methods |
| US20220208384A1 (en) * | 2020-12-24 | 2022-06-30 | Industry-Academic Cooperation Foundation, Yonsei University | Method and apparatus of predicting fracture risk |
| WO2023121810A1 (fr) * | 2021-12-22 | 2023-06-29 | Orthofix Us Llc | Détermination de longueur d'implant basée sur une image et systèmes, dispositifs et procédés associés |
-
2024
- 2024-10-07 WO PCT/US2024/050257 patent/WO2025076539A1/fr active Pending
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20210090732A1 (en) * | 2017-04-28 | 2021-03-25 | University Of Southern California | System and method for predicting survival time |
| US20210015421A1 (en) * | 2019-07-16 | 2021-01-21 | 16 Bit Inc. | Systems and Methods for Approximating Bone Mineral Density and Fracture Risk using Single Energy X-Rays |
| US20210287805A1 (en) * | 2020-03-11 | 2021-09-16 | National Taiwan University | Systems and methods for prognosis prediction of acute myeloid leukemia patients |
| US20220039965A1 (en) * | 2020-08-06 | 2022-02-10 | Carlsmed, Inc. | Patient-specific artificial discs, implants and associated systems and methods |
| US20220208384A1 (en) * | 2020-12-24 | 2022-06-30 | Industry-Academic Cooperation Foundation, Yonsei University | Method and apparatus of predicting fracture risk |
| WO2023121810A1 (fr) * | 2021-12-22 | 2023-06-29 | Orthofix Us Llc | Détermination de longueur d'implant basée sur une image et systèmes, dispositifs et procédés associés |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN110390351B (zh) | A deep-learning-based system for automatic three-dimensional localization of epileptogenic foci | |
| JP7241075B2 (ja) | Method and system for three-dimensional medical image analysis to identify vertebral fractures | |
| Wu et al. | Automatic landmark estimation for adolescent idiopathic scoliosis assessment using BoostNet | |
| US20200286614A1 (en) | A system and method for automated labeling and annotating unstructured medical datasets | |
| EP3905129B1 (fr) | Procédé d'identification d'images d'os | |
| Sikkandar et al. | Automatic Detection and Classification of Human Knee Osteoarthritis Using Convolutional Neural Networks. | |
| Sørensen et al. | Texture classification in lung CT using local binary patterns | |
| Patil et al. | Classification and risk estimation of osteoarthritis using deep learning methods | |
| Dhanalakshmi et al. | Convolutional Neural Network Model based Deep Learning Approach for Osteoporosis Fracture Detection | |
| Dodamani et al. | Transfer learning-based osteoporosis classification using simple radiographs | |
| Jani et al. | Charting the growth through intelligence: A SWOC analysis on AI-assisted radiologic bone age estimation | |
| US12374465B2 (en) | System and method for detection of a heart failure risk | |
| CN118014957B (zh) | An image detection method for pediatric wrist fracture regions | |
| WO2025076539A1 (fr) | Système activé par réseau neuronal pour estimer l'âge au moment du décès au moyen d'images radiographiques | |
| Kadu et al. | Advanced Bi-CNN for Detection of Knee Osteoarthritis using Joint Space Narrowing Analysis | |
| Leo et al. | Neural Foraminal Stenosis Classifications using Multi-Feature Hierarchical Clustering and Delineation | |
| CN113781453B (zh) | X-ray-based scoliosis progression prediction method and apparatus | |
| Rao et al. | The osteoporosis disease diagnosis and classification using U-Net deep learning process | |
| Mahato et al. | Uncertainty Quantification in Deep Learning Framework for Mallampati Classification | |
| Ahmed et al. | Prediction of Cardiomegaly Disease Using Deep Learning | |
| Khalid et al. | Automated cobb’s angle measurement for scoliosis diagnosis using deep learning techniques | |
| EP4495877A1 (fr) | Procédé mis en uvre par ordinateur pour déterminer un score de fiabilité | |
| Gadu et al. | SCANet: automated Cobb angle measurement using deep learning model in X-ray images for adolescent idiopathic scoliosis patients | |
| Sundarasamy et al. | Age and gender classification with bone images using deep learning algorithms | |
| Ramos | Analysis of medical images to support decision-making in the musculoskeletal field |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 24875597 Country of ref document: EP Kind code of ref document: A1 |