WO2025125991A1 - Techniques for assessing the quality of algorithm outcomes - Google Patents
Techniques for assessing the quality of algorithm outcomes
- Publication number
- WO2025125991A1 (PCT/IB2024/062229)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- algorithm
- data set
- training
- model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G16H40/63—ICT specially adapted for the operation of medical equipment or devices for local operation
- A61N1/36039—Cochlear stimulation fitting procedures
- G06N20/00—Machine learning
- G06N20/20—Ensemble learning
- G06N3/045—Combinations of networks
- G06N3/0455—Auto-encoder networks; Encoder-decoder networks
- G06N3/0464—Convolutional networks [CNN, ConvNet]
- G06N3/047—Probabilistic or stochastic networks
- G06N3/0475—Generative networks
- G06N3/048—Activation functions
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
- G06N3/088—Non-supervised learning, e.g. competitive learning
- G06N3/09—Supervised learning
- G06N3/094—Adversarial learning
- G06N5/01—Dynamic search techniques; Heuristics; Dynamic trees; Branch-and-bound
- G06N7/01—Probabilistic graphical models, e.g. probabilistic networks
- G16H20/40—ICT specially adapted for therapies relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
- G16H40/40—ICT specially adapted for the management of medical equipment or devices, e.g. scheduling maintenance or upgrades
- G16H40/67—ICT specially adapted for the operation of medical equipment or devices for remote operation
- G16H50/20—ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems
- G16H50/30—ICT specially adapted for calculating health indices; for individual health risk assessment
- G16H50/70—ICT specially adapted for mining of medical data, e.g. analysing previous cases of other patients
Definitions
- the present disclosure relates to systems, methods, and computer readable storage media for generating and utilizing quality assessments of outcomes generated by algorithms.
- Medical devices have provided a wide range of therapeutic benefits to recipients over recent decades.
- Medical devices can include internal or implantable components/devices, external or wearable components/devices, or combinations thereof (e.g., a device having an external component communicating with an implantable component).
- Medical devices such as traditional hearing aids, partially or fully-implantable hearing prostheses (e.g., bone conduction devices, mechanical stimulators, cochlear implants, etc.), pacemakers, defibrillators, functional electrical stimulation devices, and other medical devices, have been successful in performing lifesaving and/or lifestyle enhancement functions and/or recipient monitoring for a number of years.
- a method comprises generating a model that is representative of first data used to train an algorithm, and generating a decision metric for determining if a similarity between second data input to the algorithm and the first data is sufficient for the algorithm to generate an output that is valid based on patterns of the first data identified in the model accounting for the second data.
- a non-transitory computer readable storage medium comprises computer readable instructions stored thereon for causing a computing system to: generate a quality assessment of a relationship between a first data set and a second data set using a model that comprises features of the first data set; and determine a similarity between the first data set and the second data set using a trust estimator that processes the quality assessment to assess a trustworthiness of an outcome that an algorithm generates in response to the second data set.
- the algorithm is developed using the first data set.
- a computer implemented method for estimating a trustworthiness of an output of an algorithm that has been trained using training data comprises: generating a representation of a relationship between the training data and input data using a model that comprises a description of the training data, wherein the algorithm uses the input data to generate the output; and generating a trust value for the output of the algorithm using a decision metric based on the output of the algorithm and based on the representation of the relationship between the training data and the input data.
- Figure 1B depicts a functional block diagram of the cochlear implant of Figure 1A.
- Figure 1C is a diagram illustrating an example of an auditory prosthesis that can include one or more embodiments disclosed herein.
- Figure 1D is a functional block diagram of an exemplary totally implantable cochlear implant.
- Figure 2A is a diagram that depicts examples of training configurations of a system that can generate a training embedding model and a decision metric during a training stage.
- Figure 2B is a diagram that depicts examples of clinical configurations of a system that can be used to generate a quality assessment of an outcome of an algorithm during a clinical application stage.
- FIG. 1C is a diagram illustrating an example of an auditory prosthesis 150 that can include one or more embodiments disclosed herein.
- the auditory prosthesis 150 of FIG. 1C is an example of a cochlear implant.
- auditory prosthesis 150 can be a mostly implantable cochlear implant (MICI) or a totally implantable cochlear implant (TICI).
- MICI mostly implantable cochlear implant
- TICI totally implantable cochlear implant
- Stimulating assembly 118 includes a plurality of longitudinally spaced intra-cochlear electrical stimulating contacts (electrodes) 126 that collectively form a contact or electrode array 128 for delivery of electrical stimulation (current) to the recipient's cochlea 162.
- Stimulating assembly 118 extends through an opening in the recipient's cochlea (e.g., cochleostomy, the round window, etc.) and has a proximal end connected to a stimulator unit in implant body 160 via lead region 116 and a hermetic feedthrough (not shown in FIG. 1C).
- Lead region 116 includes a plurality of conductors (wires) that electrically couple the stimulating contacts 126 to the stimulator unit.
- FIG. ID is a functional block diagram of an exemplary totally implantable cochlear implant 170. Because the cochlear implant 170 is totally implantable, all components of cochlear implant 170 are configured to be implanted under skin/tissue 175 of a recipient. Because all components are implantable, cochlear implant 170 operates, for at least a finite period of time, in an "invisible hearing" mode without the need of an external device.
- An external device 172 can be used to, for example, charge an internal power source (battery) 177.
- External device 172 can be a dedicated charger or a conventional cochlear implant sound processor.
- Cochlear implant 170 includes an implant body (main implantable component) 174, one or more input elements for capturing/receiving input audio signals (e.g., using one or more implantable microphones 178 and a wireless transceiver 181), an implantable coil 182, and an elongated intra-cochlear stimulating assembly as described above.
- the microphone 178 and/or the implantable coil 182 can be positioned in, or electrically connected to, the implant body 174.
- the implant body 174 further comprises the battery 177, RF (radio frequency) interface circuitry 184, a processing module 185, and a stimulator unit 180.
- the processing module 185 can be similar to processing modules described previously, and includes environmental classifier 191, sound processor 193, and individualized own voice detector 195.
- the one or more implantable microphones 178 are configured to receive input audio signals.
- the processing module 185 is configured to convert received input audio signals into stimulation control signals 196 for use in stimulating a first ear of a recipient.
- sound processor 193 is configured to convert the input audio signals into stimulation control signals 196 that represent electrical stimulation for delivery to the recipient.
- the processing module 185 is implanted in the recipient.
- the stimulation control signals 196 do not traverse an RF link, but instead are provided directly to the stimulator unit 180.
- the stimulator unit 180 is configured to utilize the stimulation control signals 196 to generate electrical stimulation signals that are delivered to the recipient's cochlea via one or more stimulation channels that include lead region 116 and stimulating assembly 118 having electrode array 128.
- the environmental classifier 191 is configured to determine an environmental classification of the sound environment associated with the input audio signals and the individualized own voice detector 195 is configured to perform individualized own voice detection (OVD).
- OVD individualized own voice detection
- the stimulating assembly of a cochlear implant can be implanted into the cochlea of a recipient during cochlear implant surgery.
- the electrode array of a cochlear implant can become buckled, folded, or kinked during or after cochlear implant surgery.
- a cochlear implant can be measured after cochlear implant surgery to generate trans-impedance matrices (TIMs).
- TIMs trans-impedance matrices
- a trans-impedance matrix (TIM) measurement is a measurement of the propagation of electric fields inside the cochlea of a recipient.
- a TIM can be used to estimate the placement of an electrode array of a cochlear implant in a recipient or whether the electrode array has been buckled, kinked, or folded.
- Trans- impedance matrices generated from the inner ears of many recipients of cochlear implants can be used to train a machine learning (ML) algorithm.
- the trained ML algorithm can then be used to determine if an electrode array of a cochlear implant that has been implanted in a recipient has buckled, folded, or kinked using TIMs generated from the electrode array.
- the amount of training data containing TIMs generated from electrode arrays implanted in cochlear implant recipients is insufficient to train an ML algorithm to correctly recognize every instance of a buckled, folded, or kinked electrode array.
- a model is generated that describes training data used in the development of an algorithm, such as a machine learning (ML) algorithm or another type of algorithm.
- the model can be used to identify how close clinical input data is to the training data, and thus the model can be used to define the trustworthiness of the output of the algorithm.
- the model allows the algorithm to be developed using reduced training data sets in situations when collecting substantial representative data for a specific output, such as the detection of a rare event in a data set, is infeasible.
- the clinical input data can include TIMs generated from measuring electrode arrays of many cochlear implant recipients, and the ML algorithm can be trained to detect an electrode array of a cochlear implant that has buckled, folded, or kinked.
- the clinical input data can include data generated from measuring implantable medical devices of a particular type, and the ML algorithm can be trained with training data to detect features of measurements obtained using the same type of implantable medical devices.
- the measurements obtained using the implantable medical devices can, as examples, include electrophysiological and/or tissue-related responses to stimulus, such as action potentials and/or impedance measurements.
- the model can be optionally combined with pre-processing, interim algorithm artifacts, or the algorithm output to enhance the trust definition.
- a quality assessment is generated that provides a high quality rating of the output of the algorithm when the algorithm is correct and a low quality rating of the output of the algorithm when the algorithm is incorrect.
- the low quality rating is provided when the algorithm output has a high probability of being incorrect.
- the quality assessment can be a probability rating or a rating that is based on a log-likelihood function. The quality assessment can be used to determine if the output of the algorithm is trustworthy or untrustworthy. When the algorithm is operating on edge cases in the input data that cause the algorithm to generate incorrect assessments, the quality assessment generates an indication that further assessment is recommended.
- the quality assessment can include a training stage and a clinical application stage.
- a model of a training data set is constructed, and a decision metric (e.g., including thresholds and functions) is defined that creates a trust output.
- a method can be performed during the training stage. The method includes generating a training embedding model representative of training data used to develop an algorithm. The method also includes generating a decision metric for determining if a similarity between input data to the algorithm and the training data is sufficient for the algorithm to generate an output that is valid based on patterns described by the training embedding model accounting for the input data.
- the training embedding model and the decision metric are applied to input data (e.g., data including clinical measurement vectors) during the clinical application stage to provide a trust assessment of the quality of an output of the algorithm for evaluating the trustworthiness of the output of the algorithm.
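As a purely illustrative sketch (the Gaussian embedding, the Mahalanobis distance, and the 95th-percentile threshold are assumptions for illustration, not the patent's implementation), a training embedding model paired with a threshold-based decision metric that yields a trust output might look like:

```python
import numpy as np

# Hypothetical sketch: model the training data as a Gaussian "embedding"
# and define a decision metric as a distance threshold on that model.
rng = np.random.default_rng(0)
train = rng.normal(loc=0.0, scale=1.0, size=(500, 4))  # stand-in training set

mean = train.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(train, rowvar=False))

def mahalanobis(x):
    """Distance of an input vector from the training-data distribution."""
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

# Decision metric: trust inputs whose distance falls inside the bulk
# (here the 95th percentile) of the training distribution.
train_dists = np.array([mahalanobis(x) for x in train])
threshold = np.percentile(train_dists, 95)

def trust_output(x):
    return mahalanobis(x) <= threshold

print(trust_output(np.zeros(4)))       # near the training mean -> True
print(trust_output(np.full(4, 10.0)))  # far from the training data -> False
```

The design intuition matches the passage above: inputs that resemble the training data pass the metric, while out-of-distribution inputs are flagged as untrustworthy regardless of what the black-box algorithm outputs.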
- the algorithm can be treated as a black box algorithm during the training and clinical application stages.
- a method can be performed during the clinical application stage that includes generating an assessment of a relationship between input data to the algorithm and training data used to train the algorithm using the training embedding model.
- the training embedding model includes features of the training data.
- the method also includes determining a similarity between the input data and the training data for evaluating a trustworthiness of the output of the algorithm with the decision metric that processes the assessment.
- two configurations of the quality assessment can be used during the training and clinical application stages.
- a direct training data assessment is performed.
- the algorithm that the quality assessment is paired with is not used during the development of the quality assessment. This quality assessment can determine how similar the input data is to the training data.
- the algorithm output is included as one of the inputs in conjunction with the training data embedding model when the decision metric estimates the reliability of the algorithm output.
- FIG. 2A is a diagram that depicts examples of training configurations of a system that can generate a training embedding model and a decision metric during a training stage.
- the system of FIG. 2A includes an optional augmentation stage 202, an optional preprocessing stage 203, a training embedding model 204, and a decision metric 205.
- FIG. 2A also illustrates a training data set 201, a switch 207, and a trust value 206 generated by the decision metric 205.
- the system of FIG. 2A can be used in multiple different training configurations that are illustrated graphically in FIG. 2A by switch 207.
- the switch 207 can be adjusted to select one of the training configurations. Any one of these training configurations can be used during the training stage.
- the augmentation stage 202 can be optionally used in various training configurations to expand the training data set 201 using one or more data augmentation transformations.
- the switch 207 is adjusted to prevent the training data set 201 from being provided directly to training embedding model 204. Instead, augmentation stage 202 performs data augmentation on the training data set 201 to generate an augmented training data set.
- the augmentation stage 202 is used to extend the training data set 201 used to develop the training embedding model 204 beyond the training data set 201 used in the development of a pre-trained algorithm.
- the switch 207 can be optionally adjusted to provide the augmented training data set to the pre-processing stage 203 or directly to the training embedding model 204 according to various training configurations.
- An example of a data augmentation transformation that can be performed at the augmentation stage 202 involves adding Gaussian white noise, with a mean of zero, to measurements in the training data set 201.
- Another example of a data augmentation transformation that can be performed at the augmentation stage 202 involves re-quantizing measurements in the training data set 201 at variable bit depths.
- Yet another example of a data augmentation transformation that can be performed at the augmentation stage 202 involves selective fouling or removal of a part of the measurement data in the training data set 201.
- Yet another example of a data augmentation transformation that can be performed at the augmentation stage 202 involves adding scaling to the measurements to account for gain inaccuracies in the training data set 201.
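The four augmentation transformations above could be sketched as follows; this is an illustrative rendering only, and the array shapes, noise level, bit depth, and gain range are assumptions rather than values taken from the patent:

```python
import numpy as np

rng = np.random.default_rng(42)
# Stand-in for TIM-like training measurements (8 vectors of 22 values).
measurements = rng.uniform(0.1, 1.0, size=(8, 22))

# 1) Add zero-mean Gaussian white noise.
noisy = measurements + rng.normal(0.0, 0.01, size=measurements.shape)

# 2) Re-quantize at a variable bit depth (here 8 bits over the data range).
bits = 8
levels = 2 ** bits - 1
lo, hi = measurements.min(), measurements.max()
quantized = np.round((measurements - lo) / (hi - lo) * levels) / levels * (hi - lo) + lo

# 3) Selectively remove ("foul") part of each measurement vector.
fouled = measurements.copy()
fouled[:, rng.choice(measurements.shape[1], size=3, replace=False)] = np.nan

# 4) Apply random per-vector scaling to mimic gain inaccuracies.
scaled = measurements * rng.uniform(0.9, 1.1, size=(measurements.shape[0], 1))

# The augmented set extends the original training data.
augmented = np.concatenate([measurements, noisy, quantized, scaled])
print(augmented.shape)
```

Each transformation produces variants that are plausible under real measurement conditions, which is what lets the augmented set extend the training embedding model beyond the data used to develop the pre-trained algorithm.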
- the pre-trained target algorithm 213 then processes either the augmented training data set generated by augmentation stage 202 or the transformed training data generated by pre-processing stage 203 to generate an outcome 214.
- the outcome 214 is provided to an input of the decision metric 205 through path 311.
- the decision metric 205 is generated using the training embedding model 204 and the outcome 214.
- the outcome 214 of the algorithm 213 is used to train the decision metric 205 to generate the trust value 206.
- the algorithm 213 may in some embodiments receive continuous input training data prior to generating categorical outcomes 214. Examples of the decision metric 205 are disclosed above with respect to FIG. 2A.
- Figure 3B is a diagram that depicts examples of clinical configurations of a system that can be used to generate a quality assessment of an outcome of an algorithm during a clinical application stage using the outcome of the algorithm.
- the system of FIG. 3B includes the optional pre-processing stage 203, the training embedding model 204, the decision metric 205, and the pre-trained target algorithm 213.
- the pre-processing stage 203 and the training embedding model 204 function as described above with respect to FIG. 2B.
- FIG. 3B also illustrates a clinical data set 211, switch 212, the trust output 206 generated by the decision metric 205, and the outcome 214 of the algorithm 213.
- switch 212 in the system of FIG. 3B can be adjusted to select one of two clinical configurations. Any one of these clinical configurations can be used during the clinical application stage.
- the pre-trained target algorithm 213 in the system of FIG. 3B generates the outcome 214 based on the clinical data set 211 and/or the transformed data generated by the pre-processing stage 203, depending on the state of the switch 212.
- FIG. 4A is a diagram that depicts an example of a training configuration of a system that can generate a principal components analysis transform and residuals for evaluating the trustworthiness of an outcome of an algorithm during a training stage.
- the system of FIG. 4A includes an extract principal components stage 402, a comparison 403 of the explained variance, a principal component weights stage 404, a principal components analysis (PCA) transform stage 405, an inverse transform stage 407, a residuals stage 406, and a trust estimator 445.
- FIG. 4A also illustrates a training data set 401 and a target algorithm 410.
- the target algorithm 410 can be a machine learning (ML) algorithm that has been pre-trained with the training data set 401 or any other type of algorithm.
- ML machine learning
- the training data set 401 is provided to the target algorithm 410.
- the target algorithm 410 generates outcomes using the training data set 401. Examples of the outcomes of the algorithm 410 are illustrated as positives and negatives in graph 420 shown in Figure 4B. These examples are not intended to be limiting.
- the algorithm 410 may generate the positives in graph 420 to indicate electrode arrays of the cochlear implants that have buckled or folded based on the corresponding TIMs in the training data set 401, and the algorithm 410 may generate the negatives in graph 420 to indicate electrode arrays of the cochlear implants that have not buckled or folded based on the corresponding TIMs in the training data set 401.
- the training data set 401 is also provided to the PCA transform stage 405, to the residuals stage 406, and to the extract principal components stage 402.
- the extract principal components stage 402 extracts the principal components from the training data set 401 using a principal components analysis (PCA) algorithm.
- PCA principal components analysis
- the principal components extracted from the training data set 401 at stage 402 correspond to the vectors in the training data set 401 that explain the most variance in the training data set 401.
- the principal components extracted from the training data set 401 are then compared at comparison 403 to determine if the explained variance is greater than a percentage k%. If the explained variance is not greater than k% at comparison 403, then additional principal components are extracted from the training data set 401 at extract principal components stage 402 to increase the number of extracted principal components, and the additional principal components extracted at stage 402 are compared again at comparison 403.
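Illustratively, assuming a standard SVD-based PCA (the patent does not mandate this exact procedure, and the data shapes and k value below are assumptions), extracting just enough principal components to explain more than k% of the variance might look like:

```python
import numpy as np

rng = np.random.default_rng(1)
# Stand-in training set: 200 measurement vectors of length 16 with
# low-dimensional correlated structure plus a little noise.
basis = rng.normal(size=(3, 16))
train = rng.normal(size=(200, 3)) @ basis + 0.05 * rng.normal(size=(200, 16))

k = 95.0  # target explained variance, in percent
centered = train - train.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = np.cumsum(s**2) / np.sum(s**2) * 100.0  # cumulative, ascending

# Keep adding components until the explained variance exceeds k%
# (the loop of stage 402 / comparison 403, done here in one step).
n_components = int(np.searchsorted(explained, k) + 1)
components = vt[:n_components]  # the extracted principal components

print(n_components, explained[n_components - 1] > k)
```

Here the iterative extract-and-compare loop is collapsed into a cumulative-variance search, which yields the same stopping point as adding one component at a time.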
- principal component weights are generated at principal component weights stage 404 for the principal components extracted from the training data set 401 at stage 402.
- the principal component weights generated at stage 404 are a description of the vectors that explain most of the variance in the training data set 401.
- the principal component weights generated at stage 404 are then provided to the PCA transform stage 405.
- the PCA transform stage 405 then multiplies each of the principal component weights received from stage 404 by a corresponding value in the training data set 401 and then adds the results of these multiplications together to generate a reconstruction.
- the PCA transform stage 405 can multiply each of the principal components by a gain for a corresponding TIM in the training data set 401 to generate a result, and then add all of the results together to generate the reconstruction.
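The reconstruction of stage 405 (weighting each principal component and summing the results) and the residual of stage 406 could be sketched as below; the component count and data shapes are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
# Stand-in data and components (in practice these come from the PCA of
# the training data set, as described above).
train = rng.normal(size=(100, 8))
mean = train.mean(axis=0)
_, _, vt = np.linalg.svd(train - mean, full_matrices=False)
components = vt[:3]  # hypothetical retained principal components

def reconstruct(x, mean, components):
    """Weight each principal component by its projection onto the input
    and add the weighted components together (stage 405)."""
    weights = components @ (x - mean)   # one weight per component
    return mean + weights @ components  # weighted sum of components

x = train[0]
recon = reconstruct(x, mean, components)

# Residual (stage 406): the part of the vector that the retained
# components fail to explain; large residuals flag unfamiliar inputs.
residual = float(np.linalg.norm(x - recon))
print(residual >= 0.0)
```

Because the reconstruction is a projection onto the retained components, the residual can never exceed the distance of the input from the training mean, so it behaves as a bounded measure of unfamiliarity.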
- a trust estimator 445 then generates trust values 446 for the outcomes of the target algorithm 410 based on the values of the residuals generated by residuals stage 406 and based on the outcomes of the target algorithm 410.
- the trust values 446 indicate the trustworthiness of the corresponding outcomes of the target algorithm 410.
- the trust estimator 445 uses the outcomes of the algorithm 410 and the residuals to generate trust values 446 that indicate the reliability of the outcomes of the algorithm 410.
- the trust values 446 may indicate which outcomes of the algorithm 410 deviate substantially from expected outcomes and which outcomes of the algorithm 410 correlate with the expected outcomes.
- the trust estimator 445 functions as the decision metric 205 in this embodiment.
- the graph 420 shown in Figure 4B also illustrates examples of the trust values 446.
- the trust values 446 can be true or false values. The true values are shown in the left half of graph 420, and the false values are shown in the right half of graph 420. In the example of FIG. 4B, the false values are most likely to be generated in the middle range of the residuals around 0.4.
- true negatives in graph 420 indicate that the corresponding negative outcomes of the target algorithm 410 are likely to be correct.
- true positives in graph 420 indicate that the corresponding positive outcomes of the target algorithm 410 are likely to be correct.
- false positives in graph 420 indicate that the corresponding positive outcomes of the target algorithm 410 are likely to be incorrect.
- Figure 4C illustrates graphs 421-422 of probability curves that are examples of the trust values 446 generated by the trust estimator 445.
- Graph 421 illustrates examples of the probabilities of correctness of positive outcomes generated by the algorithm 410 (i.e., trust of positive outcomes) based on the residuals.
- Graph 422 illustrates examples of the probabilities of correctness of negative outcomes generated by the algorithm 410 (i.e., trust of negative outcomes) based on the residuals.
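One way to picture the probability curves of graphs 421-422 is as complementary sigmoid functions of the residual: trust of positive outcomes falls and trust of negative outcomes rises as the residual grows. The midpoint of 0.4 echoes the example of FIG. 4B, but the exact curve shapes and parameters here are assumptions, not the patent's specification:

```python
import numpy as np

def trust_of_positive(residual, midpoint=0.4, steepness=10.0):
    # Hypothetical probability that a positive outcome is correct:
    # assumed to fall off as the residual grows (cf. graph 421).
    return 1.0 / (1.0 + np.exp(steepness * (residual - midpoint)))

def trust_of_negative(residual, midpoint=0.4, steepness=10.0):
    # Hypothetical probability that a negative outcome is correct:
    # assumed to rise as the residual grows (cf. graph 422).
    return 1.0 / (1.0 + np.exp(-steepness * (residual - midpoint)))

residuals = np.linspace(0.0, 1.0, 5)
pos = trust_of_positive(residuals)
neg = trust_of_negative(residuals)
```

With this particular parameterization the two curves sum to one at every residual, consistent with treating them as complementary probabilities.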
- PCA transform stage 405 multiplies each of the principal component weights by a corresponding value in clinical data set 450 and then adds the results of these multiplications together to generate a reconstruction.
- the reconstruction generated by PCA transform stage 405 is provided to the inverse transform stage 407.
- the inverse transform stage 407 then inverts the reconstruction generated by PCA transform stage 405 to generate an inverted reconstruction, which is provided to residuals stage 406.
- Residuals stage 406 subtracts the inverted reconstruction from the corresponding values in the clinical data set 450 to generate residuals.
- the residuals may, for example, correspond to the differences between the observed values and the estimated values in clinical data set 450.
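The clinical-stage pipeline of stages 405-407 (transform, inverse transform, residuals) can be sketched as a PCA reconstruction error: the components are fitted on training data and then applied to clinical items, so items unlike the training data leave large residuals. The data, dimensions, and residual norm here are synthetic stand-ins chosen for illustration:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
latent = rng.normal(size=(200, 3))
mixing = rng.normal(size=(3, 16))
X_train = latent @ mixing                      # stand-in training data set 401
X_clinical = rng.normal(size=(5, 3)) @ mixing  # items like the training data
X_novel = rng.normal(size=(5, 16))             # items unlike the training data

pca = PCA(n_components=3).fit(X_train)

def residuals(X):
    # Reconstruct each item from the training-derived components
    # (stages 405 and 407) and measure what they fail to explain
    # (stage 406): residual = ||observed - reconstructed||.
    return np.linalg.norm(X - pca.inverse_transform(pca.transform(X)), axis=1)

r_in = residuals(X_clinical)
r_out = residuals(X_novel)
```

Items drawn from the same structure as the training data reconstruct almost perfectly, while novel items leave substantially larger residuals, which is what lets the residuals drive the trust estimate.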
- the outcome 451 of the algorithm 410 and the residuals generated at residuals stage 406 are then passed to a trust estimator 445.
- the trust estimator 445 compares the probability curves for the trust values generated in the training configuration of FIG. 4A to the residuals generated at residuals stage 406 in the clinical configuration of FIG. 4D for each outcome 451 to generate trust values 452 for the residuals.
- Each of the trust values 452 for the residuals indicates the trustworthiness of a corresponding outcome 451 of the target algorithm 410.
- the trust values 452 can be, for example, true or false values.
- the trust values 452 indicate whether the clinical data set 450 is sufficiently similar to the training data set 401 for the target algorithm 410 to produce a valid outcome 451.
- the graphs 421-422 of the probability curves for the trust values generated during the training stage of FIG. 4A and shown in FIG. 4C illustrate examples that can be used by the trust estimator 445.
- FIG. 5B is a diagram that depicts an example of a clinical configuration of a system that can be used to generate a quality assessment of an outcome of a target algorithm using a K nearest neighbors algorithm during a clinical application stage.
- the system of FIG. 5B includes the K nearest neighbors stage 503, the errors and correlation stage 502, and trust estimator 545.
- FIG. 5B also illustrates a clinical data set 521, training data set 522, the target algorithm 531, the outcome 532 of the target algorithm 531, a trust value 523, and graphs 521-522.
- a trust estimator 545 compares the PDF of the errors and correlations for the items in the clinical data set 521 with the errors and correlations in the training data set 522 to generate trust values 523 that indicate trustworthiness of the outcome 532 of the algorithm 531.
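A hedged sketch of this kind of K-nearest-neighbors check: compare a clinical item's mean distance to its K nearest training neighbors against the distribution of such distances within the training data set itself. The 99th-percentile decision rule, K = 5, and all data here are assumptions for illustration, not the claimed method:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(2)
X_train = rng.normal(size=(300, 8))            # stand-in training data set 522
x_similar = X_train[0] + 0.01 * rng.normal(size=8)
x_novel = X_train[0] + 10.0 * rng.normal(size=8)

K = 5
nn = NearestNeighbors(n_neighbors=K).fit(X_train)

# Reference distribution: each training item's mean distance to its K
# nearest neighbors (excluding itself).
d_train, _ = nn.kneighbors(X_train, n_neighbors=K + 1)
reference = d_train[:, 1:].mean(axis=1)

def trust(x, quantile=0.99):
    # Hypothetical decision rule: trust is True when the clinical item's
    # mean K-NN distance falls inside the bulk of the training distribution.
    d, _ = nn.kneighbors(x.reshape(1, -1))
    return d.mean() <= np.quantile(reference, quantile)
```

An item close to the training data passes the check; an item far from it fails, signaling that the target algorithm's outcome for that item should not be trusted.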
- the trust estimator 616 assigns a trust value 617 to the pre-processed clinical data features received from stage 603 based on the features in the model 604 generated during the training stage of FIG. 6A.
- the trust estimator 616 assigns a trust value 617 to the pre-processed clinical data features that indicates an estimate of the probability of occurrence of that set of pre-processed clinical data features based on the probability mass functions, distributions, covariances, and/or joint probability mass functions of the features in the training data sets 601-602 in the model 604.
- the trust value 617 for the pre-processed clinical data features indicates how similar these features are to the features from the training data sets that are represented in the model 604.
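One hedged way to realize such a probability-of-occurrence estimate is to model the training features with a multivariate Gaussian and compare a clinical feature vector's likelihood against the likelihoods of the training features themselves. The Gaussian assumption and the 1% tail threshold are illustrative choices, not the patent's specification:

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(3)
# Stand-in for the pre-processed training features behind model 604.
features_train = rng.normal(loc=2.0, scale=0.5, size=(500, 4))

# Model 604 sketch: the mean and covariance of the training features.
mean = features_train.mean(axis=0)
cov = np.cov(features_train, rowvar=False)
model = multivariate_normal(mean=mean, cov=cov)

def trust_value(features, threshold_quantile=0.01):
    # Hypothetical trust value 617: the clinical feature vector is trusted
    # when its likelihood under the training model is not in the extreme
    # low tail of the training likelihoods.
    threshold = np.quantile(model.logpdf(features_train), threshold_quantile)
    return model.logpdf(features) >= threshold

typical = mean.copy()          # a feature vector at the training mean
atypical = np.full(4, 10.0)    # far outside the training distribution
```

A feature vector near the center of the training distribution receives a high probability of occurrence and is trusted; one far outside it is not.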
- the generator loss 705 indicates the error in the output of the generative network 706.
- the generative network 706 uses the generator loss 705 in backpropagation to adjust the weights in the nodes in its neural network in each iteration of the training stage of FIG. 7A to reduce the error.
- the discriminator loss 704 indicates the error in the output of the discriminator network 703.
- the discriminator network 703 uses the discriminator loss 704 in backpropagation to adjust the weights in the nodes in its neural network in each iteration of the training stage of FIG. 7A to reduce the error. After enough iterations of the training stage, the training stage is complete, and the discriminator network 703 can be used in a clinical application, as disclosed herein, for example, with respect to FIG. 7B.
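The alternating loss/backpropagation loop described above can be sketched with a deliberately tiny one-dimensional GAN: a logistic-unit discriminator and an affine generator, with the gradients written out by hand. Every numeric choice here (data distribution, learning rate, iteration count, network size) is an assumption for illustration, not the networks of FIG. 7A:

```python
import numpy as np

rng = np.random.default_rng(4)
sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))

# Hypothetical "real" training data: samples clustered around 3.0.
real = lambda n: rng.normal(3.0, 0.5, n)

w_d, b_d = 0.1, 0.0      # discriminator 703 parameters (one logistic unit)
w_g, b_g = 1.0, 0.0      # generative network 706 parameters (affine map of noise)
lr, batch = 0.05, 64

for step in range(2000):
    x_real = real(batch)
    z = rng.normal(size=batch)
    x_fake = w_g * z + b_g

    # Discriminator loss 704: binary cross-entropy on real vs. generated
    # samples, backpropagated to w_d, b_d to reduce the error.
    a_real = w_d * x_real + b_d
    a_fake = w_d * x_fake + b_d
    grad_real = sigmoid(a_real) - 1.0          # d(-log D(real))/d logit
    grad_fake = sigmoid(a_fake)                # d(-log(1 - D(fake)))/d logit
    w_d -= lr * (grad_real @ x_real + grad_fake @ x_fake) / batch
    b_d -= lr * (grad_real.sum() + grad_fake.sum()) / batch

    # Generator loss 705: -log D(fake), backpropagated through the
    # discriminator into w_g, b_g to reduce the generator's error.
    a_fake = w_d * x_fake + b_d
    grad_g = (sigmoid(a_fake) - 1.0) * w_d     # d(-log D(fake))/d x_fake
    w_g -= lr * (grad_g @ z) / batch
    b_g -= lr * grad_g.sum() / batch
```

After enough iterations the generator's output drifts toward the real data, and the trained discriminator can then be reused on its own, as in the clinical configuration of FIG. 7B.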
- FIG. 7B is a diagram that depicts an example of a clinical configuration of a system that can be used to generate a quality assessment of an outcome of a target algorithm using a discriminator network during a clinical application stage.
- the system of FIG. 7B includes the discriminator network 703 and a target algorithm 724.
- FIG. 7B also illustrates clinical data set 721, the outcome 725 of the target algorithm 724, and one or more trust values 723.
- the target algorithm 724 can be a pre-trained ML algorithm or any other type of algorithm.
- the target algorithm 724 processes the clinical data set 721 to generate the outcome 725.
- the discriminator network 703 is used as the training embedding model 204 and the decision metric 205 after the discriminator network 703 has been trained by the training stage of FIG. 7A.
- the clinical data set 721 is provided to the discriminator network 703. If the discriminator network 703 has been successfully trained by the training stage of FIG. 7A, then the discriminator network 703 can be utilized alongside target algorithm 724 to generate a trust value 723 that estimates how likely it is that the algorithm 724 has previously encountered data similar to the clinical data set 721.
- the output of the discriminator network 703 can be provided as an input to the target algorithm 724.
- target algorithm 724 can be trained to generate outcomes 725 that indicate whether TIMs generated from electrode arrays in cochlear implants are indicative of buckling or folding.
- the discriminator network 703 is trained using TIMs generated from electrode arrays of cochlear implants during the training stage of FIG. 7A.
- the discriminator network 703 generates a trust value 723 that indicates whether the outcomes 725 regarding the TIMs are trustworthy based on whether the TIMs in the clinical data set 721 are similar enough to the TIMs in the training data set.
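A minimal sketch of this clinical use: any trained discriminator that outputs the probability an input resembles the training data can be thresholded into a boolean trust value. Here a simple logistic-regression classifier stands in for discriminator network 703, and all data are synthetic stand-ins for TIM feature vectors; the 0.5 threshold is an assumed decision rule:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
# Stand-ins: "training-like" TIM feature vectors vs. broad background noise.
tims_train = rng.normal(loc=1.0, scale=0.2, size=(300, 6))
background = rng.normal(loc=0.0, scale=2.0, size=(300, 6))

# Stand-in for the trained discriminator network 703: any model that
# outputs the probability that an input resembles the training data.
X = np.vstack([tims_train, background])
y = np.r_[np.ones(300), np.zeros(300)]
disc = LogisticRegression(max_iter=1000).fit(X, y)

def trust_value(clinical_item, threshold=0.5):
    # Trust value 723 sketch: True when the discriminator judges the
    # clinical item similar enough to the training data for the target
    # algorithm's outcome to be trusted.
    return disc.predict_proba(clinical_item.reshape(1, -1))[0, 1] >= threshold

familiar = rng.normal(loc=1.0, scale=0.2, size=6)   # like the training TIMs
unfamiliar = np.full(6, -5.0)                       # unlike anything seen
```

A clinical item resembling the training TIMs yields a trust value of True; one unlike the training data yields False, flagging the corresponding outcome 725 as untrustworthy.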
- Figure 8 is a diagram that illustrates an example of a computing system 800 within which one or more of the disclosed embodiments can be implemented.
- for example, the computing system 800 can be used to generate and provide control signals to one or more of the medical devices or systems disclosed herein with respect to FIGS. 1A-7B.
- Computing systems, environments, or configurations that can be suitable for use with examples described herein include, but are not limited to, personal computers, server computers, hand-held devices, laptop devices, multiprocessor systems, microprocessor-based systems, programmable consumer electronics (e.g., smart phones), network computers, minicomputers, mainframe computers, tablets, distributed computing environments that include any of the above systems or devices, and the like.
- the computing system 800 can be a single virtual or physical device operating in a networked environment over communication links to one or more remote devices.
- the remote device can be an auditory prosthesis (e.g., the device or system of any one of FIGS. 1A-1D), a personal computer, a server, a router, a network personal computer, a peer device, or other common network node.
- Computing system 800 includes at least one processing unit 802 and memory 804.
- the processing unit 802 includes one or more hardware or software processors (e.g., Central Processing Units) that can obtain and execute instructions.
- the processing unit 802 can communicate with and control the performance of other components of the computing system 800.
- the memory 804 is one or more software-based or hardware-based computer- readable storage media operable to store information accessible by the processing unit 802.
- the memory 804 can store instructions executable by the processing unit 802 to implement applications or cause performance of operations described herein, as well as store other data.
- the memory 804 can be volatile memory (e.g., random access memory or RAM), non-volatile memory (e.g., read-only memory or ROM), or combinations thereof.
- the memory 804 can include transitory memory or non-transitory memory.
- the memory 804 can also include one or more removable or non-removable storage devices.
- the memory 804 can include non-transitory computer readable storage media, such as RAM, ROM, EEPROM (Electronically-Erasable Programmable Read-Only Memory), flash memory, optical disc storage, magnetic storage, solid state storage, or any other memory media usable to store information for later access.
- the memory 804 encompasses a modulated data signal (e.g., a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal), such as a carrier wave or other transport mechanism and includes any information delivery media.
- the memory 804 can include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio-frequency, infrared and other wireless media or combinations thereof.
- the system 800 further includes a network adapter 806, one or more input devices 808, and one or more output devices 810.
- the system 800 can include other components, such as a system bus, component interfaces, a graphics system, a power source (e.g., a battery), among other components.
- the network adapter 806 is a component of the computing system 800 that provides network access to network 812.
- the network adapter 806 can provide wired or wireless network access and can support one or more of a variety of communication technologies and protocols, such as Ethernet, cellular, Bluetooth, near-field communication, and RF (radio frequency), among others.
- the network adapter 806 can include one or more antennas and associated components configured for wireless communication according to one or more wireless communication technologies and protocols.
Abstract
A method is disclosed that includes generating a model representative of first data used to train an algorithm, and generating a decision metric for determining whether a similarity between second data input into the algorithm and the first data is sufficient for the algorithm to generate a valid output, based on patterns of the first data identified in the model in view of the second data.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202363610202P | 2023-12-14 | 2023-12-14 | |
| US63/610,202 | 2023-12-14 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025125991A1 true WO2025125991A1 (fr) | 2025-06-19 |
Family
ID=96056605
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2024/062229 Pending WO2025125991A1 (fr) | 2023-12-14 | 2024-12-04 | Techniques d'évaluation de qualité de résultats d'algorithmes |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025125991A1 (fr) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP3786966A1 (fr) * | 2019-08-26 | 2021-03-03 | F. Hoffmann-La Roche AG | Validation automatisée de données médicales |
| CN115295134B (zh) * | 2022-09-30 | 2023-03-24 | 北方健康医疗大数据科技有限公司 | 医学模型评价方法、装置和电子设备 |
| US20230177059A1 (en) * | 2020-02-12 | 2023-06-08 | American Express Travel Related Services Company, Inc. | Computer-based systems for data entity matching detection based on latent similarities in large datasets and methods of use thereof |
| WO2023122229A2 (fr) * | 2021-12-24 | 2023-06-29 | BeeKeeperAI, Inc. | Systèmes et procédés de validation et de transformation de données, d'obscurcissement de données, de validation d'algorithme et de fusion de données en environnement zéro confiance |
| US20230215531A1 (en) * | 2020-06-16 | 2023-07-06 | Nuvasive, Inc. | Intelligent assessment and analysis of medical patients |
- 2024-12-04: WO PCT/IB2024/062229 patent/WO2025125991A1 — active, Pending
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24903063; Country of ref document: EP; Kind code of ref document: A1 |