CN119399202A - Wound assessment monitoring method and device - Google Patents
- Publication number
- CN119399202A (application CN202510002108.2A)
- Authority
- CN
- China
- Prior art keywords
- wound
- different
- image features
- similarity
- determining
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
- G06T7/0014—Biomedical image inspection using an image reference approach
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/107—Measuring physical dimensions, e.g. size of the entire body or parts thereof
- A61B5/1075—Measuring physical dimensions, e.g. size of the entire body or parts thereof for measuring dimensions by non-invasive methods, e.g. for determining thickness of tissue layer
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/107—Measuring physical dimensions, e.g. size of the entire body or parts thereof
- A61B5/1077—Measuring of profiles
- A61B5/1078—Measuring of profiles by moulding
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/44—Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
- A61B5/441—Skin evaluation, e.g. for skin disorder diagnosis
- A61B5/445—Evaluating skin irritation or skin trauma, e.g. rash, eczema, wound, bed sore
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/46—Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/54—Extraction of image or video features relating to texture
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/56—Extraction of image or video features relating to colour
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/757—Matching configurations of points or features
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/761—Proximity, similarity or dissimilarity measures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/10—Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Theoretical Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Medical Informatics (AREA)
- General Physics & Mathematics (AREA)
- Artificial Intelligence (AREA)
- Multimedia (AREA)
- Evolutionary Computation (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Molecular Biology (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Heart & Thoracic Surgery (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Software Systems (AREA)
- Databases & Information Systems (AREA)
- Computing Systems (AREA)
- Signal Processing (AREA)
- Psychiatry (AREA)
- Physiology (AREA)
- Mathematical Physics (AREA)
- Fuzzy Systems (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Dentistry (AREA)
- Dermatology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Quality & Reliability (AREA)
- Image Analysis (AREA)
Description
Technical Field
The invention belongs to the technical field of medical devices, and in particular relates to a wound assessment and monitoring method and device.
Background
To assess and monitor the wound healing process, invention patent application CN202410474930.4, "Method and system for realizing wound healing monitoring", reconstructs a three-dimensional model from a point cloud data set to obtain three-dimensional wound data, divides the wound into three regions to obtain the wound type, establishes a recovery function, and generates predictive evaluation parameters, so that the wound condition can be monitored more comprehensively. However, the following technical problem remains:
Although constructing a three-dimensional model of the wound allows accurate evaluation of its recovery, the complexity of such assessment and monitoring is inevitably high. If a differentiated assessment and monitoring method cannot be generated for different wound types, the efficiency of wound assessment and processing cannot be improved while maintaining assessment accuracy.
To address this technical problem, the invention provides a wound assessment and monitoring method and device.
Disclosure of Invention
In order to achieve the purpose of the invention, the invention adopts the following technical scheme:
According to one aspect of the invention, a wound assessment and monitoring method and device are provided.
A wound assessment and monitoring method specifically comprises the following steps:
acquiring wound image features of different parts from wound images of a wound, and, when analysis of the wound image features shows that a matching wound type exists, taking the patients corresponding to the matching wound type as history-matched patients;
when the similarity conditions show that the distinguishing accuracy between different development stages meets the requirement, determining the identification-accurate image features of the different development stages based on the wound image features of the history-matched patients in those stages of the wound;
acquiring the similarity conditions between the identification-accurate image features of different development stages and other wound types, and determining the wound assessment and monitoring method in combination with the distinguishing accuracy between the different development stages.
The beneficial effects of the invention are:
Based on the wound image features of history-matched patients in different development stages of the wound, the identification-accurate image features of each stage are determined, so that features that remain distinguishable under the similarity between one development stage and the others are identified. This lays the foundation for determining the wound assessment and monitoring method from the identification-accurate image features and also improves the accuracy of wound identification for the patient.
By determining the wound assessment and monitoring method from the deviation between the identification-accurate image features and other wound types in different development stages, together with the distinguishing accuracy between stages, misidentification as another wound type caused by relying on the identification-accurate image features is avoided. The accuracy of image-based wound-type identification is thus evaluated from multiple aspects, laying the foundation for assessing and monitoring the wound either with point cloud data or with image recognition.
In a further technical scheme, the wound image features comprise color features, texture features, shape features and spatial relationship features.
In a further technical scheme, determining that a matching wound type exists for the wound comprises:
determining the degree of similarity between the wound image features of different parts and different wound types according to the analysis result of the wound image features;
determining similar image features among the wound image features of different parts according to the degree of similarity with different wound types;
determining whether the wound has a matching wound type from the number of image features similar to each wound type.
In a further technical scheme, the similar image features are determined from the degree of similarity to the matching image features of the corresponding wound type; specifically, image features whose degree of similarity is greater than a preset similarity coefficient are the similar image features.
In a further technical scheme, determining whether a matching wound type exists from the number of similar image features comprises:
when there is at least one wound type whose number of similar image features is greater than the preset feature number, the wound type with the largest number of similar image features is the matching wound type.
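The counting rule above can be sketched as follows. The threshold values, the dict layout, and the wound-type names are illustrative assumptions; the patent only states that both thresholds are preset.

```python
# Sketch of the matching-wound-type rule: an image feature is "similar" to a
# wound type when its similarity degree exceeds a preset coefficient, and a
# type can match only when it has more similar features than a preset count.
# Threshold values and the dict layout below are assumptions, not from the text.

SIMILARITY_COEFF = 0.7   # preset similarity coefficient (assumed value)
MIN_FEATURES = 3         # preset feature-number threshold (assumed value)

def matching_wound_type(similarities):
    """similarities: {wound_type: [similarity degree per image feature]}."""
    counts = {
        wound_type: sum(1 for s in degrees if s > SIMILARITY_COEFF)
        for wound_type, degrees in similarities.items()
    }
    # keep only wound types whose similar-feature count exceeds the preset number
    candidates = {t: n for t, n in counts.items() if n > MIN_FEATURES}
    if not candidates:
        return None  # the wound has no matching wound type
    # the type with the largest number of similar image features matches
    return max(candidates, key=candidates.get)

example = {
    "abrasion":   [0.9, 0.8, 0.75, 0.72, 0.3],
    "laceration": [0.71, 0.75, 0.2, 0.4, 0.5],
    "burn":       [0.95, 0.9, 0.85, 0.8, 0.78],
}
print(matching_wound_type(example))  # "burn" has 5 similar features, the most
```

Returning `None` when no type clears the feature-count gate mirrors the "no matching wound type" branch described later in the embodiments.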
In a further technical scheme, the method for determining the wound assessment and monitoring method comprises:
determining similarity coefficients between the identification-accurate image features of different development stages and other wound types according to their similarity conditions, determining the identification deviation probability of each development stage from the similarity coefficients, and determining the comprehensive identification deviation probability of the wound as the average of the identification deviation probabilities of the different development stages;
determining the comprehensive distinguishing accuracy of the wound from the distinguishing accuracy between different development stages;
determining the identification reliability coefficient of the wound as the ratio of the comprehensive distinguishing accuracy to the comprehensive identification deviation probability, and determining the wound assessment and monitoring method according to the identification reliability coefficient.
In a further technical scheme, determining the wound assessment and monitoring method according to the identification reliability coefficient specifically comprises:
when the identification reliability coefficient is smaller than a preset reliability coefficient threshold, assessing and monitoring the wound using point cloud data;
when the identification reliability coefficient is not smaller than the preset reliability coefficient threshold, assessing and monitoring the wound using image recognition.
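A minimal sketch of this selection rule, assuming the per-stage deviation probabilities and the comprehensive distinguishing accuracy have already been computed; the threshold value of 2.0 is an arbitrary stand-in for the preset reliability coefficient threshold.

```python
# Sketch of the reliability-coefficient rule: the comprehensive identification
# deviation probability is the mean of the per-stage deviation probabilities,
# and the reliability coefficient is the comprehensive distinguishing accuracy
# divided by that mean. The threshold value is an assumption.

RELIABILITY_THRESHOLD = 2.0  # stand-in for the preset reliability threshold

def choose_monitoring_method(stage_deviation_probs, comprehensive_accuracy):
    mean_deviation = sum(stage_deviation_probs) / len(stage_deviation_probs)
    reliability = comprehensive_accuracy / mean_deviation
    if reliability < RELIABILITY_THRESHOLD:
        return "point cloud data"   # image identification too unreliable
    return "image recognition"      # cheaper mode is reliable enough

# four development stages: coagulation, inflammation, repair, maturation
print(choose_monitoring_method([0.2, 0.3, 0.25, 0.25], 0.9))
```

In effect, image recognition is chosen only when the stages are much easier to tell apart than to confuse with other wound types; otherwise the costlier point-cloud route is taken.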
In a second aspect, the invention provides a computer device comprising a memory, a processor in communication with the memory, and a computer program stored on the memory and runnable on the processor; when running the computer program, the processor executes the wound assessment and monitoring method described above.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by practice of the invention as set forth hereinafter.
In order to make the above objects, features and advantages of the present invention more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
The above and other features and advantages of the present invention will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings.
FIG. 1 is a flow chart of a method of wound assessment monitoring;
FIG. 2 is a flow chart of a method of determining that a wound exists that matches a wound type;
FIG. 3 is a flow chart of a method of determining discrimination accuracy;
FIG. 4 is a flow chart of a method of identifying a determination of accurate image features;
Fig. 5 is a flow chart of a method of determining an assessment monitoring method of a wound.
Detailed Description
In order to make the technical solutions in the present specification better understood by those skilled in the art, the technical solutions in the embodiments of the present specification will be clearly and completely described below with reference to the drawings in the embodiments of the present specification, and it is obvious that the described embodiments are only some embodiments of the present specification, not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, shall fall within the scope of the present disclosure.
Although constructing a three-dimensional model of the wound allows accurate evaluation of its recovery, the complexity of such assessment and monitoring is inevitably high.
The matching wound type is determined using the degree of similarity between the wound image features and preset image features of different wound types: specifically, a matching coefficient is determined as the proportion of wound image features whose degree of similarity is greater than 0.7, and the wound type with the largest matching coefficient is taken as the matching wound type.
The similarity of wound image features between different development stages of the wound is determined from the similarity of the wound image features of different history-matched patients in those stages; the history-matched patients whose wound image features are more similar between stages than a preset degree of similarity are taken as similarity-matched patients; image feature similarity coefficients between stages are determined from the number of similarity-matched patients between those stages; and the distinguishing accuracy between stages is determined from the image feature similarity coefficients.
When the distinguishing accuracy between different development stages is greater than 0.7, it is determined to meet the requirement.
Wound image features whose deviation values between different development stages are greater than 0.7 are taken as the identification-accurate image features.
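The two fixed 0.7 cut-offs can be illustrated as follows; only the 0.7 values come from the text, while the data shapes and feature names are assumptions.

```python
# Illustration of the two fixed 0.7 thresholds above: the matching coefficient
# is the proportion of features whose similarity degree exceeds 0.7, and
# features whose between-stage deviation value exceeds 0.7 are kept as
# identification-accurate features. Data shapes are assumptions.

def matching_coefficient(similarity_degrees):
    """Proportion of wound image features with similarity degree > 0.7."""
    hits = sum(1 for s in similarity_degrees if s > 0.7)
    return hits / len(similarity_degrees)

def identification_accurate_features(deviation_by_feature):
    """Image features whose between-stage deviation value exceeds 0.7."""
    return [f for f, d in deviation_by_feature.items() if d > 0.7]

print(matching_coefficient([0.9, 0.8, 0.6, 0.5]))                        # 0.5
print(identification_accurate_features({"color": 0.9, "texture": 0.4}))
```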
The wound is then assessed and monitored either using point cloud data or using image recognition.
According to the similarity conditions between the identification-accurate image features of different development stages and other wound types, similarity coefficients are determined; from these, the identification deviation probability of each development stage is determined, and the comprehensive identification deviation probability of the wound is determined as the average of the stage deviation probabilities. The comprehensive distinguishing accuracy of the wound is determined from the distinguishing accuracy between different development stages, the identification reliability coefficient of the wound is determined as the ratio of the comprehensive distinguishing accuracy to the comprehensive identification deviation probability, and the wound assessment and monitoring method is determined according to the identification reliability coefficient.
When the identification reliability coefficient is not smaller than the preset reliability coefficient threshold, the wound is assessed and monitored using image recognition.
Embodiment 1: to solve the above problems, according to one aspect of the invention, as shown in fig. 1, a wound assessment and monitoring method is provided, specifically comprising:
acquiring wound image features of different parts from wound images of a wound, and, when analysis of the wound image features shows that a matching wound type exists, taking the patients corresponding to the matching wound type as history-matched patients;
when the similarity conditions show that the distinguishing accuracy between different development stages meets the requirement, determining the identification-accurate image features of the different development stages based on the wound image features of the history-matched patients in those stages of the wound;
acquiring the similarity conditions between the identification-accurate image features of different development stages and other wound types, and determining the wound assessment and monitoring method in combination with the distinguishing accuracy between the different development stages.
Further, the wound image features include color features, texture features, shape features, and spatial relationship features.
It should be noted that, as shown in fig. 2, determining that a matching wound type exists specifically includes:
determining the degree of similarity between the wound image features of different parts and different wound types according to the analysis result of the wound image features;
determining similar image features among the wound image features of different parts according to the degree of similarity with different wound types;
determining whether the wound has a matching wound type from the number of image features similar to each wound type.
Further, the similar image features are determined from the degree of similarity to the matching image features of the corresponding wound type; specifically, image features whose degree of similarity is greater than a preset similarity coefficient are the similar image features.
Optionally, determining whether a matching wound type exists from the number of similar image features specifically includes:
when there is at least one wound type whose number of similar image features is greater than the preset feature number, the wound type with the largest number of similar image features is the matching wound type.
In a further embodiment, determining that a matching wound type exists specifically includes:
determining the degree of similarity between the wound image features of different parts and different wound types according to the analysis result of the wound image features;
determining a wound similarity coefficient for each wound type from the degree of similarity between the wound image features of different parts and that wound type;
determining whether the wound has a matching wound type from the wound similarity coefficients.
Optionally, determining that a matching wound type exists specifically includes:
determining the degree of similarity between the wound image features of different parts and different wound types according to the analysis result of the wound image features, and determining that no matching wound type exists when no wound type has a degree of similarity that meets the requirement;
when there are wound types whose degree of similarity meets the requirement:
taking those wound types as candidate wound types, taking the wound image features whose degree of similarity meets the requirement as similar image features, and determining that no matching wound type exists when the number of similar image features of every candidate wound type is smaller than the preset feature number;
when there are candidate wound types whose number of similar image features is not smaller than the preset feature number:
taking such candidate wound types as screened wound types; when only one screened wound type exists, determining that a matching wound type exists;
when there are multiple screened wound types:
determining, from the numbers of similar image features of the different screened wound types, that a matching wound type exists when the deviation between each other screened type's number of similar image features and that of the screened type with the largest number is greater than a preset deviation number threshold;
when that deviation is not greater than the preset deviation number threshold:
determining a wound similarity coefficient for each wound type from the degree of similarity between the wound image features of different parts and that wound type, and determining whether a matching wound type exists from the wound similarity coefficients.
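The layered fallback above can be sketched as a single decision function. All numeric thresholds are illustrative assumptions, and using the mean similarity as the wound similarity coefficient is an assumed concrete form of a coefficient the patent leaves unspecified.

```python
# Single-function sketch of the layered fallback: screen candidate wound types
# by similar-feature count, accept a clear leader, and otherwise fall back to
# a wound similarity coefficient. All thresholds are assumptions.

SIM_OK = 0.7       # similarity degree that "meets the requirement" (assumed)
MIN_FEATURES = 3   # preset feature number (assumed)
DEV_THRESHOLD = 2  # preset deviation-number threshold (assumed)

def match_with_fallback(similarities):
    """similarities: {wound_type: [similarity degree per image feature]}."""
    counts = {t: sum(1 for s in d if s > SIM_OK) for t, d in similarities.items()}
    screened = {t: n for t, n in counts.items() if n >= MIN_FEATURES}
    if not screened:
        return None                  # no matching wound type exists
    if len(screened) == 1:
        return next(iter(screened))  # a single screened type matches directly
    best = max(screened.values())
    # types whose count deviates from the leader by no more than the threshold
    leaders = [t for t, n in screened.items() if best - n <= DEV_THRESHOLD]
    if len(leaders) == 1:
        return leaders[0]            # the leader stands clearly apart
    # ambiguous: fall back to a wound similarity coefficient (mean, assumed)
    return max(leaders, key=lambda t: sum(similarities[t]) / len(similarities[t]))

sims = {
    "burn":     [0.9, 0.9, 0.9, 0.9, 0.9],
    "abrasion": [0.8, 0.8, 0.8, 0.1, 0.1],
    "cut":      [0.2, 0.2, 0.2, 0.2, 0.2],
}
print(match_with_fallback(sims))
```

Here "burn" and "abrasion" both survive screening and sit within the deviation threshold of each other, so the mean-similarity fallback decides between them.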
Specifically, the development stages comprise a coagulation stage, an inflammation stage, a repair stage and a maturation stage.
Specifically, as shown in fig. 3, the method for determining the distinguishing accuracy includes:
determining, from the similarity conditions of the wound image features of different history-matched patients between different development stages of the wound, the history-matched patients whose wound image features are more similar between stages than a preset degree of similarity, and taking them as similarity-matched patients;
determining image feature similarity coefficients between different development stages from the number of similarity-matched patients between those stages;
determining the distinguishing accuracy between the development stages from the image feature similarity coefficients.
Further, the distinguishing accuracy ranges between 0 and 1; when it is smaller than the preset accuracy, the distinguishing accuracy between the development stages is determined not to meet the requirement.
Optionally, when the distinguishing accuracy between different development stages does not meet the requirement, the wound is assessed and monitored by acquiring point cloud data.
Optionally, the method for determining the distinguishing accuracy includes:
determining, from the similarity conditions of the wound image features of different history-matched patients between similar development stages of the wound, the history-matched patients to be taken as similarity-matched patients;
determining the distinguishing accuracy from the number proportion of similarity-matched patients.
Further, similar development stages are development stages whose mutual similarity conditions of wound image features do not meet the requirement.
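The proportion-based variant can be sketched as follows. Taking the accuracy as one minus the proportion of look-alike patients is an assumed concrete form of "determined from the number proportion"; the 0.7 default echoes the similarity thresholds used elsewhere in the text.

```python
# Sketch of the proportion-based rule: the larger the share of patients whose
# wound image features look alike across two development stages, the harder
# those stages are to tell apart. The 1 - proportion form is an assumption.

def distinguishing_accuracy(cross_stage_similarities, threshold=0.7):
    """cross_stage_similarities: per-patient similarity (0..1) of wound
    image features between two development stages."""
    matched = sum(1 for s in cross_stage_similarities if s > threshold)
    proportion = matched / len(cross_stage_similarities)
    return 1.0 - proportion  # more look-alike patients, lower accuracy

print(distinguishing_accuracy([0.9, 0.8, 0.3, 0.2, 0.1]))  # 0.6
```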
In another embodiment, the method for determining the distinguishing accuracy rate is as follows:
S11, matching the similar conditions of the wound image characteristics of the patient in different development stages of the wound according to different histories, determining the histories of which the similar conditions of the wound image characteristics between the development stages are larger than a preset similar degree, and taking the histories as similar matched patients;
Optionally, the step S11 includes the following:
S111, according to the similar conditions of the wound image characteristics of different history matching patients in different development stages of the wound, determining that no history matching patients with the similar conditions of the wound image characteristics greater than a preset similar degree exist between the development stages, determining that the distinguishing accuracy meets the requirement, determining the distinguishing accuracy by using the average value of the similar conditions of the wound image characteristics of different history matching patients, and when the development stages of the history matching patients with the similar conditions of the wound image characteristics greater than the preset similar degree exist, taking the history matching patients as the similar matching patients, and turning to step S112;
S112, when the number of the similar matched patients among the development stages is determined to be larger than the number of the preset matched patients, determining that the accuracy of distinguishing among the development stages does not meet the requirement, and when the number of the similar matched patients is not larger than the number of the preset matched patients, turning to step S113;
S113, determining similar wound image characteristics of different similar matched patients according to the similar conditions of the wound image characteristics of the different similar matched patients between the development stages, determining that the distinguishing accuracy between the development stages does not meet the requirement when the number of similar matched patients with the similar wound image characteristics not meeting the requirement is larger than the preset number of similar patients, and switching to the step S12 when the number of similar matched patients with the similar wound image characteristics not meeting the requirement is not larger than the preset number of similar patients.
S12, determining the patient feature similarity coefficients of the different similar matched patients between the development stages according to the similarity of their wound image features between the development stages;
optionally, step S12 includes the following:
S121, determining the patient feature similarity coefficients of the different similar matched patients between the development stages according to the similarity of their wound image features between the development stages; when there are similar matched patients whose patient feature similarity coefficient is greater than a preset similarity coefficient threshold, proceeding to step S122, and when no such similar matched patients exist, proceeding to step S123;
S122, taking the similar matched patients whose patient feature similarity coefficient is greater than the preset similarity coefficient threshold as screened similar patients; when the number of screened similar patients is greater than a preset number threshold, determining that the distinguishing accuracy between the development stages does not meet the requirement, and when it is not greater than the preset number threshold, proceeding to step S123;
S123, when the number of similar matched patients falls within the preset similar-patient number interval, proceeding to step S124, and when it does not, proceeding to step S125;
S124, when the average of the patient feature similarity coefficients of the different similar matched patients does not meet the requirement, determining that the distinguishing accuracy between the development stages does not meet the requirement, and when it meets the requirement, proceeding to step S125;
S125, determining a screening similarity coefficient from the number of screened similar patients and their patient feature similarity coefficients; when the screening similarity coefficient does not meet the requirement, determining that the distinguishing accuracy between the development stages does not meet the requirement, and when it meets the requirement, proceeding to step S13.
S13, determining the distinguishing accuracy using the patient feature similarity coefficients of the different similar matched patients between the development stages.
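Steps S121 to S125 amount to a cascade of threshold checks over the patient feature similarity coefficients. A minimal sketch, assuming the coefficients are given as a list of floats and that the screening similarity coefficient combines the screened patients' mean coefficient with their share of all matched patients (the patent text does not fix these formulas, so they are assumptions):

```python
# Hypothetical sketch of steps S121-S125; thresholds and the screening
# similarity formula are illustrative assumptions.
def screening_check(similarity_coeffs, coeff_threshold=0.8,
                    count_threshold=5, count_interval=(3, 10),
                    mean_threshold=0.6, screening_threshold=0.7):
    """Return True when the distinguishing accuracy between development
    stages does not meet the requirement."""
    # S121/S122: patients whose feature similarity coefficient exceeds the threshold
    screened = [c for c in similarity_coeffs if c > coeff_threshold]
    if len(screened) > count_threshold:
        return True  # too many highly similar patients -> accuracy insufficient
    # S123/S124: when the matched-patient count falls in the preset interval,
    # also require the mean similarity coefficient to stay low
    lo, hi = count_interval
    if lo <= len(similarity_coeffs) <= hi:
        mean_coeff = sum(similarity_coeffs) / len(similarity_coeffs)
        if mean_coeff > mean_threshold:
            return True
    # S125: combine the screened count and their coefficients into one score
    if screened:
        screening_coeff = (sum(screened) / len(screened)) * (len(screened) / len(similarity_coeffs))
        if screening_coeff > screening_threshold:
            return True
    return False
```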
Specifically, as shown in fig. 4, the method for determining the accurately identified image features includes:
determining, according to the similarity between the wound image features of a history-matched patient at a specific development stage of the wound and those at the other development stages, the feature similarity coefficients between that specific stage and the other development stages;
determining the resolution accuracy of the history-matched patient for the wound image feature based on the feature similarity coefficients with the other development stages;
determining the average accuracy of the wound image feature as the average of the resolution accuracies of the different history-matched patients for that feature, and determining, based on this average accuracy, whether the wound image feature is an accurately identified image feature.
Further, the resolution accuracy for the wound image feature is determined based on the maximum of the feature similarity coefficients with the other development stages.
When the average accuracy of the wound image feature is greater than a preset accuracy threshold, the wound image feature is determined to be an accurately identified image feature.
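The fig. 4 flow can be sketched as follows. The inverse relation between the maximum cross-stage similarity and the resolution accuracy is an assumption; the text only states that the accuracy is determined based on that maximum value:

```python
# Hypothetical sketch of the fig. 4 accuracy check; the 1 - max(...) mapping
# and the threshold value are illustrative assumptions.
def is_accurate_feature(per_patient_stage_coeffs, accuracy_threshold=0.5):
    """per_patient_stage_coeffs: for each history-matched patient, the feature
    similarity coefficients between the specific development stage and every
    other development stage."""
    accuracies = []
    for coeffs in per_patient_stage_coeffs:
        # a feature that stays dissimilar to all other stages resolves well,
        # so accuracy falls as the worst-case (maximum) similarity rises
        accuracies.append(1.0 - max(coeffs))
    mean_accuracy = sum(accuracies) / len(accuracies)
    return mean_accuracy > accuracy_threshold
```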
Specifically, as shown in fig. 5, the method of determining the wound assessment monitoring method is as follows:
determining, according to the similarity between the accurately identified image features at different development stages and those of other wound types, the similarity coefficients between the accurately identified image features at the different development stages and the other wound types; determining the identification deviation probability of each development stage using these similarity coefficients; and determining the comprehensive identification deviation probability of the wound as the average of the identification deviation probabilities of the different development stages;
determining the comprehensive distinguishing accuracy of the wound from the distinguishing accuracies between the different development stages;
and determining the identification reliability coefficient of the wound as the ratio of the comprehensive distinguishing accuracy of the wound to the comprehensive identification deviation probability, and determining the wound assessment monitoring method according to the identification reliability coefficient.
Further, determining the wound assessment monitoring method according to the identification reliability coefficient specifically comprises:
when the identification reliability coefficient is smaller than a preset reliability coefficient threshold, assessing and monitoring the wound using point cloud data;
and when the identification reliability coefficient is not smaller than the preset reliability coefficient threshold, assessing and monitoring the wound using image recognition.
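The fig. 5 decision can be sketched in a few lines. Only the ratio and the averaging are stated in the text; the threshold value and return labels below are illustrative assumptions:

```python
# Hypothetical sketch of the fig. 5 mode selection; threshold and labels
# are illustrative assumptions.
def choose_monitoring_mode(stage_deviation_probs, comprehensive_accuracy,
                           reliability_threshold=2.0):
    # comprehensive deviation probability = mean of per-stage deviation probabilities
    comp_deviation = sum(stage_deviation_probs) / len(stage_deviation_probs)
    # identification reliability coefficient = comprehensive distinguishing
    # accuracy / comprehensive identification deviation probability
    reliability = comprehensive_accuracy / comp_deviation
    if reliability < reliability_threshold:
        return "point_cloud"  # low reliability -> fall back to 3D point-cloud data
    return "image_recognition"
```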
Embodiment 2: in a second aspect, the present invention provides a computer device comprising a memory, a processor in communication with the memory, and a computer program stored on the memory and runnable on the processor; when running the computer program, the processor executes the wound assessment monitoring method described above.
In another embodiment, the method for determining the accurately identified image features is as follows:
determining, according to the similarity between the wound image features of a history-matched patient at a specific development stage of the wound and those at the other development stages, the feature similarity coefficients between that stage and the other development stages, and determining that the wound image feature is an accurately identified image feature when the feature similarity coefficients of all history-matched patients with the other development stages are smaller than a preset similarity threshold;
when there are history-matched patients whose feature similarity coefficients with other development stages are not smaller than the preset similarity threshold:
acquiring, for each other development stage, the number of history-matched patients whose feature similarity coefficient is not smaller than the preset similarity threshold, and determining that the wound image feature is not an accurately identified image feature when there is an other development stage for which this number does not meet the requirement;
when there is no other development stage for which the number of history-matched patients whose feature similarity coefficients are not smaller than the preset similarity threshold fails the requirement:
determining the resolution accuracy of each history-matched patient for the wound image feature based on the feature similarity coefficients with the other development stages, and determining that the wound image feature is an accurately identified image feature when the resolution accuracy of every history-matched patient is greater than a preset accuracy threshold;
when there are history-matched patients whose resolution accuracy for the wound image feature is not greater than the preset accuracy threshold:
taking the history-matched patients whose resolution accuracy is not greater than the preset accuracy threshold as resolution-deviation patients, and determining that the wound image feature is an accurately identified image feature when the number of resolution-deviation patients does not meet the requirement;
when the number of resolution-deviation patients meets the requirement:
determining the average accuracy of the wound image feature as the average of the resolution accuracies of the different history-matched patients, and determining, based on this average accuracy, whether the wound image feature is an accurately identified image feature.
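The cascaded early-exit checks of this embodiment can be sketched as below. The per-patient input shape, the 1 - max(...) accuracy mapping, and all limits are illustrative assumptions, not values fixed by the text:

```python
# Hypothetical sketch of the alternative embodiment's cascade; all
# thresholds and the accuracy formula are illustrative assumptions.
def classify_feature(patients, sim_threshold=0.5, count_limit=3,
                     acc_threshold=0.5, deviation_limit=2):
    """patients: list of dicts mapping other-stage id -> similarity coefficient.
    Return True when the feature counts as an accurately identified image feature."""
    # Fast path: every coefficient below the similarity threshold -> accurate
    if all(c < sim_threshold for p in patients for c in p.values()):
        return True
    # Per-stage count of patients whose coefficient is too high
    stages = {s for p in patients for s in p}
    for s in stages:
        n = sum(1 for p in patients if p.get(s, 0.0) >= sim_threshold)
        if n > count_limit:
            return False  # one other stage confuses too many patients
    # Resolution accuracy per patient, driven by the worst (largest) coefficient
    accuracies = [1.0 - max(p.values()) for p in patients]
    if all(a > acc_threshold for a in accuracies):
        return True
    # Tolerate a small number of resolution-deviation patients
    deviations = sum(1 for a in accuracies if a <= acc_threshold)
    if deviations <= deviation_limit:
        return True
    # Final fallback: decide on the mean accuracy
    return sum(accuracies) / len(accuracies) > acc_threshold
```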
In this specification, each embodiment is described in a progressive manner; identical and similar parts of the embodiments may be referred to one another, and each embodiment focuses on its differences from the others. In particular, the apparatus, device, and non-volatile computer storage medium embodiments are described relatively simply, since they are substantially similar to the method embodiments; for relevant details, refer to the description of the method embodiments.
The foregoing describes specific embodiments of the present disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
The foregoing is merely one or more embodiments of the present description and is not intended to limit the present description. Various modifications and alterations to one or more embodiments of this description will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement, or the like, which is within the spirit and principles of one or more embodiments of the present description, is intended to be included within the scope of the claims of the present description.
Claims (10)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202510002108.2A CN119399202B (en) | 2025-01-02 | 2025-01-02 | Wound assessment monitoring method and device |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN119399202A true CN119399202A (en) | 2025-02-07 |
| CN119399202B CN119399202B (en) | 2025-05-06 |
Family
ID=94426729
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202510002108.2A Active CN119399202B (en) | 2025-01-02 | 2025-01-02 | Wound assessment monitoring method and device |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN119399202B (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN120510159A (en) * | 2025-07-22 | 2025-08-19 | 中国人民解放军空军军医大学 | Intelligent analysis method for small program end wound image |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20220108447A1 (en) * | 2020-10-05 | 2022-04-07 | Hill-Rom Services, Inc. | Wound healing analysis and tracking |
| KR20220109062A (en) * | 2021-01-28 | 2022-08-04 | 한림대학교 산학협력단 | Artificial intelligence based electronic apparatus for providing diagnosis information on burns, control method, and computer program |
| CN115426939A (en) * | 2020-02-28 | 2022-12-02 | 光谱Md公司 | Machine learning systems and methods for wound assessment, healing prediction and treatment |
| US20230351596A1 (en) * | 2022-04-28 | 2023-11-02 | Chung Shan Medical University | Establishing method of wound grade assessment model, wound care assessment system and wound grade assessment method |
| JP2024051715A (en) * | 2022-09-30 | 2024-04-11 | キヤノン株式会社 | Image processing device, image processing method, and program |
| CN118787318A (en) * | 2024-09-13 | 2024-10-18 | 江南大学附属医院 | A method, system and device for dynamic monitoring of wound surface for burn treatment |
| CN119007977A (en) * | 2024-07-23 | 2024-11-22 | 东莞城市学院 | Auxiliary diagnosis method and system for traditional Chinese medicine based on artificial intelligent image recognition |
| US20240415447A1 (en) * | 2019-06-28 | 2024-12-19 | Resmed Inc. | Systems and methods for image monitoring |
Non-Patent Citations (1)
| Title |
|---|
| LIU Jun; LI Caizhen; YANG Dongyun; HU Xiaohai; LIU Suyun; ZHONG Lixia: "Application of the Modified Early Warning Score in triage of trauma patients and its value in predicting prognosis", Nursing Practice and Research (护理实践与研究), no. 15, 5 August 2017 (2017-08-05), pages 152-153 * |
Similar Documents
| Publication | Title |
|---|---|
| CN118213058B (en) | Medical instrument fault detection management system and method based on data analysis |
| CN118759298B (en) | Multi-parameter monitoring analysis method and device for transformer and transformer |
| CN113742387A (en) | Data processing method, device and computer readable storage medium |
| CN119399202B (en) | Wound assessment monitoring method and device |
| CN114692758A (en) | Power communication fault analysis method and device, terminal equipment and medium |
| CN117972358A (en) | AI-based power distribution cabinet state identification method, device, equipment and storage medium |
| CN113569656A (en) | Examination room monitoring method based on deep learning |
| CN114331114A (en) | Intelligent supervision method and system for pipeline safety risks |
| CN118861828A (en) | An operation and maintenance management system and method based on industrial Internet of Things |
| CN115690590B (en) | Crop growth abnormity monitoring method, device, equipment and storage medium |
| CN112534447B (en) | Method and apparatus for training machine learning routines for controlling engineering systems |
| CN111666981A (en) | System data anomaly detection method based on genetic fuzzy clustering |
| CN118507070B (en) | Training method and system for predicting model of hyperamylasemia after ERCP |
| CN116956089A (en) | Training method and detection method for temperature anomaly detection model of electrical equipment |
| CN119601209A (en) | A blood transfusion prediction method based on time series of patients' postoperative blood status |
| CN118819104A (en) | Power plant equipment fault diagnosis method, device and equipment |
| CN119153116A (en) | Schistosomiasis liver fibrosis patient early diagnosis data management system |
| CN112434807A (en) | Deep learning model performance verification method and equipment |
| CN118938733A (en) | A remote control inspection method based on AI operation and a power plant supervision system |
| CN116211279A (en) | An information collection and analysis system for extracorporeal membrane lung care |
| CN120030500B (en) | Equipment fault analysis and prediction system and method based on frequency domain characteristics |
| KR102831804B1 (en) | Apparatus and method for detecting machine equipment abnormalities using data clustering and human-in-the-loop process |
| CN119517405B (en) | Heart attack early warning system based on image recognition |
| Vychuzhanin et al. | Validation of a Neural Network Architecture for Approximating an Analytical Model of Eye Condition |
| CN109491844A (en) | A kind of computer system identifying exception information |
Legal Events
| Code | Title |
|---|---|
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
| GR01 | Patent grant |