
WO2025115714A1 - Biological information inference device, biological information inference method, and program - Google Patents

Biological information inference device, biological information inference method, and program

Info

Publication number
WO2025115714A1
WO2025115714A1 (PCT/JP2024/041090)
Authority
WO
WIPO (PCT)
Prior art keywords
information
color
unit
subject
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2024/041090
Other languages
English (en)
Japanese (ja)
Inventor
裕樹 三好
崇人 千葉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toppan Holdings Inc
Original Assignee
Toppan Holdings Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toppan Holdings Inc filed Critical Toppan Holdings Inc
Publication of WO2025115714A1
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/75 Systems in which material is subjected to a chemical reaction, the progress or the result of the reaction being investigated
    • G01N 21/77 Systems in which material is subjected to a chemical reaction, the progress or the result of the reaction being investigated by observing the effect on a chemical indicator
    • G01N 21/78 Systems in which material is subjected to a chemical reaction, the progress or the result of the reaction being investigated by observing the effect on a chemical indicator producing a change of colour
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 33/00 Investigating or analysing materials by specific methods not covered by groups G01N 1/00 - G01N 31/00
    • G01N 33/48 Biological material, e.g. blood, urine; Haemocytometers
    • G01N 33/483 Physical analysis of biological material
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 ICT specially adapted for the handling or processing of medical images
    • G16H 30/40 ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/30 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment

Definitions

  • the present invention relates to a biometric information estimation device, a biometric information estimation method, and a program.
  • Patent Document 1 describes a health condition estimation system that estimates a subject's health condition by analyzing an image of a nail.
  • The health condition estimation system acquires an image including the subject's nail, extracts features from the image, and estimates the subject's health condition by machine learning based on the extracted features.
  • Patent Document 2 describes a urine testing instrument that captures an image of a urine testing reagent by a digital camera.
  • the urine testing instrument is provided with a through-hole portion for inserting the lens barrel portion of the digital camera into the opening of a container with the urine testing reagent provided on the inner bottom surface.
  • the digital camera is fixed to the urine testing instrument to capture an image of the judgment criteria display and the urine testing reagent after the color reaction, and after the image is captured, the captured image is displayed on a display monitor, allowing the color of the judgment criteria display and the color of the urine testing reagent to be compared and examined.
  • Patent Document 3 describes a method in which an imaging device captures both multiple types of color change information from a color change part that changes to multiple color tones upon contact with urine, and multiple types of color change reference information indicating a correspondence with measured values of biological information; the captured color change information is then compared with the color change reference information to identify the reference information that corresponds to it.
  • Patent Document 1 uses images of the human body to estimate the health condition, but the camera device that captures the image and the surrounding environment in which it is captured vary from subject to subject. If the camera device or surrounding environment differs, the color information of the image also differs, making it difficult to estimate the health condition with high accuracy.
  • In the inspection described in Patent Document 2, the inspection is performed using images captured by an imaging device. However, depending on the model and performance of the digital camera, it may not be possible to obtain an image with accurate color, and the inspection may not be performed accurately.
  • In Patent Document 3, it is necessary to capture the color tone change information and the color tone change reference information simultaneously, and to prepare color tone change reference information for each capture.
  • Furthermore, the color tone change information and color tone change reference information may change depending on the surrounding environment, and, as in Patent Document 1, it may not be possible to obtain an image with accurate color depending on the model and performance of the digital camera.
  • This disclosure has been made in consideration of these circumstances, and aims to provide a biometric information estimation device, a biometric information estimation method, and a program that can obtain color information from a captured image and estimate biometric information with high accuracy.
  • One aspect of the present disclosure is a bioinformation estimation device comprising: an image acquisition unit that acquires an image of a subject; a preprocessing unit that performs predetermined preprocessing on the captured image; an extraction unit that extracts color information of the subject from the captured image preprocessed by the preprocessing unit; an estimation unit that estimates a bioinformation value, which is a value indicating bioinformation, based on the color information and subject information indicating information about the subject; and an output unit that outputs the bioinformation value.
  • Another aspect of the present disclosure is a method for estimating bioinformation, including the steps of: acquiring an image of a subject by a computer; performing a predetermined preprocessing on the image; extracting color information of the subject from the preprocessed image; estimating a bioinformation value, which is a value indicating bioinformation, based on the color information and subject information indicating information of the subject; and outputting the bioinformation value.
  • Another aspect of the present disclosure is a bioinformation estimation program that causes a computer to execute the steps of acquiring an image of a subject, performing a predetermined preprocessing on the captured image, extracting color information of the subject from the preprocessed captured image, estimating a bioinformation value that is a value indicating bioinformation based on the color information and subject information indicating information about the subject, and outputting the bioinformation value.
  • According to these aspects, color information can be obtained from a captured image, and biometric information can be estimated with high accuracy.
  • FIG. 1 is a block diagram showing a configuration example of a biological information estimation system according to the first embodiment.
  • FIG. 2 is a block diagram showing a configuration example of a biological information estimation system according to the second embodiment.
  • FIG. 3 is a sequence diagram illustrating an example of a processing procedure of the biological information estimation system according to the second embodiment.
  • FIG. 4 illustrates an example of a color information acquisition unit according to the second embodiment.
  • FIG. 5 is a block diagram showing an example of a biological information estimation system according to the second embodiment.
  • FIG. 6 is a block diagram showing an example of a biological information estimation system according to the second embodiment.
  • FIG. 7 is a block diagram showing a configuration example of an inspection system according to the third embodiment.
  • FIG. 8 is a block diagram showing a configuration example of an inspection system according to the third embodiment.
  • FIG. 9 is a sequence diagram illustrating an example of a processing procedure of the inspection system according to the third embodiment.
  • FIG. 10 is a diagram showing an example of a display screen of a terminal device according to the third embodiment.
  • FIG. 11 illustrates an example of an image color correction unit according to the third embodiment.
  • FIG. 12 is a block diagram illustrating an example of an inspection system according to the third embodiment.
  • Hereinafter, a biometric information estimation device, a biometric information estimation method, and a program to which the present invention is applied will be described with reference to the drawings.
  • (First embodiment) FIG. 1 is a block diagram showing an example of the configuration of a biological information estimation system 1 according to the first embodiment.
  • the biological information estimation system 1 is a system in which a captured image is transmitted to a biological information estimation device 100 disposed remotely from a subject, and the biological information estimation device 100 acquires color information from the captured image to estimate biological information with high accuracy.
  • the specimen is a part of the subject's body, a reagent reacted with a user's bodily fluid, etc.
  • the subject is a user of the terminal device 200, a patient, etc.
  • Biometric information is various numerical values related to the subject's body. Biometric information is values directly measured from the living body, such as body temperature, blood pressure, oxygen saturation, pulse rate, etc., or test values obtained by reacting bodily fluids with a reagent, etc.
  • the bioinformation estimation system 1 includes, for example, a bioinformation estimation device 100 and a terminal device 200 equipped with a camera device 210.
  • the bioinformation estimation device 100 and the terminal device 200 are connected, for example, via a communication network.
  • the bioinformation estimation device 100 and the terminal device 200 have a communication interface, such as a NIC (Network Interface Card) or a wireless communication module, for connecting to a network such as the Internet.
  • the communication network may include, for example, a general-purpose network such as the Internet, and a private network such as local 5G or WiFi (registered trademark).
  • the terminal device 200 is an information processing device used by a user, such as a smartphone or a personal computer.
  • the terminal device 200 includes a camera device 210, an input unit 220, and a display unit 230.
  • the camera device 210 operates based on the operation received by the input unit 220, and generates a captured image of the subject.
  • the input unit 220 is a device that accepts user operations, such as a touch sensor built into the display unit 230.
  • the display unit 230 is a device that displays various types of information, such as an organic EL display.
  • the terminal device 200 accepts user operations, transmits the captured image generated by the camera device 210 to the biometric information estimation device 100, and displays the biometric information (test results) transmitted from the biometric information estimation device 100.
  • the terminal device 200 and the biological information estimation device 100 are separate entities, but this is not limited thereto, and the functions of the biological information estimation device 100 may be built into the terminal device 200 .
  • the camera device 210 may be a camera built into a smartphone, a single-lens reflex camera, a web camera, etc.
  • the terminal device 200 may transmit other information such as metadata indicating camera-related information such as the model of the camera of the terminal device 200 along with the captured image.
  • the captured image may be a still image or a moving image, or may be a plurality of images generated by interval shooting.
  • the captured image may be an image of a part of the patient.
  • the part of the patient may be, for example, a nail, a back of a hand, a face, lips, a conjunctiva, an eye, an eyelid, a tongue, etc.
  • the captured image may be an image of a plurality of parts of the patient.
  • the captured image may be a still image of the reagent or a moving image of the reagent.
  • the captured image may be, for example, a plurality of images generated by taking interval photographs every elapsed second in order to observe the reaction of the reagent.
  • the captured image may be a plurality of images captured by changing the subject distance, the presence or absence of a flash, the angle, etc.
  • the captured image may be an image of a patient and a reference object such as a color chart for color correction, a white board, etc.
  • the captured image may include both an image including a patient and an image of a reference object.
  • the captured image may include depth information measured using a depth sensor or the like.
  • the bioinformation estimation device 100 is an information processing device that communicates with other devices and performs various processes.
  • the bioinformation estimation device 100 includes, for example, an image acquisition unit 110, a preprocessing unit 120, an extraction unit 130, an estimation unit 140, an output unit 150, and a memory unit 160.
  • the image acquisition unit 110, the preprocessing unit 120, the extraction unit 130, the estimation unit 140, and the output unit 150 are functional units that are realized by a processor such as a CPU (Central Processing Unit) executing a bioinformation estimation program stored in a program memory.
  • the image acquisition unit 110 acquires an image of the subject.
  • the preprocessing unit 120 performs a predetermined preprocessing on the captured image.
  • the preprocessing unit 120 may perform the preprocessing by referring to the preprocessing setting information stored in the storage unit 160.
  • the preprocessing setting information is, for example, information that specifies the processing procedure of the preprocessing, processing rules such as calculation formulas, programs, and machine learning models.
  • the pre-processing unit 120 may perform color correction processing to correct the color of the object contained in the captured image so that it approaches the color of the actual subject.
  • the pre-processing unit 120 corrects the image so that it has the "correct color" of the subject in the captured image.
  • the correct color refers to a color that is correct in terms of colorimetry (a defined color value measured by a colorimeter).
  • the pre-processing unit 120 may perform image processing useful for estimating a biometric information value, such as illuminance unevenness correction (shading correction), shadow correction, sharpness correction, contrast correction, etc.
  • the pre-processing unit 120 may perform processing such as enlarging, reducing, adjusting the shape, and extracting objects on the captured image, as long as the processing is useful for estimating a biometric information value.
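The color correction toward colorimetrically correct values described above can be sketched, for example, as a least-squares fit of a 3x3 correction matrix from color-chart patches. The disclosure does not specify this particular algorithm; the matrix form, the patch values, and the function names below are all illustrative assumptions.

```python
import numpy as np

def fit_color_matrix(camera_rgb, reference_xyz):
    """Least-squares fit of M so that camera_rgb @ M approximates reference_xyz."""
    M, *_ = np.linalg.lstsq(camera_rgb, reference_xyz, rcond=None)
    return M

def correct_image(image, M):
    """Apply the fitted matrix to every pixel of an (H, W, 3) image."""
    h, w, _ = image.shape
    return (image.reshape(-1, 3) @ M).reshape(h, w, 3)

# Illustrative chart: four patches with camera RGB and "known" colorimetric values.
camera_rgb = np.array([[0.2, 0.1, 0.1],
                       [0.1, 0.3, 0.1],
                       [0.1, 0.1, 0.4],
                       [0.5, 0.5, 0.5]])
reference_xyz = camera_rgb * 1.1  # stand-in for measured colorimetric values

M = fit_color_matrix(camera_rgb, reference_xyz)
img = np.full((2, 2, 3), 0.5)     # toy captured image
corrected = correct_image(img, M)
```

In practice the reference values would come from the colorimetric values of each patch stored as pre-processing setting information, and a higher-order polynomial or per-channel tone curve could replace the linear matrix.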
  • the extraction unit 130 extracts color information of the subject from the captured image that has been preprocessed by the preprocessing unit 120.
  • the extraction unit 130 may extract color information by referring to the extraction setting information stored in the storage unit 160.
  • the extraction setting information is information for the extraction unit 130 to perform the color information extraction process, and is, for example, information indicating the part to be extracted or information indicating a learned spectral reflectance estimation model. This allows the extraction unit 130 to extract color information corresponding to each piece of biological information, even when estimating multiple pieces of biological information, for example.
  • the color information is information such as colorimetric values that indicate the color of the patient to estimate biological information.
  • the color information may be RGB values that have been white-balanced, RGB values reproduced in a specific color space such as sRGB values, etc.
  • the color information may be spectral reflectance, XYZ values in the XYZ color system, L*a*b* values in the L*a*b* color system, erythema index, melanin index, absorbance, a spectrum of a specific wavelength, a representative value of a specific area of a subject, or time-series information that indicates changes over time in a video with a value for each pixel or an image captured at intervals, and may further include two or more of these pieces of information.
  • the color information may be a single value such as an average value in the area to be extracted, or may be image data (two-dimensional distribution of color information) of a small area including the area to be extracted.
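As one concrete reading of the extraction step, a corrected sRGB image could be reduced to a mean L*a*b* value over a region of interest. The conversion constants below are the standard sRGB/D65 ones; the ROI, the image, and the function names are illustrative assumptions, not taken from the disclosure.

```python
import numpy as np

def srgb_to_lab(rgb):
    """Convert sRGB values in [0, 1], shape (N, 3), to CIE L*a*b* under D65."""
    rgb = np.asarray(rgb, dtype=float)
    # Inverse sRGB gamma
    lin = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)
    # Linear RGB -> XYZ (sRGB primaries, D65 white point)
    M = np.array([[0.4124564, 0.3575761, 0.1804375],
                  [0.2126729, 0.7151522, 0.0721750],
                  [0.0193339, 0.1191920, 0.9503041]])
    xyz = lin @ M.T
    white = np.array([0.95047, 1.0, 1.08883])
    t = xyz / white
    f = np.where(t > (6 / 29) ** 3, np.cbrt(t), t / (3 * (6 / 29) ** 2) + 4 / 29)
    L = 116 * f[:, 1] - 16
    a = 500 * (f[:, 0] - f[:, 1])
    b = 200 * (f[:, 1] - f[:, 2])
    return np.stack([L, a, b], axis=1)

def roi_mean_lab(image, roi):
    """Average L*a*b* over a rectangular ROI given as (y0, y1, x0, x1)."""
    y0, y1, x0, x1 = roi
    patch = image[y0:y1, x0:x1].reshape(-1, 3)
    return srgb_to_lab(patch).mean(axis=0)

img = np.full((4, 4, 3), 1.0)        # pure white toy image
lab = roi_mean_lab(img, (0, 2, 0, 2))
```

The same pattern extends to the other quantities listed above (erythema/melanin indices, per-pixel maps, or time series over video frames) by swapping the per-pixel conversion.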
  • the estimation unit 140 estimates a bioinformation value, which is a value indicating the bioinformation, based on the color information and the subject information.
  • The subject information indicates information about the subject whose image has been captured; specifically, it is information indicating a part of a patient or information indicating a reagent.
  • the estimation unit 140 may estimate the bioinformation using a method such as multiple regression analysis, polynomial approximation, or machine learning.
  • the estimation unit 140 may estimate the biological information by referring to the estimation setting information stored in the storage unit 160.
  • the estimation setting information is, for example, information that identifies a processing rule such as a processing procedure or a calculation formula for estimation, a program, or a machine learning model.
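Since multiple regression analysis is one of the estimation methods the disclosure names, a minimal sketch of that case follows. The features, coefficients, and targets are synthetic stand-ins; the disclosure does not specify the feature set or model.

```python
import numpy as np

rng = np.random.default_rng(0)
features = rng.uniform(0, 1, size=(50, 3))   # color information per sample (e.g. L*, a*, b*)
true_coef = np.array([2.0, -1.0, 0.5])
values = features @ true_coef + 3.0          # noiseless toy bioinformation values

# Fit y = X w + b by least squares, with an intercept column appended.
X = np.hstack([features, np.ones((features.shape[0], 1))])
w, *_ = np.linalg.lstsq(X, values, rcond=None)

def estimate(color_info, w=w):
    """Estimate a bioinformation value from a color-information vector."""
    return float(np.dot(color_info, w[:3]) + w[3])

pred = estimate(np.array([0.5, 0.5, 0.5]))
```

In the device, the fitted weights would correspond to estimation setting information stored in the storage unit 160, and could be swapped per subject or per biometric quantity.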
  • the output unit 150 outputs the biometric information value.
  • the output unit 150 may transmit display data for displaying the biometric information value on the terminal device 200.
  • the output unit 150 may output the biometric information to a device that stores the biometric information, for example, or may output the biometric information to an inspection device that uses the biometric information to perform various inspections.
  • the output unit 150 may output multiple pieces of biometric information simultaneously.
  • the storage unit 160 stores pre-processing setting information, extraction setting information, and estimation setting information.
  • the pre-processing setting information is, for example, information for correcting color information without using a color chart.
  • the pre-processing setting information is, for example, information indicating the colorimetric values of each patch on a color chart when color correction is performed by the pre-processing unit 120 using a color chart.
  • the storage unit 160 may store, as pre-processing setting information, information on white unevenness such as a previously captured white paper image used for color unevenness (shading) correction, information on black unevenness such as a previously captured black paper image used for shadow correction, and parameters for optimizing sharpness and contrast enhancement processing for each target when there are multiple targets for biometric information estimation.
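One plausible reading of the stored white-unevenness and black-unevenness information is classic flat-field (shading) correction. The arrays below are illustrative; the disclosure does not commit to this exact formula.

```python
import numpy as np

def flat_field_correct(raw, white_ref, black_ref):
    """Shading correction: subtract the dark frame, then normalize by the flat."""
    flat = white_ref - black_ref
    gain = flat.mean() / np.where(flat == 0, 1, flat)  # guard against zeros
    return (raw - black_ref) * gain

h, w = 2, 2
black = np.zeros((h, w))                 # previously captured black-paper image
white = np.array([[0.8, 1.0],
                  [1.0, 1.2]])           # previously captured white-paper image
raw = 0.5 * white                        # a uniform gray seen through that shading
corrected = flat_field_correct(raw, white, black)
```

After correction the uniform gray is recovered across the frame, which is the behavior the illuminance-unevenness correction above aims for.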
  • The storage unit 160 stores estimation setting information.
  • The estimation setting information is conversion information for estimating biological information when the specimen is a reagent. This conversion information is a correspondence table between the color information of the captured image for each reagent and the actual color of the reagent.
  • The estimation setting information is also conversion information for estimating biological information when the specimen is a body part, a blood component, or the like. This conversion information is, for example, a correspondence table between the color information of the captured image for each nail or blood component and the actual color of that part or component.
  • the biometric information estimation system 1 may include a biometric information identification unit that identifies the biometric information estimated by the estimation unit 140.
  • the biometric information identification unit may be the input unit 220 that acquires information that identifies the biometric information in response to a user's operation.
  • the biometric information identification unit may be the image acquisition unit 110 that acquires information that identifies the biometric information input in response to a user's operation.
  • the pre-processing unit 120 performs color correction of the subject corresponding to the biometric information identified by the biometric information identification unit.
  • the subject corresponding to the biometric information is a subject suitable for the identified biometric information.
  • the subject suitable for the biometric information is, for example, a part of a patient's body for measuring blood pressure, or a urine test strip for measuring urine test values.
  • the bioinformation estimation system 1 may include a subject identification unit that identifies the subject.
  • the subject identification unit may be an input unit 220 that acquires information that identifies the subject in response to a user's operation.
  • the subject identification unit may be an image acquisition unit 110 that acquires information that identifies the subject input in response to a user's operation.
  • the preprocessing unit 120 performs color correction on the captured image in accordance with the subject identified by the subject identification unit.
  • the extraction unit 130 estimates a bioinformation value based on the subject information that corresponds to the subject identified by the subject identification unit.
  • the preprocessing unit 120 may switch conversion information indicating the correspondence between the color information of the captured image and the colorimetric values of the subject based on the type of subject identified by the subject identification unit.
  • the conversion information is stored in the storage unit 160 as preprocessing setting information.
  • the conversion information is information for converting the color information of the captured image, and includes information indicating the correspondence between the pixel values of the captured image and the colorimetric values of the actual subject.
  • the correspondence between the pixel values of the captured image and the colorimetric values of the actual subject depends on, for example, the color characteristics of the subject. Therefore, the conversion information is prepared for each subject according to the color characteristics of the subject.
  • the color characteristics of the subject are the tendency of the spectral reflectance.
  • the color characteristics of the subject may be predicted by a learned spectral reflectance estimation model obtained by learning the tendency of the spectral reflectance of the subject.
  • the preprocessing unit 120 may estimate the spectral reflectance by inputting the captured image into a machine learning model (trained spectral reflectance estimation model) trained on the color information of the captured image specialized for the subject and the spectral reflectance of the actual object, and may perform color correction of the captured image based on the color measurement value of the subject obtained from the estimated spectral reflectance.
  • the trained spectral reflectance estimation model is a machine learning model trained on the color information of the captured image specialized for the subject and the color (spectral reflectance, correct value) of the actual object, and the use of the trained spectral reflectance estimation model makes it possible to estimate the spectral reflectance with high accuracy.
  • the trained spectral reflectance estimation model may be a general-purpose trained spectral reflectance estimation model trained on a data set that collects general captured images and color information (correct value) of the object without specifying the object. From the spectral reflectance estimated in this way, the color (color measurement value) of the object can be obtained by providing the spectral distribution of the observation illumination and a color matching function.
  • the machine learning model for estimating the spectral reflectance may be a neural network, a method based on statistical analysis such as principal component analysis, or a machine learning algorithm such as multiple regression analysis.
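As a minimal stand-in for the trained spectral reflectance estimation model, a linear mapping from RGB to an n-band reflectance spectrum can be learned by least squares from training pairs (the disclosure also allows neural networks or PCA-based methods). All spectra, sensitivities, and band counts below are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_bands = 31                                   # e.g. 400-700 nm in 10 nm steps
train_refl = rng.uniform(0, 1, size=(100, n_bands))  # toy training reflectances

# Toy camera: three broadband sensitivities mapping spectra to RGB.
sens = rng.uniform(0, 1, size=(n_bands, 3))
train_rgb = train_refl @ sens

# Learn W so that rgb @ W approximates the reflectance spectrum.
W, *_ = np.linalg.lstsq(train_rgb, train_refl, rcond=None)

def estimate_reflectance(rgb):
    """Estimate an n-band reflectance spectrum from a single RGB triple."""
    return np.asarray(rgb) @ W

est = estimate_reflectance(train_rgb[0])
```

A subject-specialized model would simply restrict the training reflectances to the spectral tendencies of that subject (nails, test strips, and so on), which is what makes the per-subject conversion information worthwhile.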
  • the pre-processing unit 120 may estimate related information about imaging from the captured image and perform color correction based on the estimated related information. For example, the pre-processing unit 120 may estimate the illumination color (or light source color or spectral distribution) around the subject as related information and perform color correction based on the estimated illumination color.
  • The pre-processing unit 120 can estimate the illumination color (or light source color or spectral distribution) around the subject by estimating, from time information and latitude and longitude information added to the captured image, the time of imaging and whether the image was captured outdoors or indoors, and by referring to the parameters required for estimating the illumination color from the captured image.
  • the pre-processing unit 120 for example, refers to conversion information prepared according to the estimated illumination color and corrects the color information of the captured image to approach the actual color of the subject.
  • the pre-processing unit 120 may perform color correction based on camera model information included in the metadata of the captured image as related information.
  • the camera model information includes camera color characteristic information such as the camera's spectral sensitivity and gradation characteristics according to the camera model.
  • the pre-processing unit 120 refers to conversion information prepared according to the camera model information extracted from the metadata and corrects the color information of the captured image to be closer to the actual color of the subject.
  • the pre-processing unit 120 estimates the spectral reflectance from the camera model information using the camera spectral sensitivity for each camera model stored in the storage unit 160, and can switch the conversion information from the estimated spectral reflectance.
  • the pre-processing unit 120 may estimate a spectral information image indicating the spectral reflectance of the subject from an RGB image, which is an image of the subject, using a machine learning model.
  • the pre-processing unit 120 inputs the image, the spectral sensitivity of the camera device 210 that captured the image, and the spectral distribution of the light source in the environment in which the image was captured into the machine learning model, and estimates the spectral information image. This allows the pre-processing unit 120 to perform pre-processing on the captured image, taking into account the spectral sensitivity of the camera device 210 and the light source in the environment in which the image was captured.
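The quantities fed to that model are tied together by the standard forward imaging relation: each channel response is the sum over wavelengths of sensitivity x illuminant x reflectance. A sketch of this forward model follows; it could, for instance, synthesize training pairs for such a spectral estimator. The flat curves are illustrative assumptions.

```python
import numpy as np

n_bands = 31
wavelengths = np.linspace(400, 700, n_bands)   # nm, for reference only

illuminant = np.ones(n_bands)                  # flat toy light-source spectrum
reflectance = np.full(n_bands, 0.5)            # flat 50% reflector
sens = np.ones((3, n_bands)) / n_bands         # normalized flat RGB sensitivities

def camera_response(sens, illuminant, reflectance):
    """Per-channel response: S @ diag(E) @ R, i.e. sum over wavelengths."""
    return sens @ (illuminant * reflectance)

rgb = camera_response(sens, illuminant, reflectance)
```

Swapping in a measured camera sensitivity and a measured light-source spectrum is exactly the substitution the pre-processing described above performs.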
  • the pre-processing unit 120 may perform color correction based on registration information received based on user operation.
  • the registration information may be, for example, a lighting spectral distribution indicating the lighting color, etc., and the pre-processing unit 120 may perform color correction based on the lighting color.
  • the pre-processing unit 120 refers to conversion information that converts the color information of the captured image corresponding to the lighting color into the color of the actual subject, and corrects the color information of the captured image to be closer to the color of the actual subject.
  • the registered information is terminal information, and the pre-processing unit 120 may perform color correction based on camera model information corresponding to the terminal information.
  • the pre-processing unit 120 refers to conversion information that converts the color information of the captured image corresponding to the registered camera model information into the color of the actual subject, and corrects the color information of the captured image to be closer to the color of the actual subject.
  • the subject information may indicate an operation that induces a change over time in the subject, and the estimation unit 140 may estimate the bioinformation value based on a change in color information corresponding to the change over time in the subject.
  • the operation that induces a change over time in the subject induces a color change or color emphasis in a part of the subject.
  • the terminal device 200 prompts the patient to perform an operation to induce a change over time before or during the capture of the captured image, and transmits the subject information to the bioinformation estimation device 100.
  • the subject information may be information indicating an action such as compression.
  • the action information may be information indicating an action such as before and after administration of a drug, before and after application of a drug, before and after application of an electrical stimulus, before and after heating, before and after cooling, changes in indoor temperature and humidity, changes in oxygen concentration, before and after irradiation of a light of a specific wavelength, before and after administration of an allergen, etc.
  • the terminal device 200 may control the camera device 210 to display a message to capture an image of the patient's nail, display a message to compress the nail while the nail is being captured, start capturing moving images before and after compression, release the compression, and continue capturing moving images after the compression is released.
  • the bioinformation estimation device 100 obtains the action of compressing a part of the skin as subject information, and the terminal device 200 can transmit the moving image when the part of the skin is compressed and the subject information to the bioinformation estimation device 100.
  • the subject information may also be the time, measured from the captured images, that it takes for the color of the nail to return to its pre-compression color.
  • the estimation unit 140 estimates the biometric information based on the subject information and the color information. Specifically, the estimation unit 140 transmits subject information indicating the action of compressing and releasing the nail to the terminal device 200, and the camera device 210 transmits a captured image capturing the process of compressing the nail to releasing it to the biometric information estimation device 100. The extraction unit 130 acquires color information from the captured image, and the estimation unit 140 estimates the biometric information from the subject information and the color information.
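The recovery-time measurement described above can be sketched as follows. This is a minimal illustration assuming a per-frame redness value (for example, the mean red channel of the nail region) has already been extracted from the moving image; the timestamps, redness values, baseline, and tolerance are all made up for illustration.

```python
# Minimal sketch: scan a time series of per-frame redness values recorded
# after the compression is released, and report the first timestamp at
# which the value is back within a tolerance of the pre-compression
# baseline. All numbers below are hypothetical.

def recovery_time(timestamps, redness, baseline, tol=2.0):
    """Return the first timestamp whose redness is within tol of baseline."""
    for t, v in zip(timestamps, redness):
        if abs(v - baseline) <= tol:
            return t
    return None  # color did not recover within the recording

timestamps = [0.0, 0.5, 1.0, 1.5, 2.0]           # seconds after release
redness = [120.0, 140.0, 160.0, 172.0, 179.0]    # e.g. mean red channel
print(recovery_time(timestamps, redness, baseline=180.0))  # → 2.0
```

The returned time would then serve as the subject information transmitted to the estimation unit together with the color information.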
  • the subject information is the biometric information of the individual user, and the estimation unit 140 may estimate the biometric information value based on the color information and the biometric information.
  • the biometric information of the individual user is information such as the user's gender, age, medical history, and biometric information value from the previous test.
  • the subject information is the user's personal bio-related information or environmental information, and the estimation unit 140 may estimate the bio-information value based on the color information and the bio-related information.
  • the bio-related information is a correspondence table of test values for each test item and the colors of the reaction reagents corresponding to the test items.
  • Test items are, for example, glucose test values and pH values.
  • the colors of the reaction reagents are, for example, colorimetric values XYZ and L*a*b*.
  • the correspondence table includes, for example, colorimetric values "50, 4, -1" for a test value of "-" for glucose, and colorimetric values "30, 2, -5" for a test value of "1+".
  • the correspondence table includes, for example, colorimetric values "20, 0, -2" for a pH value of "6.5", "40, 0, 2" for a pH value of "7.0", and "60, 2, -5" for a pH value of "7.5".
  • the bio-related information is, for example, a correspondence table between bio-information and the color of the body part corresponding to the bio-information.
  • the color of the body part is expressed, for example, as colorimetric values XYZ, L*a*b*, etc.
  • the correspondence table includes, for example, when the body part is a nail and the bio-information is "hemoglobin", colorimetric values "20, 0, -2" for the test value "6.0", "40, 0, 2" for the test value "8.0", and "60, 2, -5" for the test value "10.0".
  • the correspondence table likewise includes entries such as colorimetric values "20, 0, -2" for the test value "100", "40, 0, 2" for the test value "98", and a corresponding colorimetric value for the test value "96".
  • the estimation unit 140 can obtain colorimetric values as color information from the extraction unit 130 and obtain the test values corresponding to those colorimetric values. For example, if the color information obtained from the extraction unit 130 is the colorimetric value "50, 5, -1", the estimation unit 140 can select from the correspondence table the entry "50, 4, -1" that is closest in Euclidean distance and obtain the corresponding test value "-" as a classification result.
  • the estimation unit 140 may also convert the test value into a continuous value by performing regression processing.
  • the classification or regression processing of the test value in the estimation unit 140 may be a machine learning method such as logistic regression analysis, multiple regression analysis, support vector machine, or neural network.
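The nearest-neighbor classification described above can be sketched in a few lines. The table contents and the measured value below are only the glucose figures quoted in the text, used here for illustration; a real system would hold the full correspondence table for every test item.

```python
import math

# Hypothetical correspondence table mapping glucose test values to the
# L*a*b* colorimetric values given in the text.
GLUCOSE_TABLE = {
    "-": (50.0, 4.0, -1.0),
    "1+": (30.0, 2.0, -5.0),
}

def classify_test_value(lab, table):
    """Return the test value whose reference color is closest to the
    measured L*a*b* value in Euclidean distance."""
    return min(table, key=lambda k: math.dist(lab, table[k]))

# A measured color of (50, 5, -1) is closest to the "-" entry.
print(classify_test_value((50.0, 5.0, -1.0), GLUCOSE_TABLE))  # → -
```

Regression to a continuous value, as mentioned above, would replace the `min` lookup with an interpolation or a fitted model over the same table.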
  • the image acquisition unit 110 acquires an image of the subject and a color chart image of a color chart
  • the pre-processing unit 120 may refer to conversion information that converts color information contained in the image of the subject using the color chart image, and correct the color information contained in the image based on the conversion information.
  • the color chart image has areas corresponding to multiple patches.
  • the pre-processing unit 120 estimates the actual color of the subject by comparing the color information contained in the color chart image with the color information of the captured image and identifying the area in the color chart to be matched.
  • the bioinformation estimation device 100 may include a health condition estimation unit that estimates a health condition based on the bioinformation value estimated by the estimation unit 140.
  • the health condition estimation unit may be implemented, for example, as one function in the estimation unit 140.
  • the health condition estimation unit estimates the user's health condition based on the bioinformation value estimated by the estimation unit 140.
  • the health condition estimation unit may estimate a disease name as the health condition.
  • the health condition estimation unit, for example, refers to table information that associates bioinformation values with health conditions stored in the memory unit 160.
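The table lookup performed by the health condition estimation unit can be sketched as a range-based lookup. The hemoglobin thresholds and condition labels below are hypothetical placeholders, not the table information actually stored in the memory unit 160.

```python
# Minimal sketch: map an estimated bioinformation value to a health
# condition via range-based table information. Every threshold and label
# here is a hypothetical placeholder for illustration only.

HEMOGLOBIN_TABLE = [
    (0.0, 11.0, "possible anemia"),
    (11.0, 17.0, "within reference range"),
    (17.0, float("inf"), "above reference range"),
]

def lookup_condition(value, table):
    """Return the condition whose [low, high) range contains the value."""
    for low, high, condition in table:
        if low <= value < high:
            return condition
    return "unknown"

print(lookup_condition(9.5, HEMOGLOBIN_TABLE))  # → possible anemia
```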
  • the first embodiment of the bioinformation estimation system 1 can perform a predetermined preprocessing on a captured image, extract color information of the subject from the preprocessed captured image, and estimate a bioinformation value based on the color information and the subject information.
  • This bioinformation estimation system 1 can extract preprocessed color information and estimate bioinformation simply by capturing an image. As a result, the bioinformation estimation system 1 can estimate bioinformation with high accuracy. Furthermore, the bioinformation estimation system 1 can reduce the effort required for a user to undergo medical treatment or testing procedures.
  • Second Embodiment: FIG. 2 is a block diagram showing a configuration example of a biological information estimation system 1A according to the second embodiment.
  • the biological information estimation system 1A is a system in which a captured image is transmitted to a biological information estimation device 100A disposed remotely from a living body such as a patient, and the biological information estimation device 100A acquires color information from the captured image and estimates the biological information with high accuracy.
  • the subject whose biological information is to be estimated is a patient undergoing a health check, but is not limited thereto and may be a human body or another living body.
  • the bioinformation estimation system 1A includes, for example, a bioinformation estimation device 100A and a terminal device 200A equipped with a camera device 210A.
  • the bioinformation estimation device 100A and the terminal device 200A are connected, for example, via a communication network.
  • the bioinformation estimation device 100A and the terminal device 200A have a communication interface such as a NIC (Network Interface Card) or a wireless communication module for connecting to a network such as the Internet.
  • the communication network may include, for example, a general-purpose network such as the Internet, and a private network such as local 5G or WiFi (registered trademark).
  • the terminal device 200A is an information processing device such as a smartphone or personal computer used by the patient.
  • the terminal device 200A is equipped with a camera device 210A.
  • the camera device 210A operates based on an operation received by the terminal device 200A, and generates a captured image of the patient.
  • the terminal device 200A transmits the captured image to the bioinformation estimation device 100A.
  • the terminal device 200A and the bioinformation estimation device 100A are separate entities, but this is not limited thereto, and the functions of the bioinformation estimation device 100A may be built into the terminal device 200A.
  • the camera device 210A may be a camera built into a smartphone, a single-lens reflex camera, a web camera, etc.
  • the terminal device 200A may transmit information indicating the type of camera together with the captured image.
  • the captured image may be a still image or a moving image, or may be a plurality of images generated by interval shooting.
  • the captured image may be an image of a part of the patient.
  • the part of the patient may be, for example, a nail, a back of a hand, a face, lips, a conjunctiva, an eye, an eyelid, a tongue, etc.
  • the captured image may be an image of a plurality of parts of the patient.
  • the captured images may be a plurality of images captured by changing the subject distance, the use of a flash, the angle, and the like.
  • the captured image may be an image of a patient and a reference object such as a color chart for color correction, a white board, etc.
  • the captured image may include both an image including a patient and an image of a reference object.
  • the captured image may include depth information measured using a depth sensor or the like.
  • the bioinformation estimation device 100A is an information processing device that communicates with other devices and performs various processes.
  • the bioinformation estimation device 100A includes, for example, an image acquisition unit 110A, a color information acquisition unit 120A, and a bioinformation estimation unit 130A.
  • the image acquisition unit 110A, the color information acquisition unit 120A, and the bioinformation estimation unit 130A are functional units that are realized by a processor such as a CPU (Central Processing Unit) executing a program stored in a program memory.
  • the image acquisition unit 110A acquires images of the patient.
  • the color information acquisition section 120A corrects the color information included in the captured image and acquires the corrected color information.
  • the color information is information indicating the color of the patient for estimating biological information.
  • the color information may be RGB values subjected to white balance correction, RGB values reproduced in a specific color space such as sRGB values, etc.
  • the color information may be spectral reflectance, XYZ values in the XYZ color system, L*a*b* values in the L*a*b* color system, an erythema index, a melanin index, absorbance, a spectrum of a specific wavelength, a representative value of a specific area of the subject, or time-series information, with a value for each pixel, indicating a change over time in the image when a moving image is captured or interval photography is performed, and may further include two or more of these pieces of information.
  • the color information may be a single value such as an average value in the part to be extracted, or may be image data (a two-dimensional distribution of color information) of a small area including the part to be extracted.
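The simplest form of color information listed above, a single representative value for the extracted part, can be sketched as a per-channel mean. The region name and pixel values below are invented for illustration.

```python
# Minimal sketch: reduce a region of interest to one representative color
# by taking the per-channel mean of its pixels. The "nail" pixel values
# below are hypothetical.

def representative_color(region):
    """region: iterable of (r, g, b) pixels; returns the mean color."""
    pixels = list(region)
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) / n for c in range(3))

nail_region = [(180, 120, 110), (176, 118, 108), (184, 122, 112)]
print(representative_color(nail_region))  # → (180.0, 120.0, 110.0)
```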
  • the biometric information estimation unit 130A estimates the patient's biometric information based on the color information.
  • the biometric information estimation unit 130A estimates the biometric information using methods such as multiple regression analysis, polynomial approximation, or machine learning.
  • the biometric information estimation device 100A outputs the biometric information.
  • the biometric information estimation device 100A may output the biometric information to the terminal device 200A, for example, may output the biometric information to a device that stores the biometric information, or may output the biometric information to an inspection device that performs various inspections using the biometric information. Note that the biometric information estimation unit 130A may simultaneously estimate multiple pieces of biometric information based on color information.
  • FIG. 3 is a sequence diagram showing an example of a processing procedure of the biological information estimation system 1A according to the second embodiment.
  • the terminal device 200A captures an image of the patient using the camera device 210A (step ST10) and transmits the captured image to the biometric information estimation device 100A.
  • the image acquisition unit 110A acquires the captured image and outputs the captured image to the color information acquisition unit 120A.
  • the color information acquisition unit 120A corrects color information contained in the captured image and acquires the corrected color information (step ST12).
  • the color information acquisition unit 120A outputs the acquired color information to the biometric information estimation unit 130A.
  • the biometric information estimation unit 130A estimates the biometric information of the patient based on the color information (step ST14).
  • the biometric information estimation device 100A transmits the biometric information estimated by the biometric information estimation unit 130A to the terminal device 200A.
  • the terminal device 200A presents the biometric information to the patient by displaying it (step ST16).
  • FIG. 4 is a diagram illustrating an example of the color information acquisition unit 120A according to the second embodiment.
  • the color information acquisition unit 120A may include at least one of a correction unit 121A, a conversion unit 122A, and estimation units 123A, 124A, and 125A.
  • the correction unit 121A, the conversion unit 122A, and the estimation units 123A, 124A, and 125A are functional units realized by a processor such as a CPU executing a program stored in a program memory.
  • the correction unit 121A estimates the lighting color of the space in which the patient was imaged based on the captured image, and corrects the white balance of the RGB information contained in the captured image.
  • the correction unit 121A outputs the captured image including the color information with the white balance corrected to the bioinformation estimation unit 130A.
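One common way to implement the white-balance step is the gray-world assumption: estimate the illumination color from the per-channel means and scale each channel so the scene average becomes neutral. This is only one possible illumination estimate; the correction unit 121A is not necessarily implemented this way, and the pixel values below are made up.

```python
# Minimal sketch of gray-world white balance: estimate the lighting color
# from per-channel means, then apply per-channel gains so the average
# color of the scene becomes neutral gray. Pixel values are hypothetical.

def gray_world_balance(pixels):
    """pixels: list of (r, g, b) tuples; returns white-balanced pixels."""
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    gray = sum(means) / 3            # target neutral level
    gains = [gray / m for m in means]
    return [tuple(min(255.0, p[c] * gains[c]) for c in range(3))
            for p in pixels]

# A scene lit with reddish light: the red channel averages high.
pixels = [(200, 100, 100), (180, 90, 90)]
balanced = gray_world_balance(pixels)
print(balanced)
```

After correction, the three channels of each pixel in this example are equal, i.e. the reddish cast has been removed.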
  • the image acquisition unit 110A acquires a color chart image of a color chart and a captured image of a patient, and the conversion unit 122A creates conversion information for converting color information contained in the captured image of the patient using the color chart image, and converts the color information contained in the captured image based on the conversion information.
  • the color chart is, for example, a plate-shaped object on which color samples are arranged.
  • the conversion unit 122A creates a color conversion matrix (conversion information) based on the relationship between the colors indicated by the color chart and the colors of the color chart included in the captured image.
  • the conversion unit 122A creates a color conversion matrix using the color chart, for example, by performing multiple regression analysis.
  • the conversion unit 122A can convert the captured image of the patient based on the conversion information to the colors indicated by the color chart.
  • the conversion unit 122A outputs the captured image of the converted color information to the bioinformation estimation unit 130A.
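The chart-based conversion can be sketched with a simplified per-channel linear fit between captured patch colors and their known reference colors. The text describes a full multiple-regression color conversion matrix; this is a reduced single-channel illustration, and the patch values are made up.

```python
# Minimal sketch of building conversion information from a color chart:
# least-squares fit reference ≈ a * captured + b for one channel. The
# full method described in the text uses a multiple-regression color
# conversion matrix over all channels; patch values here are invented.

def fit_channel(captured, reference):
    """Return (a, b) minimizing sum((a*x + b - y)^2)."""
    n = len(captured)
    mx = sum(captured) / n
    my = sum(reference) / n
    sxx = sum((x - mx) ** 2 for x in captured)
    sxy = sum((x - mx) * (y - my) for x, y in zip(captured, reference))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Known reference reds of three chart patches vs. values seen by the camera.
reference_r = [50, 120, 200]
captured_r = [40, 100, 170]

a, b = fit_channel(captured_r, reference_r)
corrected = [a * v + b for v in captured_r]
print([round(c) for c in corrected])
```

Applying the fitted mapping to the captured patch values brings them close to the reference colors, and the same mapping can then be applied to the patient region of the image.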
  • the estimation unit 123A estimates color information based on output information of a machine learning engine that estimates parameters that affect the color of a captured image when a patient is captured.
  • the parameters that affect the color of the captured image are, for example, spectral reflectance and absorbance.
  • the machine learning engine is a parameter estimation engine that is trained using the captured image and parameters (spectral reflectance and absorbance) as learning data.
  • the parameter estimation engine estimates parameters in response to input of the captured image.
  • the estimation unit 123A estimates color information of the captured image using the parameters estimated by the parameter estimation engine.
  • the estimation unit 123A may generate other parameters such as an erythema index and a melanin index using the absorbance estimated by the parameter estimation engine.
  • the estimation unit 123A estimates color information of the captured image using other parameters generated using the parameters estimated by the parameter estimation engine.
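One commonly used definition derives the melanin and erythema indices from red and green reflectance (essentially log-absorbance differences). Whether the estimation unit 123A uses exactly these formulas is not stated in the text, and the reflectance values below are invented for illustration.

```python
import math

# Minimal sketch of deriving other parameters, such as melanin and
# erythema indices, from estimated reflectance. These are one common
# definition (narrow-band red/green reflectance); the actual formulas
# used by the estimation unit 123A are not specified in the text.

def melanin_index(r_red):
    """Melanin index from red-band reflectance (0 < r_red <= 1)."""
    return 100.0 * math.log10(1.0 / r_red)

def erythema_index(r_red, r_green):
    """Erythema index from red- and green-band reflectance."""
    return 100.0 * (math.log10(1.0 / r_green) - math.log10(1.0 / r_red))

r_red, r_green = 0.55, 0.35   # hypothetical reflectance of a skin patch
print(round(melanin_index(r_red), 2), round(erythema_index(r_red, r_green), 2))
```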
  • the estimation unit 124A estimates correct color information based on the captured image acquired by the image acquisition unit 110A and color-related characteristics, such as spectral sensitivity, of the camera that acquired the captured image.
  • the terminal device 200A transmits camera information related to color information, such as the type and specifications (sensitivity, etc.) of the camera device 210A, along with the captured image to the bioinformation estimation device 100A.
  • the estimation unit 124A estimates the patient's color information based on the camera information transmitted along with the captured image.
  • the estimation unit 125A may estimate color information based on the part information acquired by the image acquisition unit 110A, or on the output information of a machine learning engine that recognizes the patient's part.
  • the part information is information indicating a part specified by the user, such as the nail, back of the hand, face, or conjunctiva.
  • the machine learning engine that recognizes the patient's part is a part estimation engine that has been trained using the captured image and part information for identifying the part as learning data.
  • the part estimation engine estimates the part in response to inputting the captured image.
  • the estimation unit 125A estimates color information of the captured image in response to the part indicated by the part information or the part estimated by the part estimation engine.
  • an image of a patient is acquired, color information contained in the image is corrected, and the patient's biometric information is estimated based on the corrected color information. Therefore, even if the image varies depending on the camera device 210A that captured the image and the environment around the patient, color information can be acquired from the image and biometric information can be estimated with high accuracy by correcting the color information.
  • FIG. 5 is a block diagram showing an example of a biological information estimation system 1B according to the second embodiment.
  • the bioinformation estimation system 1B includes a bioinformation detection unit 220A that acquires bioinformation detected from a patient.
  • the bioinformation detection unit 220A may be a device that measures the bioinformation of a patient, such as a smart watch, or may be a device that accepts an operation of a patient or the like to input bioinformation.
  • the bioinformation detection unit 220A detects, for example, bioinformation different from the bioinformation estimated by the bioinformation estimation unit 130B.
  • the bioinformation detected by the bioinformation detection unit 220A is bioinformation for assisting the bioinformation estimation unit 130B in estimating the bioinformation.
  • the bioinformation detected by the bioinformation detection unit 220A may be body temperature, oxygen saturation, pulse, blood pressure, electrocardiogram, etc., may be oxygen saturation measured by a pulse oximeter, or may be age, sex, blood type, height, weight, disease, etc. acquired by manual input. Furthermore, the biological information detection unit 220A may be communicatively connected to a server device and may acquire blood test results (hemoglobin), urine test results, various test results (blood pressure, melanin, electrocardiogram, etc.), medical history, etc. from the server device. The biological information detection unit 220A may estimate biological information from a captured image acquired by the camera device 210A or an captured image acquired by other means through internal processing. The biological information detection unit 220A may estimate, for example, burns, trauma, melanin, etc. from a captured image.
  • the bioinformation estimation unit 130B estimates the bioinformation based on the biodetection information and color information acquired from the bioinformation detection unit 220A. Specifically, the bioinformation detection unit 220A transmits an image of the nail to the bioinformation estimation device 100B. The bioinformation detection unit 220A transmits the oxygen saturation acquired by a pulse oximeter or a smartwatch to the bioinformation estimation device 100B as the biodetection information. The color information acquisition unit 120A performs color correction on the image of the nail acquired by the image acquisition unit 110A to sRGB values and averages the RGB values of the nail. The bioinformation estimation unit 130B estimates the hemoglobin value using polynomial approximation from the color information corrected by the color information acquisition unit 120A and the oxygen saturation transmitted from the bioinformation detection unit 220A.
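The polynomial approximation step described above can be sketched as a fitted function of the corrected nail color and the oxygen saturation. The polynomial form and every coefficient below are hypothetical placeholders standing in for a model fitted from real measurement data.

```python
# Minimal sketch: combine the corrected mean RGB of the nail with the
# oxygen saturation from a pulse oximeter into a hemoglobin estimate.
# The linear-polynomial form and all coefficients are made up; the real
# model would be fitted from paired color/blood-test data.

def estimate_hemoglobin(rgb, spo2, coeffs):
    r, g, b = rgb
    c0, c1, c2, c3, c4 = coeffs
    return c0 + c1 * r + c2 * g + c3 * b + c4 * spo2

rgb = (180.0, 120.0, 110.0)               # corrected sRGB mean of the nail
spo2 = 98.0                               # from pulse oximeter / smartwatch
coeffs = (2.0, 0.05, -0.02, 0.01, 0.02)   # hypothetical fitted coefficients
print(round(estimate_hemoglobin(rgb, spo2, coeffs), 2))  # → 11.66
```

Higher-order terms or a machine learning model could replace the linear form without changing the overall flow of corrected color plus biodetection information in, hemoglobin value out.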
  • biometric information estimation device 100B in addition to the captured image of the patient, biometric detection information obtained by the biometric information detection unit 220A is acquired, and the patient's biometric information can be estimated based on the corrected color information and the biometric detection information.
  • FIG. 6 is a block diagram showing an example of a biological information estimation system 1C according to the second embodiment.
  • the biological information estimation system 1C includes a motion information acquisition unit 230A that acquires motion information of the patient when the captured image is captured.
  • a color information acquisition unit 120C acquires color information of the patient based on the captured image and the motion information
  • a biological information estimation unit 130C estimates the biological information based on the color information.
  • the patient's motion information when the captured image is captured is information indicating the motion that induces a color change or color emphasis in the patient.
  • the motion information acquisition unit 230A may prompt the patient to perform an operation to induce a change over time before or during the capture of the captured image.
  • the motion information acquisition unit 230A may, for example, display a message prompting the patient to perform an action on a screen on which the captured image is captured by the camera device 210A, and acquire motion information indicating the motion corresponding to the captured image acquired after the message is displayed.
  • the operation information may be information indicating an operation such as compression.
  • the operation information may be information indicating operations such as before and after administration of a drug, before and after application of a drug, before and after application of an electrical stimulus, before and after heating, before and after cooling, changes in indoor temperature and humidity, changes in oxygen concentration, before and after irradiation with light of a specific wavelength, before and after administration of an allergen, etc.
  • the operation information may be the amount of operation during shooting when capturing a video.
  • the motion information acquisition unit 230A may control the camera device 210A to display a message to capture an image of the patient's nail, display a message to compress the nail while the image of the nail is captured, start capturing video images before and after compression, release the compression, and continue capturing video images after the compression is released.
  • the motion information acquisition unit 230A acquires motion information indicating the action of compressing a part of the skin, and the terminal device 200A can transmit the video image when the part of the skin is compressed and the motion information to the bioinformation estimation device 100C.
  • the motion information acquisition unit 230A may also measure the time it takes for the color of the nail to return to its pre-compression color, and acquire this as motion information.
  • the biometric information estimation unit 130C estimates the biometric information based on the motion information and color information acquired from the motion information acquisition unit 230A. Specifically, the motion information acquisition unit 230A transmits motion information indicating the motion of compressing and releasing the nail to the terminal device 200A, and the camera device 210A transmits a captured image capturing the process of compressing the nail to releasing it to the biometric information estimation device 100C. The color information acquisition unit 120C acquires color information from the captured image, and the biometric information estimation unit 130C estimates the biometric information from the motion information and color information.
  • bioinformation estimation device 100C in addition to the captured image of the patient, motion information is acquired, and the patient's bioinformation can be estimated based on the corrected color information and motion information.
  • Third Embodiment: FIG. 7 is a block diagram showing a configuration example of an inspection system 1D according to the third embodiment.
  • the testing system 1D is a system that transmits a captured image of a reagent and reagent identification information to a testing device 100D that is located, for example, remotely from a reagent user, and the testing device 100D acquires color information from the captured image and estimates a test value with high accuracy.
  • the reagent in the embodiment is, for example, a urine test strip soaked in a patient's urine, but is not limited to this and may be any known reagent whose test value can be read by its color.
  • the inspection system 1D includes, for example, an inspection device 100D and a terminal device 200D.
  • the inspection device 100D and the terminal device 200D are connected, for example, via a communication network.
  • the inspection device 100D and the terminal device 200D have a communication interface, such as a NIC (Network Interface Card) or a wireless communication module, for connecting to a network such as the Internet.
  • the communication network may include, for example, a general-purpose network such as the Internet, and a private network such as local 5G or WiFi (registered trademark).
  • the terminal device 200D is an information processing device such as a smartphone or personal computer used by a reagent user.
  • the terminal device 200D includes, for example, a camera device 202D, a processing unit 204D, an operation unit 206D, and a display unit 208D.
  • the camera device 202D operates based on an operation received by the terminal device 200D, and generates an image of the reagent.
  • the processing unit 204D is a processor that executes application software to perform various processes and controls.
  • the operation unit 206D is an operation interface such as a touch panel or buttons.
  • the display unit 208D is a liquid crystal display or the like.
  • the terminal device 200D and the testing device 100D are separate entities, but this is not limited thereto, and the functions of the testing device 100D may be built into the terminal device 200D.
  • the processing unit 204D may read a two-dimensional code or the like written on the reagent using the camera device 202D, and use image recognition processing to determine reagent identification information such as the reagent manufacturer.
  • the processing unit 204D may generate reagent identification information in response to receiving an operation to input information about the reagent using the operation unit 206D. If the processing unit 204D is set to execute dedicated application software corresponding to the reagent, when the application software is executed and an image is acquired, the captured image and reagent identification information may be transmitted to the inspection device 100D.
  • the processing unit 204D may recognize AR markers, register marks, and the like, and generate reagent identification information.
  • the processing unit 204D may predict the degree of deterioration of the paper based on the captured image, and check whether the reagent has expired.
  • the camera device 202D may be a camera built into a smartphone, a single-lens reflex camera, a web camera, etc.
  • the terminal device 200D may transmit information indicating the type of camera together with the captured image.
  • the captured image may be a still image of the reagent or a moving image of the reagent.
  • the captured image may be, for example, a plurality of images generated by interval photography at regular intervals in order to observe the reaction of the reagent.
  • the captured image may be a plurality of images captured by changing the subject distance, the presence or absence of a flash, the angle, etc.
  • the captured image may, in some cases, be an image captured under a fixed imaging environment (lighting, angle of view).
  • the captured image may be an image of a reference object such as a color chart for color correction, a white board, etc., together with an image of the reagent.
  • the captured image may include both an image including the reagent and an image of the reference object.
  • the captured image may include depth information measured using a depth sensor or the like.
  • the inspection device 100D is an information processing device that communicates with other devices and performs various processes.
  • the inspection device 100D includes, for example, an acquisition unit 110D, a color information acquisition unit 120D, a test value estimation unit 130D, a health condition estimation unit 140D, a display information output unit 150D, and a memory unit 160D.
  • the acquisition unit 110D, the color information acquisition unit 120D, the test value estimation unit 130D, the health condition estimation unit 140D, and the display information output unit 150D are functional units that are realized by a processor such as a CPU (Central Processing Unit) executing a program stored in a program memory.
  • the memory unit 160D is configured by any combination of storage media such as a HDD (Hard Disk Drive), SSD (Solid State Drive), etc.
  • the acquisition unit 110D functions as an image acquisition unit that acquires an image of the reagent, and as a reagent identification information acquisition unit that acquires reagent identification information for identifying the reagent.
  • the color information acquisition unit 120D corrects the color information contained in the captured image and acquires the corrected color information.
  • the color information is information indicating the color of the reagent for estimating the test value.
  • the color information may be RGB values with white balance correction, RGB values reproduced in a specific color space such as sRGB values, etc.
  • the color information may be spectral reflectance, XYZ values in the XYZ color system, L*a*b* values in the L*a*b* color system, an erythema index, a melanin index, absorbance, a spectrum of a specific wavelength, a representative value of a specific area of the subject, or time-series information, with a value for each pixel, indicating the change over time of the image when a moving image is captured or interval shooting is performed, and may further include two or more of these pieces of information.
  • the color information may be a single value such as an average value in the area to be extracted, or may be image data (two-dimensional distribution of color information) of a small area including the area to be extracted.
  • the color information acquisition unit 120D includes, for example, an image preprocessing unit 122D, an image color correction unit 124D, and a reagent color extraction unit 126D.
  • the image preprocessing unit 122D performs preprocessing on the captured image.
  • the preprocessing is a process prior to color correction, and may include shading (illumination unevenness) correction, specular reflection correction, white balance estimation, etc.
  • the preprocessing may include a process for smoothly extracting the color of the reagent (keystone correction, a process for identifying the reagent portion of the image and cutting out a rectangle).
  • the image color correction unit 124D corrects color information contained in the captured image that has been subjected to preprocessing, thereby obtaining color-corrected image information having accurate reagent colors.
  • the reagent color extraction unit 126D extracts the color of the reagent based on the corrected color information. In the case of a reagent having multiple test areas as shown in FIG. 7, color information is extracted for each area. When the reaction time and thus the recommended reading time differ for each reagent, the reagent color extraction unit 126D extracts color information from the images captured at each recommended reading time among multiple images captured at intervals. In the case of a video, color information is extracted from the frame at the recommended time. The reagent color extraction unit 126D may detect a pattern of the reagent (such as immersion) instead of a color, and may identify the reagent based on the extracted color.
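The per-area extraction described above can be sketched as follows. This is a minimal illustration, assuming the test areas are given as rectangular (x, y, width, height) regions per reagent type; the function and variable names are illustrative and not from the specification.

```python
import numpy as np

def extract_area_colors(image, areas):
    """Return the mean RGB value of each test area as its representative color."""
    colors = {}
    for name, (x, y, w, h) in areas.items():
        patch = image[y:y + h, x:x + w, :]
        colors[name] = patch.reshape(-1, 3).mean(axis=0)
    return colors

def frames_at_recommended_times(timestamps, recommended, tolerance=0.5):
    """Pick the frame index closest to each recommended reading time (seconds)."""
    timestamps = np.asarray(timestamps, dtype=float)
    picked = {}
    for t in recommended:
        i = int(np.argmin(np.abs(timestamps - t)))
        if abs(timestamps[i] - t) <= tolerance:
            picked[t] = i
    return picked
```

A representative value other than the mean (e.g., the median, or the full patch as a two-dimensional distribution) could be returned instead, matching the options listed for the color information.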
  • the test value estimation unit 130D may input the captured image and reagent identification information acquired by the acquisition unit 110D to a machine learning engine that has learned the captured image of the reagent, the reagent identification information, and the test value, and output the test value based on the output from the machine learning engine. Parameters for realizing the machine learning engine may be stored in the storage unit 160D as a learned engine for each reagent. During testing, the test value estimation unit 130D can read out, for example, a learned model corresponding to the reagent identification information from the storage unit 160D, input color information to the learned model, and estimate the test value based on the output of the learned model.
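The per-reagent learned-model lookup described above can be sketched as follows. For illustration, a simple linear fit stands in for the machine learning engine, and a dictionary stands in for the storage unit 160D; the reagent identifier and all names are hypothetical.

```python
import numpy as np

def fit_reagent_model(color_features, test_values):
    # Fit test value against a scalar color feature (e.g., one channel's mean).
    # A real system would store trained ML engine parameters per reagent instead.
    slope, intercept = np.polyfit(color_features, test_values, deg=1)
    return {"slope": slope, "intercept": intercept}

# Plays the role of the storage unit: learned model keyed by reagent identification
MODEL_STORE = {
    "glucose_strip_v1": fit_reagent_model([50, 100, 150, 200], [0.0, 1.0, 2.0, 3.0]),
}

def estimate_test_value(reagent_id, color_feature):
    model = MODEL_STORE[reagent_id]  # read out the learned model for this reagent
    return model["slope"] * color_feature + model["intercept"]
```

During testing, the engine matching the reagent identification information is read out and the extracted color information is fed to it, as the bullet above describes.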
  • the health condition estimation unit 140D estimates the health condition of the reagent user based on the test values estimated by the test value estimation unit 130D.
  • the health condition estimation unit 140D may estimate a disease name as the health condition.
  • the display information output unit 150D outputs to the terminal device 200D the test values estimated by the test value estimation unit 130D and display information for displaying the health condition estimated by the health condition estimation unit 140D.
  • the memory unit 160D stores information processed by the acquisition unit 110D, the color information acquisition unit 120D, the test value estimation unit 130D, the health condition estimation unit 140D, and the display information output unit 150D.
  • the reagent identification information may be any information that identifies a reagent, for example, information that identifies the type of reagent.
  • the test value estimation unit 130D estimates the test value of the reagent based on the type and color information of the reagent.
  • FIG. 8 is a sequence diagram showing an example of a processing procedure of the inspection system 1D in the third embodiment.
  • the terminal device 200D captures an image of the reagent used by the reagent user with the camera device 202D, and transmits the captured image and the reagent identification information to the testing device 100D.
  • the acquisition unit 110D acquires the captured image.
  • the image preprocessing unit 122D performs preprocessing on the captured image and outputs the preprocessed image to the image color correction unit 124D.
  • the image color correction unit 124D acquires information necessary for correction from the storage unit 160D and corrects the color information of the preprocessed image.
  • the image color correction unit 124D outputs the color-corrected image to the reagent color extraction unit 126D.
  • the reagent color extraction unit 126D extracts the color of the reagent from the color-corrected image and outputs the color information to the test value estimation unit 130D.
  • the health condition estimation unit 140D estimates the health condition based on the test value, and the display information output unit 150D generates display information including the test value and the health condition and transmits it to the terminal device 200D. If the test value estimation unit 130D fails to detect the color of the reagent, or if the health condition estimation unit 140D fails to estimate the health condition, the test value estimation unit 130D may cause the display information output unit 150D to output an error result to the terminal device 200D. The display information output unit 150D may generate display information prompting re-capture of the image together with the error result and transmit it to the terminal device 200D.
  • FIG. 9 is a diagram showing an example of a display screen of the terminal device 200D in the third embodiment.
  • the terminal device 200D first starts the application software to display the initial screen (user page) shown in Fig. 9(A).
  • the terminal device 200D displays the examination setting screen shown in Fig. 9(B).
  • the test setting screen in Fig. 9(B) allows the user to select a test strip (reagent) from a drop-down list, and includes test items corresponding to the reagent.
  • the terminal device 200D displays the image capture adjustment screen in Fig. 9(C).
  • the terminal device 200D also generates reagent identification information corresponding to the selected reagent. On the image capture adjustment screen in FIG. 9(C), for example, a message saying "Please make sure the test paper fits within the frame" and a guide frame are superimposed on the live camera image.
  • the terminal device 200D obtains the camera image within the frame as a captured image, and transmits the captured image and reagent identification information to the testing device 100D.
  • the terminal device 200D receives the display information from the inspection device 100D, and displays the test result screen of FIG. 9(D) or the test failure screen of FIG. 9(E) based on the display information.
  • the test result screen in FIG. 9(D) includes a reagent image as the photographed result, the test result (numerical value and health condition), a reference image, a re-photograph button, and a result save button. This allows the reagent user to know the test value and health condition.
  • the terminal device 200D displays the photographing adjustment screen in FIG. 9(C).
  • the terminal device 200D displays the initial screen in FIG. 9(A).
  • the terminal device 200D may transmit the test result, etc. to other application software or a server device to provide a service linked to the test result.
  • the test failure screen in FIG. 9(E) includes error information, a cancel button, and a re-photograph button. This allows the terminal device 200D to prompt the user to re-photograph the reagent.
  • FIG. 10 is a diagram showing an example of the image color correction unit 124D in the third embodiment.
  • the image color correction unit 124D of the color information acquisition unit 120D may include at least one of a correction unit 124a, a conversion unit 124b, an estimation unit 124c, and an estimation unit 124d.
  • the correction unit 124a, the conversion unit 124b, the estimation unit 124c, and the estimation unit 124d are functional units realized by a processor such as a CPU executing a program stored in a program memory.
  • the correction unit 124a estimates the lighting color of the space in which the reagent was imaged based on the captured image, and corrects the white balance of the RGB information contained in the captured image. Note that it is desirable for the captured image to have an auxiliary background such as white, and a message encouraging the user to take a picture in an environment that includes a white background may be displayed when taking a picture with the camera device 202D.
  • the correction unit 124a outputs the captured image including color information with the white balance corrected to the reagent color extraction unit 126D. Note that the correction unit 124a may change the correction process depending on the type of reagent.
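One common way to estimate the illumination color from the image itself, as the correction unit 124a does, is the gray-world assumption: the scene average is assumed achromatic, and each channel is scaled accordingly. This is only one plausible realization of the white-balance correction; the specification does not mandate a particular algorithm.

```python
import numpy as np

def gray_world_white_balance(rgb):
    """rgb: float array (H, W, 3). Scale each channel so the channel means match."""
    rgb = rgb.astype(float)
    channel_means = rgb.reshape(-1, 3).mean(axis=0)
    gain = channel_means.mean() / channel_means   # per-channel correction gain
    return np.clip(rgb * gain, 0.0, 255.0)
```

A white auxiliary background, as recommended in the bullet above, makes the achromatic-average assumption far more likely to hold.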
  • the acquisition unit 110D acquires a color chart image of a color chart and an image of a reagent
  • the conversion unit 124b creates conversion information for converting color information included in the image of the reagent using the color chart image, and converts the color information included in the image based on the conversion information.
  • the color chart is, for example, a plate-shaped object on which color samples are arranged. Note that, instead of the color chart, an image of the reagent before impregnation may be acquired and used as a color chart or reference.
  • the conversion unit 124b creates a color conversion matrix (conversion information) based on the relationship between the color indicated by the color chart and the color of the color chart included in the image.
  • the conversion unit 124b creates a color conversion matrix using the color chart, for example, by performing multiple regression analysis.
  • the conversion unit 124b can convert the image of the reagent based on the conversion information to convert it into the color indicated by the color chart.
  • the conversion unit 124b outputs the image of the converted color information to the reagent color extraction unit 126D. Note that the conversion unit 124b may create conversion information for each type of reagent.
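The color conversion matrix built by multiple regression, as the conversion unit 124b does, can be sketched with a least-squares fit between the chart colors as captured and their reference values. The affine (constant) term below is an assumption added so brightness offsets can also be absorbed; it is not required by the specification.

```python
import numpy as np

def fit_color_matrix(captured, reference):
    """captured, reference: (N, 3) RGB values of the N color chart patches."""
    captured = np.hstack([captured, np.ones((len(captured), 1))])  # affine term
    matrix, *_ = np.linalg.lstsq(captured, reference, rcond=None)
    return matrix  # shape (4, 3)

def apply_color_matrix(rgb, matrix):
    """Apply the fitted conversion to every pixel of an (H, W, 3) image."""
    flat = rgb.reshape(-1, 3)
    flat = np.hstack([flat, np.ones((len(flat), 1))])
    return (flat @ matrix).reshape(rgb.shape)
```

Fitting per reagent type, as the bullet notes, would simply mean storing one such matrix per reagent.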
  • the estimation unit 124c estimates color information based on output information of a machine learning engine that estimates parameters that affect the color of a captured image when a reagent is captured.
  • the parameters that affect the color of a captured image are, for example, spectral reflectance and absorbance.
  • the machine learning engine is a parameter estimation engine that is trained using the captured image and parameters (spectral reflectance and absorbance) as learning data.
  • the parameter estimation engine estimates parameters in response to input of the captured image.
  • the estimation unit 124c estimates color information of the captured image using the parameters estimated by the parameter estimation engine.
  • the estimation unit 124c may generate other parameters using the absorbance estimated by the parameter estimation engine.
  • the estimation unit 124c estimates color information of the captured image using other parameters generated using the parameters estimated by the parameter estimation engine.
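Assuming the parameter estimation engine outputs absorbance per wavelength band, one hedged sketch of deriving other parameters is the Beer-Lambert relation T = 10^(-A), followed by weighting with illuminant and camera sensitivity curves to obtain a color estimate. The curves in the example are toy values, not real sensor data, and the pipeline is an illustration rather than the specified method.

```python
import numpy as np

def absorbance_to_transmittance(absorbance):
    # Beer-Lambert: transmittance T = 10**(-A) per wavelength band
    return 10.0 ** (-np.asarray(absorbance, dtype=float))

def transmittance_to_rgb(transmittance, sensitivities, illuminant):
    """sensitivities: (3, B) per-channel weights over B wavelength bands."""
    weighted = transmittance * illuminant       # light actually transmitted
    rgb = sensitivities @ weighted              # integrate per channel
    return rgb / (sensitivities @ illuminant)   # normalize to [0, 1]
```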
  • the estimation unit 124d estimates correct color information based on the captured image acquired by the acquisition unit 110D and color-related information about the camera that captured the image, such as its spectral sensitivity.
  • the terminal device 200D transmits camera information related to color, such as the type and specifications (sensitivity, etc.) of the camera device 202D, along with the captured image to the inspection device 100D.
  • the estimation unit 124d estimates the color information of the reagent user based on the camera information transmitted with the captured image.
  • an image of a reagent user is acquired, color information contained in the captured image is corrected, and biological information of the reagent user is estimated based on the corrected color information. Therefore, even if the captured image varies depending on the camera device 202D that captured the image and the environment around the reagent user, color information can be acquired from the captured image and biological information can be estimated with high accuracy by correcting the color information.
  • FIG. 11 is a block diagram showing an example of an inspection system 1E according to the third embodiment.
  • the testing system 1E includes a biological information detection unit 210D that acquires biological detection information detected from a reagent user.
  • the biological information detection unit 210D may be a device that measures biological information of a reagent user, such as a smart watch, or may be a device that receives an operation of a reagent user to input biological information.
  • the biological information detected by the biological information detection unit 210D is biological information that assists the health condition estimation unit 140E in estimating a health condition.
  • the biological information detected by the biological information detection unit 210D may be personal information such as gender, height, weight, age, medical history, etc., detection values of biological sensors such as body temperature, oxygen saturation, pulse, blood pressure, electrocardiogram, etc., or environmental information detected by an environmental sensor such as temperature, humidity, time, etc.
  • environmental information that may cause a change in the color of a reagent may be transmitted to the testing device 100E to assist the test value estimation unit 130E in estimating the test value.
  • the biometric information detection unit 210D may be connected to a server device for communication and may acquire blood test results (hemoglobin), urine test results, various test results (melanin, etc.), medical history, etc. from the server device.
  • the biometric information detection unit 210D may estimate biometric information from captured images acquired by the camera device 202D through internal processing, or from captured images acquired by other means.
  • the biometric information detection unit 210D may estimate, for example, burns, trauma, melanin, etc. from captured images.
  • the test value estimation unit 130E estimates the test value based on the environmental information indicating the environment in which the reagent was used, color information, and reagent identification information.
  • the health condition estimation unit 140E estimates the health condition based on the biometric information and test value acquired from the biometric information detection unit 210D.
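The combination of a test value with assistive biometric information, as performed by the health condition estimation unit 140E, could look like the following rule-based sketch. The thresholds and field names are invented for illustration and carry no clinical meaning; a real estimator might instead be a trained model.

```python
def estimate_health_condition(test_value, biometrics):
    """Combine a reagent test value with auxiliary biometric readings."""
    flags = []
    if test_value > 1.0:                                   # hypothetical range limit
        flags.append("test value elevated")
    if biometrics.get("body_temperature_c", 36.5) >= 37.5:  # assistive sensor reading
        flags.append("fever")
    return flags or ["no findings"]
```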
  • the biological detection information obtained by the biological information detection unit 210D is acquired, and the health condition can be estimated based on the corrected color information and the biological information. Furthermore, according to the testing device 100E, the test value can be estimated with high accuracy based on the environmental information indicating the environment in which the reagent was used, the color information, and the reagent identification information.
  • the terminal device and biometric information estimation device in the above-mentioned embodiment may be realized by a computer.
  • a program for realizing this function may be recorded on a computer-readable recording medium, and the program recorded on the recording medium may be read into a computer system and executed to realize the function.
  • the term "computer system" here includes hardware such as an OS and peripheral devices.
  • the term "computer-readable recording medium" refers to portable media such as flexible disks, magneto-optical disks, ROMs, and CD-ROMs, and to storage devices such as hard disks built into a computer system.
  • the term "computer-readable recording medium" may also include a medium that dynamically holds a program for a short period of time, such as a communication line when a program is transmitted via a network such as the Internet or a communication line such as a telephone line, and a medium that holds a program for a certain period of time, such as a volatile memory inside a computer system serving as a server or client in such a case.
  • the above-mentioned program may be a program for realizing part of the above-mentioned function, or may be a program that can realize the above-mentioned function in combination with a program already recorded in the computer system, or may be a program that is realized using a programmable logic device such as an FPGA (Field Programmable Gate Array).
  • the term "part" is used to explain the configuration of the biological information estimation device in the embodiment, but it may be replaced with other terms that represent the technical configuration, structure, function, action, etc. of the biological information estimation device, such as circuit, unit, module, or component.
  • terms such as portion, circuit, unit, module, component, etc. may refer to modules or components that may be stored and/or executed by general-purpose hardware (e.g., computer-readable media, processor devices, etc.) of a computer system, and/or hardware configured to execute software objects or software routines.
  • the embodiment can also be expressed as follows.
  • (1-1) A biological information estimation device comprising: an image acquisition unit for acquiring an image of a living body; a color information acquisition unit that corrects color information included in the captured image and acquires the corrected color information; and a biological information estimation unit that estimates biological information of the living body based on the color information.
  • (1-2) The biological information estimation device described in (1-1), wherein the color information acquisition unit estimates the lighting color of the space in which the biometric subject is imaged based on the captured image, and corrects the white balance of the RGB information contained in the captured image.
  • the image acquisition unit acquires a color chart image obtained by capturing an image of a color chart and a captured image obtained by capturing an image of a living body;
  • the color information acquisition unit creates conversion information for converting color information contained in an image captured of a living body using the color chart image, and converts the color information contained in the captured image based on the conversion information.
  • the color information acquisition unit estimates color information based on output information of a machine learning engine that estimates parameters that affect the color of an image captured when a living body is imaged.
  • (1-5) The biological information estimation device according to (1-1), wherein the color information acquisition unit estimates color information of the biological body based on camera information included in the captured image.
  • a motion information acquisition unit that acquires motion information of the living body when the captured image is captured, the color information acquisition unit acquires color information of the living body based on the captured image and the motion information;
  • the image acquisition unit acquires a captured image of a patient as the living body,
  • the color information acquisition unit corrects the color information of the patient
  • the biological information estimation unit estimates biological information of the patient.
  • the biological information estimation device according to (1-1).
  • (1-10) A biological information estimation method comprising: a step in which the biological information estimation device acquires an image of a living body; a step in which the biological information estimation device corrects color information included in the captured image and acquires the corrected color information; and a step in which the biological information estimation device estimates biological information of the living body based on the color information.
  • (1-11) A program that causes the computer of an information processing device to execute: acquiring an image of a living body; correcting color information included in the captured image and acquiring the corrected color information; and estimating biological information of the living body based on the color information.
  • (2-1) An inspection device comprising: an image acquisition unit for acquiring an image of the reagent; a reagent identification information acquisition unit that acquires reagent identification information for identifying the reagent; a color information acquisition unit that corrects color information included in the captured image and acquires the corrected color information; and a test value estimation unit that estimates a test value of the reagent based on the reagent identification information and the color information.
  • (2-2) The inspection device described in (2-1), wherein the reagent identification information is information for identifying a type of the reagent, and the test value estimation unit estimates the test value of the reagent based on the type of the reagent and the color information.
  • (2-3) The testing device according to (2-1), wherein the color information acquisition unit estimates the illumination color of the space in which the reagent is imaged based on the captured image, and corrects the white balance of the RGB information contained in the captured image.
  • the image acquisition unit acquires a color chart image obtained by capturing an image of a color chart and an image of a reagent;
  • the testing device described in (2-1) wherein the color information acquisition unit creates conversion information for converting color information contained in an image captured of a reagent using the color chart image, and converts the color information contained in the captured image based on the conversion information.
  • the test value estimation unit inputs the captured image acquired by the image acquisition unit and the reagent identification information acquired by the reagent identification information acquisition unit to a machine learning engine that has learned the captured image of the reagent, the reagent identification information, and the test value, and outputs the test value based on the output from the machine learning engine.
  • a biological information detection unit for acquiring biological information detected from a user of the reagent, the health condition estimation unit estimates the health condition based on the biological information acquired by the biological information detection unit, the test value estimated by the test value estimation unit, and the reagent identification information;
  • a testing method comprising:
  • (2-12) A program that causes the computer of an information processing device to execute: acquiring an image of the reagent; acquiring reagent identification information for identifying the reagent; correcting color information included in the captured image and acquiring the corrected color information; and estimating a test value of the reagent based on the reagent identification information and the color information.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Chemical & Material Sciences (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Biochemistry (AREA)
  • Biophysics (AREA)
  • General Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Immunology (AREA)
  • Epidemiology (AREA)
  • Analytical Chemistry (AREA)
  • Primary Health Care (AREA)
  • Hematology (AREA)
  • Medicinal Chemistry (AREA)
  • Plasma & Fusion (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Urology & Nephrology (AREA)
  • Food Science & Technology (AREA)
  • Chemical Kinetics & Catalysis (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Image Analysis (AREA)

Abstract

One aspect of the present disclosure concerns a biological information inference device comprising: an image acquisition unit that acquires a captured image obtained by imaging a subject; a preprocessing unit that performs prescribed preprocessing on the captured image; an extraction unit that extracts color information of the subject from the captured image preprocessed by the preprocessing unit; an inference unit that infers, based on the color information and subject information indicating information about the subject, a biological information value, which is a value indicating biological information; and an output unit that outputs the biological information value. The preprocessing in the preprocessing unit includes color correction processing for correcting the color of the subject included in the captured image so as to approach the actual color of the subject.
PCT/JP2024/041090 2023-11-28 2024-11-20 Dispositif d'inférence d'informations biologiques, procédé d'inférence d'informations biologiques et programme Pending WO2025115714A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2023-200846 2023-11-28
JP2023200846 2023-11-28
JP2023200410 2023-11-28
JP2023-200410 2023-11-28

Publications (1)

Publication Number Publication Date
WO2025115714A1 true WO2025115714A1 (fr) 2025-06-05

Family

ID=95896525

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2024/041090 Pending WO2025115714A1 (fr) 2023-11-28 2024-11-20 Dispositif d'inférence d'informations biologiques, procédé d'inférence d'informations biologiques et programme

Country Status (1)

Country Link
WO (1) WO2025115714A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010281728A (ja) * 2009-06-05 2010-12-16 Nippon Telegr &amp; Teleph Corp &lt;Ntt&gt; Gas concentration measuring device and gas concentration measuring method
JP2015010943A (ja) * 2013-06-28 2015-01-19 株式会社デンケン Coloration measurement device and coloration measurement program
JP2016505301A (ja) * 2012-12-10 2016-02-25 Koninklijke Philips N.V. Medical device or system for measuring hemoglobin levels during an accident using a camera-projector system
JP2021058361A (ja) * 2019-10-04 2021-04-15 株式会社日立製作所 Biological information acquisition device and program
JP2021177181A (ja) * 2013-01-07 2021-11-11 Ixensor Co., Ltd. Test strip and test strip reading method
JP2022001824A (ja) * 2018-09-27 2022-01-06 株式会社トライアンドイー Infrared thermal image correction display method and determination method using the same
JP2023040854A (ja) * 2021-09-10 2023-03-23 凸版印刷株式会社 Spectral information image estimation device, spectral information image estimation method, and program


Similar Documents

Publication Publication Date Title
US11266345B2 (en) Apparatus for visualization of tissue
US10285624B2 (en) Systems, devices, and methods for estimating bilirubin levels
JP6545658B2 (ja) ビリルビンレベルを推定すること
US20170164888A1 (en) Organ imaging device
US20150044098A1 (en) Hyperspectral imaging systems, units, and methods
KR200470398Y1 (ko) 소변검사용 스트립
JP7062926B2 (ja) 呈色反応検出システム、呈色反応検出方法及びプログラム
CN114269231A (zh) 使用一组机器学习诊断模型来确定基于患者肤色的诊断
US20170311872A1 (en) Organ image capture device and method for capturing organ image
US20220095998A1 (en) Hyperspectral imaging in automated digital dermoscopy screening for melanoma
US12368812B2 (en) Image display system and image display method
CN107072546A (zh) 健康度输出装置、健康度输出系统以及程序
JP2023502369A (ja) 体液中の分析物の分析的決定のための調整方法
JP2020178903A (ja) 皮膚内傷害検査装置、皮膚内傷害検査システム
CN106462926A (zh) 健康度判定装置以及健康度判定系统
WO2025115714A1 (fr) Biological information inference device, biological information inference method, and program
JP2016198140A (ja) 器官画像撮影装置
US20160210746A1 (en) Organ imaging device
KR20220041770A (ko) 스마트 디바이스 기반 생체정보 측정 시스템
JP2022000623A (ja) カラーチャート
JP2005094185A (ja) 画像処理システム、画像処理装置、および撮像制御方法
US20160242678A1 (en) Organ image photographing apparatus
JP2024019863A (ja) 健康情報評価システム及び健康情報評価方法
US20250000237A1 (en) Advanced skyn
WO2025004840A1 (fr) Information processing device, information processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24897400

Country of ref document: EP

Kind code of ref document: A1