
WO2023074823A1 - Heart sound acquisition device, heart sound acquisition system, heart sound acquisition method, and program


Info

Publication number
WO2023074823A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
auscultation
user
heart sound
sound acquisition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2022/040255
Other languages
English (en)
Japanese (ja)
Inventor
貴之 内田
知紀 八田
亮 市川
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Terumo Corp
Original Assignee
Terumo Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Terumo Corp filed Critical Terumo Corp
Publication of WO2023074823A1 publication Critical patent/WO2023074823A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/02: Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
    • A61B 7/00: Instruments for auscultation
    • A61B 7/02: Stethoscopes
    • A61B 7/04: Electric stethoscopes

Definitions

  • The present disclosure relates to a heart sound acquisition device, a heart sound acquisition system, a heart sound acquisition method, and a program.
  • A heart sound sensor that measures heart sounds should be worn at the same position every time, because the output waveform differs depending on the position where the sensor is worn.
  • Since heart sounds are to be measured at home, it is desirable that the sensor be of a form that is easy for the patient to wear.
  • A known electronic auscultation system includes a position acquisition unit that acquires position information indicating the position of a heart sound sensor abutted against the body to auscultate a patient's heart sounds (see, for example, Patent Document 1).
  • In such a system, the position on the body contacted by the heart sound sensor is recorded at the time of examination by a doctor, and when the user acquires heart sounds from the next time onward, guidance information based on the recorded position information can be provided to guide the sensor.
  • Patent Document 1 proposes a method of using a laser, a projector, a head-mounted display, or the like to guide the heart sound sensor to the correct auscultation position.
  • In these approaches, however, a third party such as a doctor visually determines the position of the heart sound sensor.
  • An object of the present disclosure, which focuses on these points, is to provide a heart sound acquisition device, a heart sound acquisition system, a heart sound acquisition method, and a program that enable a user to easily position an auscultation unit at a predetermined position to acquire heart sounds.
  • A heart sound acquisition device as one aspect of the present disclosure includes: an auscultation unit configured to be capable of acquiring heart sounds of a user; an imaging unit that captures an image of the user's body including a feature point; a storage unit that stores the position where the auscultation unit should be placed as a position relative to the feature point; a display unit that displays the image captured by the imaging unit; and a control unit that recognizes the position of the user's feature point from the captured image, calculates the position where the auscultation unit should be placed based on the position of the feature point and the relative position stored in the storage unit, and guides the auscultation unit to that position on the image displayed on the display unit.
  • The auscultation unit includes a marker, and the control unit recognizes the position of the auscultation unit on the image by detecting the marker from the image captured by the imaging unit.
  • The control unit is configured to be able to recognize, from the image captured by the imaging unit, at least one of the distance between the imaging unit and the user's body and the orientation of the user's body.
  • The heart sound acquisition device further includes a communication unit configured to be able to communicate with a server, and the control unit is configured to acquire the relative position to be stored in the storage unit from the server.
  • The heart sound acquisition device further includes a speaker, and the control unit uses sound from the speaker when guiding the auscultation unit to the position where it should be placed.
  • When guiding the auscultation unit to the position where it should be placed, the control unit superimposes at least one of characters and graphics on the image captured by the imaging unit and displays the result on the display unit.
  • The auscultation unit includes a pressure sensor that detects the pressure between the auscultation unit and the user's body, and the control unit determines the contact state of the auscultation unit with respect to the user's body based on the pressure detected by the pressure sensor.
  • The heart sound acquisition device includes a notification unit that notifies the user that the auscultation unit is positioned at the position where it should be placed.
  • The heart sound acquisition device includes a conversion unit that converts the heart sounds acquired by the auscultation unit into an electrical signal.
  • The control unit determines the contact state of the auscultation unit with respect to the user's body based on the waveform of the electrical signal into which the heart sounds are converted by the conversion unit.
  • The control unit is configured to analyze the waveform of the electrical signal into which the heart sounds are converted by the conversion unit.
  • The heart sound acquisition device further includes a physical information acquisition unit configured to be capable of acquiring physical information of the user other than the heart sounds, and the control unit temporally synchronizes the waveform of the electrical signal into which the heart sounds are converted by the conversion unit with the waveform of the physical information acquired by the physical information acquisition unit.
  • The physical information includes at least one of an electrocardiogram and a pulse wave.
  • The control unit calculates a hemodynamic parameter based on the user's heart sounds acquired from the auscultation unit and the user's physical information acquired from the physical information acquisition unit.
  • A heart sound acquisition system as one aspect of the present disclosure includes: a server that stores, relative to a feature point of a user, the position where an auscultation unit configured to be capable of acquiring the user's heart sounds should be placed; and a heart sound acquisition device including the auscultation unit, an imaging unit that captures an image of the user's body including the feature point, a display unit that displays the image captured by the imaging unit, a communication unit that acquires the relative position from the server, and a control unit that recognizes the position of the user's feature point from the captured image, calculates the position where the auscultation unit should be placed based on the position of the feature point and the relative position acquired from the server, and guides the auscultation unit to that position on the image displayed on the display unit.
  • A heart sound acquisition method as one aspect of the present disclosure is executed by a control unit to acquire heart sounds using an auscultation unit configured to be capable of acquiring heart sounds of a user. In the method, the position of a feature point of the user is recognized from an image of the user's body captured by an imaging unit, the position where the auscultation unit should be placed is calculated based on the position of the feature point and the stored position of the auscultation unit relative to the feature point, and the auscultation unit is guided to that position on the image displayed on a display unit.
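The core calculation described above — applying a clinic-recorded offset to a feature point detected in the current image — can be sketched as follows. This is an illustrative reading of the method, not code from the disclosure; the function name, coordinate convention, and the `scale` correction are all assumptions.

```python
# Hypothetical sketch: the target placement position is the stored offset
# (recorded relative to a body feature point) applied to the feature point
# detected in the current camera frame. Names and units are illustrative.

def placement_position(feature_xy, relative_offset, scale=1.0):
    """Pixel position where the auscultation unit should be placed.

    feature_xy      -- (x, y) of the detected feature point, e.g. an acromion
    relative_offset -- (dx, dy) recorded at the medical institution
    scale           -- correction when the camera distance differs from the
                       distance at recording time (1.0 = unchanged)
    """
    fx, fy = feature_xy
    dx, dy = relative_offset
    return (fx + dx * scale, fy + dy * scale)
```

The `scale` parameter reflects the embodiment's mention of recognizing the distance between the imaging unit and the body; how the device actually normalizes for distance is not specified.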
  • A program as one aspect of the present disclosure is a program for acquiring heart sounds using an auscultation unit configured to be capable of acquiring heart sounds of a user. The program causes a processor provided in the heart sound acquisition device to execute: a process of capturing, with an imaging unit, an image of the user's body including a feature point; a process of recognizing the position of the user's feature point from the captured image; a process of calculating the position where the auscultation unit should be placed based on the position of the feature point and the relative position, stored in a storage unit, of the placement position with respect to the feature point; and a process of guiding the auscultation unit to the calculated position on the image displayed on a display unit.
  • According to the present disclosure, the user can easily position the auscultation unit at a predetermined position to acquire heart sounds.
  • FIG. 1 is a schematic configuration diagram showing an example of a hemodynamic monitoring system including a heart sound acquisition device according to one embodiment.
  • FIG. 2 is a diagram for explaining an example of usage of the hemodynamic monitoring system of FIG.
  • FIG. 3 is a schematic configuration diagram showing an example of the heart sound acquisition device of FIG. 1.
  • FIG. 4 is a perspective view showing an example of the appearance of the main body shown in FIG. 3.
  • FIG. 5 is a functional block diagram showing an example of the control unit in FIG. 3.
  • FIG. 6 is a perspective view showing an example of the appearance of the auscultation unit of FIG. 3.
  • FIG. 7 is a flow chart for determining the position of the auscultation unit in a medical institution.
  • FIG. 8 is a diagram illustrating a method of determining the position of an auscultation unit in a medical institution.
  • FIG. 9 is a flowchart for explaining the procedure for calculating hemodynamic parameters.
  • FIG. 10 is a flow chart for explaining the process of placing the auscultation unit in FIG.
  • FIG. 11 is a diagram illustrating an example of a method of arranging the auscultation unit when the user is at home.
  • the hemodynamic monitoring system 10 is a system for remotely monitoring the condition of a user who is a heart failure patient discharged from a medical institution.
  • the hemodynamic monitoring system 10 acquires the user's electrocardiogram, pulse wave, and heart sound data, and analyzes the patient's hemodynamics.
  • Hemodynamic parameters indicating hemodynamics include left ventricular pressure and pulmonary artery pressure.
  • Hemodynamic monitoring is performed in cooperation between the medical institution and the user, for example, according to the procedure shown in FIG. 2.
  • A physician treating a heart failure patient decides to perform remote monitoring of the patient (user) after discharge.
  • a doctor at a medical institution specifies the position where the auscultation unit should be placed when acquiring the patient's heart sound, and stores it in the storage unit of the monitoring device or the server.
  • the user borrows home equipment for hemodynamic monitoring from a medical institution or rental company, and measures heart sounds, electrocardiograms, and pulse waves at home.
  • Heart sounds, electrocardiograms and pulse waves are included in physical information. For example, the measurement is performed periodically at a time determined by a doctor every day.
  • Heart sounds are measured by the user placing the auscultation unit at a position stored in the device or server by the doctor. The user sends the measurement results to the medical institution with the home device.
  • The user receives the prescription from the doctor and takes medicine based on the changed prescription.
  • The hemodynamic monitoring system 10 monitors changes in the state of the heart failure patient and, when necessary, quickly changes the prescription or the like, thereby reducing the possibility that the user's condition will worsen and the user will be re-hospitalized.
  • The hemodynamic monitoring system 10 includes a main unit 11 arranged on the user side, an auscultation unit 12, electrodes 13, and an arm band device 14 connected to the main unit 11, and a medical institution system 16 that can communicate with the main unit 11 via a communication network 17.
  • the medical institution system 16 is an information system within the medical institution and includes computers such as servers within the medical institution.
  • The electrodes 13 and the arm band device 14, which acquire physical information other than heart sounds, i.e., electrocardiogram and pulse wave information, are included in the physical information acquisition unit.
  • the main unit 11 is a computer that acquires and analyzes physical information.
  • The auscultation unit 12 is a sensor that acquires the user's heart sounds.
  • the auscultation unit 12 can be rephrased as a heart sound sensor.
  • the electrodes 13 are electrodes attached to a site such as a wrist and an ankle in order to obtain an electrocardiogram of the user.
  • the electrodes 13 can detect minute electricity generated in the heart.
  • the arm band device 14 wraps an arm band around the user's upper arm or the like and sends air into the arm band to compress blood vessels and measure pulse waves transmitted to the blood vessels due to heartbeats.
  • Main unit 11, auscultation unit 12, electrodes 13, and arm band device 14 are connected by wire or wirelessly so that information can be transmitted and received.
  • the main unit 11 can analyze each waveform of heart sounds, electrocardiograms, and pulse waves.
  • The main unit 11 may store a machine-learned model that takes heart sounds, an electrocardiogram, and a pulse wave as input data and outputs a patient's hemodynamic parameters. Based on this learned model, the main unit 11 may estimate hemodynamic parameters and changes in the patient's condition from the input heart sounds, electrocardiogram, and pulse wave.
  • the hemodynamic monitoring system 10 may be further connected with the server 18.
  • the server 18 may store relative position information regarding the position where the auscultation unit 12 should be placed for each user determined by a doctor at the medical institution.
  • the relative position information is information indicating the relative position of the position where the auscultation unit 12 should be placed with respect to the feature points of the user's body.
  • the main unit 11 may read the relative position information of the auscultation unit 12 from the server 18 when acquiring the user's heart sounds.
  • The server 18 may be located at a location separate from the medical institution or may be internal to the medical institution. Further, when the doctor determines the position where the user's auscultation unit 12 should be placed in the medical institution system 16, the relative position information can be registered directly in the storage unit 22 (see FIG. 3) of the main unit 11, which is then passed to the user, without going through the server 18.
  • a heart sound acquisition device 15 of this embodiment includes a main unit 11 and an auscultation unit 12, which are part of the hemodynamic monitoring system 10 shown in FIG.
  • the heart sound acquisition device 15 is not limited to the hemodynamic monitoring system 10, and can be used for other heart sound acquisition applications.
  • a system including the heart sound acquisition device 15 and the server 18 is called a heart sound acquisition system.
  • the hemodynamic monitoring system 10 of FIG. 1 is a heart sound acquisition system.
  • a more detailed configuration of the main unit 11 and the auscultation unit 12 of the heart sound acquisition device 15 will be described below.
  • As shown in FIG. 3, the main unit 11 includes an imaging unit 21, a storage unit 22, a display unit 23, an imaging adjustment unit 24, a control unit 25, a communication unit 26, and a speaker 27. Further, as shown in FIG. 4, the main unit 11 includes a stand 28 capable of adjusting the orientation of the display unit 23.
  • the imaging unit 21 is a camera capable of imaging the user's face and upper body. Therefore, the imaging unit 21 may be arranged above or below the display unit 23 .
  • the imaging unit 21 may include a lens and an imaging element.
  • the imaging device is, for example, a CCD image sensor (Charge-Coupled Device Image Sensor) or a CMOS image sensor (Complementary MOS Image Sensor).
  • the storage unit 22 is a memory that stores data required for processing performed in the main unit 11 and data generated in the main unit 11 .
  • the storage unit 22 may store programs executed by the control unit 25, which will be described later.
  • the storage unit 22 may include, for example, one or more of a semiconductor memory, a magnetic memory, an optical memory, and the like.
  • Semiconductor memory may include volatile memory and non-volatile memory.
  • the storage unit 22 may store information on the position where the auscultation unit 12 should be placed.
  • the position where the auscultation unit 12 should be placed is stored as relative position information indicating the relative position with respect to the feature points of the user's body, as will be described later.
  • When the control unit 25 estimates the user's hemodynamics by machine learning, the storage unit 22 may store the learned model.
  • the display unit 23 displays images under the control of the control unit 25 .
  • the display unit 23 can display an image including at least partially the user's face and upper body imaged by the imaging unit 21, as shown in FIG.
  • a commonly known display can be used for the display unit 23 .
  • the display unit 23 can employ, for example, a liquid crystal display (LCD), an organic EL (Electro-Luminescence) display, an inorganic EL display, a plasma display (PDP: Plasma Display Panel), or the like.
  • the imaging adjustment unit 24 adjusts the imaging direction of the imaging unit 21 under the control of the control unit 25 .
  • the imaging adjustment section 24 may include, for example, a driving section that is incorporated inside the main body section 11 and changes the orientation of the imaging section 21 . Further, the imaging adjustment section 24 may be incorporated in the stand 28 and adjust the orientation of the portion of the main body section 11 including the imaging section 21 .
  • The control unit 25 controls each unit of the main unit 11, and performs various arithmetic processing for positioning the auscultation unit 12 and estimating the user's hemodynamic parameters.
  • The control unit 25 includes one or more processors.
  • The processors include general-purpose processors that load a specific program to execute a specific function, and dedicated processors specialized for specific processing. Dedicated processors include application-specific integrated circuits (ASICs) and programmable logic devices (PLDs). PLDs include FPGAs (field-programmable gate arrays).
  • the control unit 25 may be either SoC (System-on-a-Chip) or SiP (System In a Package) in which one or more processors cooperate.
  • the control unit 25 may include memory built into the processor or memory independent of the processor.
  • the control unit 25 can execute a program that defines control procedures.
  • the control unit 25 may be configured to load and implement a program recorded on a non-transitory computer-readable medium into a memory. Processing performed by the control unit 25 will be further described below with reference to FIG. 5 and the like.
  • the communication unit 26 includes hardware and software for communicating with the medical institution system 16 and the server 18 via the communication network 17.
  • the communication unit 26 corresponds to wired and/or wireless communication means.
  • the communication unit 26 performs processing such as protocol processing related to transmission and reception of information, modulation of transmission signals and demodulation of reception signals.
  • the speaker 27 emits sounds for guiding the user.
  • a known speaker can be used as the speaker 27 .
  • the auscultation unit 12 is a sensor that is brought into contact with the user's own chest to acquire heart sounds.
  • the auscultation unit 12 includes a heart sound acquisition unit 31, a conversion unit 32, a pressure sensor 33, and a vibration unit 34, as shown in FIG.
  • Each component of the auscultation unit 12 may be controlled by the control unit 25 of the main unit 11 .
  • the auscultation section 12 may have a control section (processor) for controlling each component of the auscultation section 12 in cooperation with the control section 25 of the main body section 11 .
  • the main unit 11 and the auscultation unit 12 may be connected by wire or wirelessly.
  • As shown in FIG. 6, the auscultation unit 12 also has a marker 36 on the surface opposite to the portion that contacts the user's chest. Further, the auscultation unit 12 has a handle 37 that the user uses when holding the auscultation unit 12.
  • the heart sound acquisition unit 31 acquires the user's heart sound.
  • the heart sound acquisition unit 31 can be provided on the side of the auscultation unit 12 that contacts the user's chest.
  • the conversion unit 32 converts the heart sounds acquired by the heart sound acquisition unit 31 into electrical signals.
  • the heart sound acquisition unit 31 and the conversion unit 32 may constitute a heart sound sensor that converts heart sounds into electrical signals.
  • Heart sound sensors include, for example, MEMS (Micro Electro Mechanical Systems) heart sound sensors, heart sound sensors using piezoelectric elements, and heart sound sensors using accelerometers. Note that the heart sound acquisition device of the present disclosure is not limited to this embodiment.
  • the heart sound acquisition device of the present disclosure also includes a form of auscultation unit 12 that does not convert heart sounds into electrical signals. The heart sounds are then transmitted to the physician or user as vibrations of air or objects.
  • the pressure sensor 33 is a sensor that detects the pressure with which the auscultation unit 12 is pressed against the user's chest and outputs it as an electrical signal.
  • a pressure sensor 33 using a piezoelectric effect can be used.
  • the pressure sensor 33 may have, for example, a ring shape along the outer circumference of the surface of the auscultation unit 12 that contacts the chest of the user.
  • the pressure sensors 33 may be provided at a plurality of locations on the outer circumference of the surface of the auscultation unit 12 that contacts the user's chest.
  • the vibrating section 34 includes a vibrator that generates vibrations perceptible to humans.
  • Vibrators include those using eccentric motors, those using linear vibrators, and those using piezoelectric elements.
  • The marker 36 is a mark used to specify the position of the auscultation unit 12 from the image captured by the imaging unit 21. It is preferable that the marker 36 allows the control unit 25 to specify the position of the auscultation unit 12 as a single point from the captured image of the auscultation unit 12.
  • the marker 36 is composed of, for example, a pattern including circles. In this case, the control unit 25 can determine that the center of the circle is the position of the auscultation unit 12 .
  • the marker 36 is not limited to this.
  • the marker 36 can be a pattern in which two line segments intersect perpendicularly. In this case, it can be determined that the position of the auscultation unit 12 is the point where the two straight lines intersect.
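One minimal way to reduce such a marker to a single point, as described above, is to threshold the frame for the marker and take the centroid of the resulting mask; a production system would more likely use a proper circle or cross detector (e.g. OpenCV's `HoughCircles`). This sketch is an illustration only and assumes a boolean mask has already been extracted.

```python
import numpy as np

def marker_centroid(mask):
    """Return the (row, col) centroid of a boolean marker mask, or None if
    the marker is not visible in the frame. For a circular marker, the
    centroid of its mask coincides with the circle's center."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return (float(ys.mean()), float(xs.mean()))
```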
  • The handle 37 is provided to prevent the user's hand from coming between the marker 36 and the imaging unit 21 and obstructing imaging of the marker 36 while the user holds the auscultation unit 12 during image capture. It is also provided so that the user can easily adjust the position by holding it with both hands. Various shapes can be adopted for the handle 37.
  • The control unit 25 of the main unit 11 includes functional blocks of an image recognition unit 25a, a camera adjustment unit 25b, a direction guide unit 25c, a contact determination unit 25d, a waveform processing unit 25e, a waveform analysis unit 25f, an estimation unit 25g, and a determination unit 25h.
  • the processing of each functional block may be executed by the same processor or by different processors.
  • the processing of each functional block may be performed by a single software module or may be performed by multiple software modules.
  • the processing of each functional block can be rearranged, separated, or combined. All the functions of each functional block can be regarded as functions of the control unit 25 .
  • the image recognition unit 25a recognizes the user's face, skeleton, etc. from the image captured by the imaging unit 21.
  • the image recognition unit 25a can identify feature points from the recognized user's face, skeleton, and the like. Feature points include, for example, the user's left and right shoulder joints or acromion portions, as well as the eyes, ears, nose, mouth and chin of the face.
  • the image recognition unit 25a can further recognize the marker 36 of the auscultation unit 12 from the image captured by the imaging unit 21 and specify the position of the auscultation unit 12 .
  • the image recognition unit 25a can further recognize at least one of the distance between the imaging unit 21 and the user's body and the direction of the user's body from the image captured by the imaging unit 21.
  • the image recognition unit 25a can recognize when the orientation of the user's body deviates from the correct orientation, for example, the front. Further, when a predetermined range of the user's body is out of the field of view of the imaging section 21, the image recognition section 25a can recognize this.
  • The camera adjustment unit 25b can adjust the imaging orientation of the imaging unit 21 by controlling the imaging adjustment unit 24.
  • the direction guide unit 25c identifies the position where the auscultation unit 12 should be placed based on the positions of the feature points recognized by the image recognition unit 25a and the relative position information stored in the storage unit 22.
  • the relative position information is information indicating the relative position of the position where the auscultation unit 12 should be placed with respect to the feature point.
  • the direction guide unit 25c displays the position where the auscultation unit 12 should be placed on the image displayed on the display unit 23.
  • the position where the auscultation unit 12 should be placed may be displayed by displaying a circular or square mark in a specific color such as red, or by blinking the mark.
  • the direction guide section 25c may further superimpose on the image displayed on the display section 23 to display the direction in which the auscultation section 12 should be moved by graphics such as straight lines and arrows, characters, and the like.
  • the direction in which the auscultation unit 12 should move is the direction toward the position where the auscultation unit 12 should be arranged.
  • the direction guide section 25c may guide the user in the direction in which the auscultation section 12 should be moved by voice from the speaker 27 .
  • The direction guide unit 25c may vibrate the vibration unit 34 to notify the user that the auscultation unit 12 is being moved in the wrong direction. Conversely, the direction guide unit 25c may cause the vibration unit 34 to vibrate when the user moves the auscultation unit 12 in the direction in which it should be moved.
  • When the auscultation unit 12 is aligned with the correct measurement position, the direction guide unit 25c may notify the user by superimposing at least one of characters and graphics on the image displayed on the display unit 23, by emitting sound from the speaker 27, and/or by vibrating the vibration unit 34.
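The alignment check and direction cue behind this guidance can be sketched as below. The tolerance value and the screen-coordinate angle convention are assumptions for illustration, not values from the disclosure.

```python
import math

def guidance(current_xy, target_xy, tol_px=10.0):
    """Decide whether the auscultation unit is aligned, and if not, which
    way to move it.

    Returns (aligned, angle_deg): aligned is True within `tol_px` pixels of
    the target (the unit is at the measurement position); otherwise
    angle_deg gives the direction to move, in degrees, with 0 = +x
    (screen right) and 90 = +y (screen down)."""
    dx = target_xy[0] - current_xy[0]
    dy = target_xy[1] - current_xy[1]
    if math.hypot(dx, dy) <= tol_px:
        return True, None
    return False, math.degrees(math.atan2(dy, dx))
```

The returned angle could drive an on-screen arrow, while the `aligned` flag could trigger the sound or vibration notification described above.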
  • the contact determination unit 25d can determine the contact state of the auscultation unit 12 with the user's body based on the pressure detected by the pressure sensor 33.
  • The contact determination unit 25d may also consider the waveform of the heart sounds when determining the contact state. For example, when the pressure detected by the pressure sensor 33 is equal to or less than a predetermined value, and/or when the waveform analysis unit 25f, described later, cannot detect at least one of the first heart sound (S1) and the second heart sound (S2) from the heart sound waveform, the contact determination unit 25d may determine that the pressing of the auscultation unit 12 against the user's body is insufficient.
  • In that case, the contact determination unit 25d can guide the user to increase the pressing pressure of the auscultation unit 12 using at least one of the image displayed on the display unit 23 and sound from the speaker 27.
  • The auscultation unit 12 may be provided with a lamp that indicates that the pressing force of the auscultation unit 12 is insufficient.
  • The contact determination unit 25d may control lighting of this lamp according to the output of the pressure sensor 33.
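The two conditions above (pressure at or below a threshold, or a missing S1/S2) combine into a simple predicate, sketched here with an arbitrary placeholder threshold; units and the threshold value are not specified in the disclosure.

```python
def contact_sufficient(pressure, s1_found, s2_found, pressure_min=0.5):
    """True when the auscultation unit is judged to be pressed well enough:
    the measured pressure exceeds the threshold AND both the first (S1)
    and second (S2) heart sounds were detected in the waveform. Any
    failing condition marks the contact as insufficient."""
    return pressure > pressure_min and s1_found and s2_found
```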
  • The waveform processing unit 25e removes noise from the waveform of the heart sounds acquired by the auscultation unit 12 and converted into an electrical signal by the conversion unit 32. For example, the waveform processing unit 25e applies filtering to the heart sound electrical signal to remove noise not derived from heartbeats, such as environmental sounds and breathing sounds. When the main unit 11 acquires electrocardiogram and pulse wave signals from the electrodes 13 and the arm band device 14, the waveform processing unit 25e may also remove noise from the waveforms of these signals.
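As a concrete, simplified example of such noise removal, a band-pass filter can suppress components outside the band where most heart-sound energy lies. The 25–200 Hz band and the FFT-based approach here are illustrative choices, not the device's actual filter design.

```python
import numpy as np

def bandpass(signal, fs, lo=25.0, hi=200.0):
    """Zero out spectral components outside [lo, hi] Hz and reconstruct.

    A real device would more likely use an IIR/FIR filter; this
    frequency-domain version keeps the sketch short and dependency-free
    beyond NumPy.  fs is the sampling rate in Hz."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[(freqs < lo) | (freqs > hi)] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))
```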
  • the waveform analysis unit 25f analyzes the heart sound waveform.
  • the waveform analysis unit 25f extracts feature amounts from the waveform of the heart sound from which noise has been removed by the waveform processing unit 25e.
  • the waveform analysis unit 25f extracts, for example, the timings of the first heart sound (I sound) and the second heart sound (II sound).
  • the waveform analysis unit 25f may also analyze accentuation and splitting of the I sound and the II sound.
  • the waveform analysis unit 25f further analyzes the waveforms of the electrocardiogram and pulse wave.
  • the waveform analysis unit 25f extracts, for example, the timing, width, and magnitude of the Q wave, R wave, and S wave from the electrocardiogram waveform.
  • the waveform analysis unit 25f can extract, for example, the timing and duration of diastole and systole from the waveform of the pulse wave.
  • the waveform analysis unit 25f can time-synchronize and analyze the waveform of the electrical signal obtained by converting the heart sounds, and the waveforms of the electrocardiogram and the pulse wave obtained by the electrodes 13 and the arm band device 14 .
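A minimal sketch of locating I sound / II sound candidates in a heart sound waveform is shown below, using a rectified moving-average envelope and simple peak picking. The window length, threshold, and minimum peak spacing are illustrative assumptions, not parameters taken from this document.

```python
def heart_sound_peaks(samples, fs, min_gap_s=0.2):
    """Return candidate S1/S2 peak times (in seconds) in a heart sound signal.

    Rectify the signal, smooth it into an envelope with a ~50 ms moving
    average, then pick local maxima above half the envelope maximum that
    are at least min_gap_s apart. All parameters are assumptions.
    """
    win = max(1, int(0.05 * fs))
    rect = [abs(x) for x in samples]
    env = [sum(rect[max(0, i - win):i + 1]) / min(i + 1, win + 1)
           for i in range(len(rect))]
    thr = 0.5 * max(env)
    peaks, last = [], -len(samples)
    for i in range(1, len(env) - 1):
        if env[i] >= thr and env[i] >= env[i - 1] and env[i] > env[i + 1]:
            if i - last >= int(min_gap_s * fs):
                peaks.append(i / fs)
                last = i
    return peaks
```

In a real analysis pipeline the detected peak times would then be paired with ECG and pulse-wave landmarks on a common time axis, as the passage above describes.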
  • the estimation unit 25g calculates hemodynamic parameters based on the feature amount extracted by the waveform analysis unit 25f.
  • the hemodynamic parameters include at least one of left ventricular pressure and pulmonary artery pressure.
  • the estimation unit 25g may estimate the hemodynamic parameters by machine learning as described above.
  • the means by which the estimation unit 25g calculates the hemodynamic parameters is not limited to one using machine learning.
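The document leaves the estimation model open, naming machine learning only as one possibility. As a stand-in, the sketch below uses a plain linear model mapping waveform feature amounts to a single hemodynamic parameter; the feature names, weights, and bias are entirely hypothetical.

```python
def estimate_parameter(features, weights, bias):
    """Linear stand-in for the estimation unit 25g: map extracted
    waveform feature amounts to one hemodynamic parameter value."""
    return bias + sum(w * x for w, x in zip(weights, features))

# hypothetical example: an S1-S2 interval, an ECG-to-S1 delay, and a
# heart rate feed a pressure-like estimate (all numbers invented)
pred = estimate_parameter([0.12, 0.34, 60.0], [10.0, -5.0, 0.2], bias=8.0)
```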
  • the determination unit 25h acquires hemodynamic parameters from the estimation unit 25g.
  • the determination unit 25h continuously monitors hemodynamic parameters and determines changes and/or abnormalities in the heart condition.
  • control unit 25 does not have to execute at least part of the processing of the waveform processing unit 25e, the waveform analysis unit 25f, the estimation unit 25g, and the determination unit 25h. These processes may be performed by the medical institution system 16 instead of the control unit 25 . Also, these processes may be performed by the server 18 provided at a location different from the medical institution.
  • a doctor at a medical institution uses the heart sound acquisition device 15 to bring the auscultation unit 12 into contact with the user's chest and searches for a position where heart sounds can be acquired satisfactorily (step S101).
  • the heart sound acquisition device 15 used at this time may be the same as or different from the device used by the user at home.
  • the control unit 25 of the main unit 11 determines whether heart sounds can be acquired from the auscultation unit 12 (step S102).
  • the control unit 25 can determine that the heart sounds have been acquired when the electrical signals of the heart sounds converted into electrical signals by the conversion unit 32 include predetermined sounds, such as the I sound and/or the II sound.
  • the control unit 25 can determine that the heart sound could not be acquired when the predetermined sound signal in the electrical signal of the heart sound cannot be identified because it is buried in noise. If the heart sound could be acquired (step S102: Yes), the control unit 25 proceeds to the next step S104. If the heart sound could not be acquired (step S102: No), the control unit 25 proceeds to step S103.
  • in step S103, the control unit 25 uses the display unit 23 and/or the speaker 27 to guide the user to correct the position of the auscultation unit 12.
  • the doctor changes the position of the auscultation unit 12 to acquire the heart sound according to the guidance.
  • after step S103, the process returns to step S101, and the doctor again brings the auscultation unit 12 into contact with the user's chest.
  • in the above, the doctor determines the position of the auscultation unit 12 according to guidance from the control unit 25; however, the present disclosure is not limited to this, and the doctor may instead determine the position for acquiring heart sounds by auscultation.
  • if heart sounds can be acquired in step S102, the doctor holds the auscultation unit 12 against the user's chest, and the control unit 25 of the main unit 11 controls the imaging unit 21 to image the user's upper body together with the auscultation unit 12 including the marker 36 (step S104).
  • in step S105, the control unit 25 attempts to recognize the user's face and skeleton in the captured image.
  • if the face and skeleton cannot be recognized (step S105: No), the control unit 25 proceeds to step S106.
  • in step S106, the control unit 25 adjusts the orientation of the imaging unit 21 so that the user is included in the captured image.
  • the control unit 25 may control the imaging adjustment unit 24 to automatically adjust the orientation of the imaging unit 21. If the orientation cannot be adjusted within the adjustable range of the imaging adjustment unit 24, the control unit 25 uses the display unit 23 and the speaker 27 to advise the doctor to move the main unit 11 and change the orientation of the imaging unit 21. The doctor adjusts the orientation of the imaging unit 21 accordingly. After the orientation of the imaging unit 21 has been adjusted, the control unit 25 returns to the process of step S104.
  • if the user's face, skeleton, etc. can be recognized (step S105: Yes), the control unit 25 extracts the user's feature points from the image captured by the imaging unit 21 (step S107).
  • the feature points include, for example, the left shoulder 41 and right shoulder 42, and the eyes, nose, mouth, ears, and chin included in the face 43, as shown in FIG.
  • Left shoulder 41 and right shoulder 42 may be, for example, near the shoulder joint or acromion. These feature points can be distinguished to some extent even if the user is wearing clothes.
  • Feature points may also include the user's clavicle and/or nipple; to recognize these feature points, however, the user must be imaged with the upper body bare.
  • in the present embodiment, the feature points of the left shoulder 41, the right shoulder 42, and the face 43, which are recognizable to some extent even when the user is wearing clothes, are used.
  • the control unit 25 determines two axes based on the feature points (step S108).
  • the control unit 25 determines, for example, the median line and the horizontal line as two axes.
  • the median line is, for example, a line connecting the center point between a pair of symmetrically positioned feature points, such as the eyes, with a central feature point such as the mouth or nose.
  • the midline can be the y-axis.
  • the horizontal line is, for example, a line connecting symmetrically positioned feature points such as the left shoulder 41 and the right shoulder 42 .
  • the horizontal line can be the x-axis perpendicular to the y-axis.
  • the control unit 25 determines the position of the auscultation unit 12 on the image captured by the imaging unit 21 (step S109).
  • the position of the auscultation unit 12 is determined by the position of the marker 36 on the image.
  • the position of the auscultation unit 12 can be expressed as coordinates on the x-axis and y-axis determined in step S108. That is, the position of the auscultation unit 12 is expressed as relative position information indicating the relative position with respect to the feature point on the image.
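The two body axes and the relative position of the auscultation unit can be sketched geometrically as follows. The choice of the eye midpoint as origin and the exact landmark pairing are illustrative assumptions; the document only states that the median line and the shoulder line serve as the y- and x-axes.

```python
import math

def body_frame(left_shoulder, right_shoulder, left_eye, right_eye):
    """Return (origin, unit_x, unit_y) for the body coordinate frame:
    x along the shoulder line, y (median line) perpendicular to it,
    with the midpoint between the eyes used as origin (an assumption)."""
    ox = (left_eye[0] + right_eye[0]) / 2.0
    oy = (left_eye[1] + right_eye[1]) / 2.0
    dx = right_shoulder[0] - left_shoulder[0]
    dy = right_shoulder[1] - left_shoulder[1]
    n = math.hypot(dx, dy)
    ux = (dx / n, dy / n)        # x-axis: along the shoulder line
    uy = (-ux[1], ux[0])         # y-axis: perpendicular to the x-axis
    return (ox, oy), ux, uy

def relative_position(point, origin, ux, uy):
    """Express an image point (e.g. the marker 36) as (x, y) in the frame."""
    px, py = point[0] - origin[0], point[1] - origin[1]
    return (px * ux[0] + py * ux[1], px * uy[0] + py * uy[1])
```

Expressing the marker position in this frame makes it independent of where the user stands in the image, which is what allows the same relative position to be reused at home.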
  • the control unit 25 stores the relative position information of the auscultation unit 12 in the storage unit 22 and/or transmits it to the server 18 (step S110).
  • alternatively, the control unit 25 may transmit the relative position information indicating where the auscultation unit 12 should be placed to the server 18 so that the user can retrieve it from home.
  • in that case, the relative position information is stored in the server 18.
  • before the user uses the heart sound acquisition device 15 at home, the control unit 25 acquires the relative position information of the position where the auscultation unit 12 should be placed from the server 18 via the communication unit 26 and stores it in the storage unit 22.
  • in the above description, the doctor uses, at the medical institution, a device similar to the heart sound acquisition device 15 that the patient uses at home.
  • however, a dedicated device different from the device used by the patient at home may be used to specify the position where the auscultation unit 12 should be placed.
  • the user wears the electrodes 13 for acquiring an electrocardiogram and the arm band device 14 for acquiring a pulse wave (step S201).
  • the user positions the auscultation unit 12 to acquire heart sounds (step S202). Details of the positioning of the auscultation unit 12 are described below with reference to FIG.
  • the procedure shown in the flowchart of FIG. 10 corresponds to the heart sound acquisition method of the present disclosure.
  • the user places the main unit 11 on a desk or the like so that the imaging unit 21 of the heart sound acquisition device 15 is positioned and oriented appropriately for imaging the user's upper body.
  • if the imaging unit 21 is separate from the main unit 11, the user places the imaging unit 21 at an appropriate position.
  • the user takes an image of his or her upper body using the imaging unit 21 (step S301).
  • the control section 25 of the main body section 11 displays the image captured by the imaging section 21 on the display section 23 .
  • the image displayed on the display unit 23 can be a left-right reversed image so that it looks like a mirror image to the user.
  • the imaging section 21 of the main body section 11 may continuously capture images of the user. The following processing may be performed on successively captured images.
  • the control unit 25 of the main unit 11 attempts to recognize the user's face and skeleton in the image captured by the imaging unit 21 (step S302).
  • if the face and skeleton can be recognized (step S302: Yes), the control unit 25 proceeds to the next step S304.
  • if they cannot be recognized (step S302: No), the control unit 25 proceeds to step S303.
  • in step S303, the control unit 25 adjusts the position and orientation of the imaging unit 21.
  • the control section 25 may control the imaging adjustment section 24 to automatically adjust the orientation of the imaging section 21 .
  • otherwise, the control unit 25 uses the display unit 23 and/or the speaker 27 to instruct the user to move the main unit 11 and change the orientation of the imaging unit 21. The user adjusts the position and orientation of the imaging unit 21 accordingly.
  • in step S304, the control unit 25 attempts to display the position where the auscultation unit 12 should be placed, superimposed on the image of the user's upper body displayed on the display unit 23.
  • the control unit 25 extracts feature points included in the image captured by the imaging unit 21 from the user's face and skeleton recognized in step S302. Based on the feature points, the control unit 25 identifies the axes corresponding to the two axes determined in step S108. Based on the relative position information stored in the storage unit 22, the control unit 25 can then specify the position where the auscultation unit 12 should be placed as coordinates on these two axes.
  • the position where the auscultation unit 12 should be placed is displayed as the target position P on the image.
  • the target position P may be displayed on the image with any mark. Since the image of the user's upper body displayed on the display unit 23 is left-right reversed, the target position P, which corresponds to the placement position of the auscultation unit 12 set in advance at the medical institution, is likewise displayed at the left-right reversed position.
  • if the target position P at which the auscultation unit 12 should be placed can be superimposed on the image displayed on the display unit 23 (step S304: Yes), the control unit 25 proceeds to the next step S305. If the target position P cannot be superimposed on the image (step S304: No), the range captured by the imaging unit 21 is considered inappropriate. In this case, the control unit 25 proceeds to step S303 described above and adjusts the position and orientation of the imaging unit 21.
  • in step S304, the distance from the user's body to the imaging unit 21 and the orientation (inclination) of the user's body with respect to the imaging unit 21 may differ from the conditions under which the image was captured at the medical institution.
  • based on the arrangement of the feature points included in the image captured by the imaging unit 21, the control unit 25 can transform the coordinates of the auscultation unit 12 in the image of the user captured at the medical institution into coordinates in the image captured at home.
  • alternatively, the control unit 25 may use the display unit 23 and/or the speaker 27 to guide the user to change position and orientation so that the position, orientation, and size of the user's body substantially match those in the image taken at the medical institution.
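One way to realize such a coordinate conversion is a similarity transform (scale, rotation, and translation) fitted to a pair of corresponding landmarks, such as the two shoulders in the clinic image and in the home image. The complex-number formulation below is only a sketch of this idea, not the method actually specified in the document.

```python
def similarity_from_pairs(a1, a2, b1, b2):
    """Return a function mapping points from frame A (e.g. the clinic
    image) to frame B (e.g. the home image), fitted so that landmark
    a1 maps to b1 and a2 maps to b2."""
    za1, za2 = complex(*a1), complex(*a2)
    zb1, zb2 = complex(*b1), complex(*b2)
    s = (zb2 - zb1) / (za2 - za1)   # combined scale and rotation
    t = zb1 - s * za1               # translation
    def apply(p):
        z = s * complex(*p) + t
        return (z.real, z.imag)
    return apply
```

For instance, with shoulders at (0, 0)/(10, 0) in the clinic image and (5, 5)/(25, 5) in the home image (a 2x scale plus a shift), a clinic-image target of (4, 3) maps to (13, 11).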
  • in step S305, while viewing the image displayed on the display unit 23, the user brings the auscultation unit 12 into contact with the position on his or her own chest corresponding to the position where the target position P is displayed on the display unit 23.
  • the control unit 25 can prompt the user, using the display on the display unit 23 and/or sound from the speaker 27, to bring the auscultation unit 12 into contact with the target position P displayed on the display unit 23.
  • next, the control unit 25 detects the image of the marker 36 included in the image captured by the imaging unit 21 and attempts to recognize the position of the auscultation unit 12 (step S306). If the auscultation unit 12 is recognized (step S306: Yes), the control unit 25 proceeds to the process of step S308. If the auscultation unit 12 cannot be recognized (step S306: No), it is likely outside the recognizable range of the image, and the control unit 25 proceeds to step S307.
  • in step S307, the control unit 25 displays characters on the display unit 23 and/or generates sound from the speaker 27 to inform the user that the auscultation unit 12 is not in the correct position and to guide the user to change its position. Following this guidance, the user moves the auscultation unit 12 placed on the chest. After step S307, the process returns to step S305.
  • in step S308, the control unit 25 guides the movement direction and movement distance of the auscultation unit 12 based on the coordinates of the target position P and the coordinates of the current position of the auscultation unit 12 included in the image captured by the imaging unit 21.
  • the control unit 25 can provide this guidance by superimposing graphics and/or characters on the image displayed on the display unit 23, as well as by sound from the speaker 27 and vibration from the vibration unit 34.
  • for example, the control unit 25 may display "3 cm downward" on the image displayed on the display unit 23, and may also announce the same content by voice.
  • the control unit 25 can also guide the direction in which the auscultation unit 12 should be moved by, for example, blinking or changing the color of the mark indicating the target position P, or by shortening the interval between sounds and vibrations.
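The movement guidance could be generated along these lines; the pixel scale, the 0.5 cm tolerance, the message format, and the convention that image y grows downward are all assumptions for illustration.

```python
def guidance(current, target, px_per_cm, tol_cm=0.5):
    """Turn the gap between the marker position and the target position P
    into a short instruction such as "3 cm downward" (the image y axis
    is assumed to grow downward)."""
    dx = (target[0] - current[0]) / px_per_cm
    dy = (target[1] - current[1]) / px_per_cm
    if abs(dx) <= tol_cm and abs(dy) <= tol_cm:
        return "correct position set"
    if abs(dy) >= abs(dx):                # guide the larger offset first
        return "%.0f cm %s" % (abs(dy), "downward" if dy > 0 else "upward")
    return "%.0f cm %s" % (abs(dx), "right" if dx > 0 else "left")
```

The same offset could equally drive the non-textual cues mentioned above, such as blink rate or vibration interval.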
  • following the guidance, the user moves the auscultation unit 12 from its current position toward the target position P. The control unit 25 determines whether the current position of the auscultation unit 12 matches the target position P (step S309). If they match (step S309: Yes), the control unit 25 proceeds to step S310. If they do not match (step S309: No), the control unit 25 returns to step S308 and continues adjusting the position of the auscultation unit 12.
  • the control unit 25 uses the display unit 23, the speaker 27 and/or the vibration unit 34 to notify the user that the auscultation unit 12 is at the target position P (step S310).
  • for example, the control unit 25 changes the color of the mark indicating the target position P displayed on the display unit 23, generates a voice message such as "correct position set" through the speaker 27, and/or vibrates the vibration unit 34.
  • the display unit 23, the speaker 27 and/or the vibration unit 34 are included in the notification unit that notifies that the auscultation unit 12 is positioned at the position where it should be placed.
  • when the user is notified in step S310 that the auscultation unit 12 is at the target position P, the user fixes the auscultation unit 12 at that position (step S311). That is, the user keeps the hand holding the handle 37 of the auscultation unit 12 still at this position.
  • the control unit 25 acquires the electrical signal of the heart sound from the auscultation unit 12 .
  • the control unit 25 determines whether the heart sounds are correctly acquired from the waveform of the electrical signal of the heart sounds (step S312).
  • the control unit 25 can determine whether the contact state of the auscultation unit 12 with the user's body is good or bad based on the waveform of the heart sound converted into the electrical signal.
  • the control unit 25 may acquire the pressure detected by the pressure sensor 33 in addition to the waveform of the heart sound converted into the electrical signal, and determine the contact state of the auscultation unit 12 with the user's body.
  • if the heart sounds are not correctly acquired (step S312: No), the control unit 25 uses the display unit 23 and the speaker 27 to guide the user to finely adjust the pressing force and/or the position of the auscultation unit 12 (step S313). If the pressing force of the auscultation unit 12 against the user's chest is insufficient, the control unit 25 may notify the user to press harder. If the pressing force against the chest is sufficient, the control unit 25 may guide the user to move the auscultation unit 12 a minute distance from its current position. After step S313, the user fixes the auscultation unit 12 again (step S311), and the control unit 25 performs the process of step S312 again.
  • if the heart sounds are correctly acquired (step S312: Yes), the control unit 25 returns to the processing of the flowchart in FIG. and starts automatic measurement of heart sounds, electrocardiogram, and pulse wave (step S203).
  • the heart sound to be measured is the waveform converted into an electrical signal by the conversion unit 32.
  • the control unit 25 analyzes the heart sound, electrocardiogram, and pulse wave waveforms measured in step S203 (step S204).
  • the control unit 25 can time-synchronize each waveform of the heart sound, the electrocardiogram, and the pulse wave.
  • the control unit 25 can extract feature amounts from each waveform.
  • the control unit 25 calculates hemodynamic parameters based on the analysis results of each waveform in step S204 (step S205).
  • the control unit 25 may calculate the hemodynamic parameters by machine learning in which the feature values of each waveform are used as input parameters and the hemodynamic parameters are output as described above.
  • Hemodynamic parameters include left ventricular pressure and pulmonary artery pressure.
  • the control unit 25 determines whether the hemodynamic parameters calculated in step S205 are normal (step S206). For example, a hemodynamic parameter is determined to be abnormal if it contains values outside the range that could be measured in a human. A parameter is also determined to be abnormal when it cannot be calculated, for example because a measured waveform is distorted and the feature amounts cannot be extracted. If the hemodynamic parameters are determined not to be normal (step S206: No), the control unit 25 returns to step S203 and repeats the measurement. If they are determined to be normal (step S206: Yes), the control unit 25 proceeds to the process of step S207.
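The normality check in step S206 can be sketched as a plausibility test: a parameter is abnormal when it is missing (it could not be calculated from a distorted waveform) or lies outside a physiologically possible range. The limit values below are invented placeholders, not clinical reference values from this document.

```python
def parameters_normal(params, limits):
    """Return True only when every expected hemodynamic parameter is
    present and inside its plausibility range."""
    for name, (lo, hi) in limits.items():
        value = params.get(name)
        if value is None or not (lo <= value <= hi):
            return False
    return True

# hypothetical plausibility limits in mmHg
LIMITS = {"left_ventricular_pressure": (0, 300),
          "pulmonary_artery_pressure": (0, 120)}
```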
  • in step S207, the control unit 25 displays the measurement results including the hemodynamic parameters on the display unit 23, stores them in the storage unit 22, and/or transmits them to the medical institution system 16.
  • the control unit 25 may cause the display unit 23 to display the waveform of the physical information and/or the time-series transition of the hemodynamic parameters as a graph.
  • the control unit 25 may further cause the display unit 23 to display the determination result as to whether the hemodynamic parameters are normal. From the display on the display unit 23, the user can know information such as the change in the measurement results from the previous day, the trend, and whether or not the results are within the reference values.
  • the control unit 25 can further present a response such as "continue measurement” or "contact a doctor urgently" to the user.
  • doctors and/or nurses can continuously monitor the received hemodynamic parameters to check the status of the user, change prescriptions, and the like. After confirming the hemodynamic parameters, the doctor or nurse at the medical institution can send comments or instructions from the medical institution system 16 to the main unit 11 .
  • the control unit 25 of the main unit 11 may display the comment or instruction from the doctor or nurse on the display unit 23 to notify the user.
  • the processing from steps S204 to S207 is assumed to be executed by the control unit 25 of the main unit 11.
  • alternatively, the hemodynamic monitoring system 10 may be configured such that the main unit 11 transmits the results measured in step S203 to the medical institution system 16, and the medical institution system 16 analyzes the measurement results and calculates the hemodynamic parameters.
  • as described above, in the heart sound acquisition device 15 of the present embodiment, the position where the auscultation unit 12 should be placed is determined in advance by the doctor, and this position is stored in the storage unit 22 as a relative position with respect to the feature points of the user's body. The control unit 25 then guides the user to the position where the auscultation unit 12 should be placed on the image displayed on the display unit 23, based on the user's feature points included in the captured image and the stored relative position. This allows the user to easily position the auscultation unit 12 and acquire heart sounds.
  • the patient himself or herself can thus place the auscultation unit 12 at the same position every time and acquire a stable heart sound waveform. This eliminates the need for support from doctors, family members, caregivers, and the like, and makes it easier to continue measurement even when family members or caregivers are absent. Furthermore, since the heart sound waveform can be acquired stably, the accuracy of the algorithm for analyzing the heart sound waveform can be improved.
  • the heart sound acquisition device 15 of the present embodiment displays the image captured by the imaging unit 21 on the display unit 23 and guides the user to the position where the auscultation unit 12 should be placed on that image, which is intuitive and easy to understand. The heart sound acquisition device 15 is therefore easy for the patient to use and makes it easy to continue the measurement.
  • the left shoulder, right shoulder, and face are used as feature points of the user's body used for positioning the auscultation unit 12 . This makes it possible to extract feature points and acquire heart sounds even when the user is wearing clothes.
  • since the heart sound waveform as well as the electrocardiogram and pulse wave can be acquired synchronously and stably, hemodynamic parameters can be monitored non-invasively. This makes it possible to grasp the state of a heart failure patient at home and to respond when that state changes.
  • the heart sound acquisition device 15 is used to monitor the hemodynamic parameters of the user, who is a patient, at home.
  • the heart sound acquisition device can be installed in a medical facility to acquire heart sounds under the same conditions each time for each patient.
  • the storage unit 22 stores relative position information regarding the position where the auscultation unit 12 should be placed for each patient.
  • the control unit 25 reads the relative position information according to the patient from the storage unit 22 and positions the auscultation unit 12 with respect to each patient.
  • the display unit 23 may be configured to display an image that is not horizontally reversed toward the doctor side.
  • an external camera can be used as the imaging unit 21 instead of the camera built into the main unit 11 .
  • for example, it is possible to connect a smartphone to the main unit 11 and use the camera of the smartphone as the imaging unit 21.
  • this offers advantages such as eliminating the need for a camera in the main unit 11, easier adjustment of the camera orientation since the camera is separate from the main unit 11, and the ability to use the two screens of the main unit 11 and the smartphone.
  • a smartphone can be used as the main unit 11.
  • a camera of a smart phone is used as the imaging unit 21 in FIG.
  • a built-in memory of the smartphone is used as the storage unit 22 .
  • a smartphone display is used as the display unit 23 .
  • a processor of a smartphone is used as the control unit 25 .
  • a wireless communication function of a smartphone is used as the communication unit 26 .
  • a speaker built into the smartphone is used as the speaker 27 .
  • a dedicated stand capable of adjusting the orientation of the display of the smartphone may be prepared in cooperation with the smartphone.
  • An external connection terminal of the smartphone can be used to connect the auscultation unit 12 and other sensors that acquire body information.
  • the smartphone stores in its memory an application for causing the processor of the smartphone to function as the control unit 25 of the heart sound acquisition device 15 of the present disclosure.
  • the user activates this application.
  • the following operations can be performed in the same manner as the heart sound acquisition device 15 of the above embodiment. This makes it possible to acquire heart sounds without preparing special hardware.
  • the imaging unit 21 may include an infrared camera capable of imaging light with wavelengths in the infrared region. Infrared radiation can partially penetrate clothing. Since the imaging unit 21 includes an infrared camera, it is possible to capture an image of the user's body even if the user is wearing clothes, and extract feature points more accurately. As a result, even if the feature points used for positioning the auscultation unit 12 include a part that is hidden under the clothes, the user can position the auscultation unit 12 while wearing the clothes. become.
  • 10 hemodynamic monitoring system
  • 11 main unit
  • 12 auscultation unit
  • 13 electrode (physical information acquisition unit)
  • 14 arm band device (physical information acquisition unit)
  • 15 heart sound acquisition device
  • 16 medical institution system
  • 17 communication network
  • 18 server
  • 21 imaging unit
  • 22 storage unit
  • 23 display unit
  • 24 imaging adjustment unit
  • 25 control unit
  • 26 communication unit
  • 27 speaker
  • 28 stand
  • 31 heart sound acquisition unit
  • 32 conversion unit
  • 33 pressure sensor
  • 34 vibration unit
  • 36 marker
  • 37 handle
  • 41 left shoulder (feature point)
  • 42 right shoulder (feature point)
  • 43 face
  • P target position


Abstract

This heart sound acquisition device comprises an auscultation unit, an imaging unit, a storage unit, a display unit, and a control unit. The auscultation unit is configured to be able to acquire a user's heart sounds. The imaging unit captures an image of the user's body including feature points. The storage unit stores the relative position, with respect to the user's feature points, of the location where the auscultation unit should be placed. The display unit displays the image captured by the imaging unit. The control unit recognizes the positions of the user's feature points from the image captured by the imaging unit, calculates the position where the auscultation unit should be placed on the basis of the feature point positions and the relative position stored in the storage unit, and guides the auscultation unit to that position on the image displayed on the display unit.
PCT/JP2022/040255 2021-10-28 2022-10-27 Heart sound acquisition device, heart sound acquisition system, heart sound acquisition method, and program Ceased WO2023074823A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-176898 2021-10-28
JP2021176898 2021-10-28

Publications (1)

Publication Number Publication Date
WO2023074823A1 true WO2023074823A1 (fr) 2023-05-04

Family

ID=86159948

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/040255 2021-10-28 2022-10-27 Heart sound acquisition device, heart sound acquisition system, heart sound acquisition method, and program Ceased WO2023074823A1 (fr)

Country Status (1)

Country Link
WO (1) WO2023074823A1 (fr)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
  • JP2010525363A (ja) * 2007-04-23 2010-07-22 Samsung Electronics Co., Ltd. Remote medical diagnosis system and method
  • WO2013089072A1 (fr) * 2011-12-13 2013-06-20 Sharp Kabushiki Kaisha Information management device, information management method, information management system, stethoscope, information management program, measurement system, control program, and recording medium
  • JP2013123493A (ja) * 2011-12-13 2013-06-24 Sharp Corporation Information processing device, stethoscope, control method for information processing device, control program, and recording medium
  • JP2017000198A (ja) * 2015-06-04 2017-01-05 Nihon Kohden Corporation Electronic auscultation system
  • US20170185737A1 (en) * 2014-09-12 2017-06-29 Gregory T. Kovacs Physical examination method and apparatus
  • JP2017136248A (ja) * 2016-02-04 2017-08-10 Iwate Prefectural University Auscultation system
  • JP2021010576A (ja) * 2019-07-05 2021-02-04 Murata Manufacturing Co., Ltd. Cardiac monitor system and synchronization device


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2025247569A1 (fr) * 2024-05-31 2025-12-04 Ibf Medical Device and method for assisted anatomical localization
FR3162614A1 (fr) * 2024-05-31 2025-12-05 Ibf Medical Device and method for assisted anatomical localization

Similar Documents

Publication Publication Date Title
US12414703B2 (en) Medical device system for remote monitoring and inspection
JP7132853B2 Method and apparatus for determining at least one of the position and orientation of a wearable device on a subject
US9854976B2 (en) Pulse wave velocity measurement method
US10635782B2 (en) Physical examination method and apparatus
JP7247319B2 Systems and methods for monitoring head, spine, and body health
KR102265934B1 Method and apparatus for measuring pulse wave signal and stress using a mobile terminal
JP6381918B2 Motion information processing apparatus
Sun et al. Photoplethysmography revisited: from contact to noncontact, from point to imaging
Hernandez et al. Biophone: Physiology monitoring from peripheral smartphone motions
CN104486989A CPR team performance
JP6692420B2 Auxiliary device for blood pressure measurement and blood pressure measurement apparatus
JP2017512510A Body position optimization and biosignal feedback for smart wearable devices
JP2014136137A Medical information processing apparatus and program
CA2906856A1 Hearing devices employing sensors for acquiring physiological characteristics
JP7209954B2 Nystagmus analysis system
KR20130010207A Watch-type health state analysis system based on unconstrained, unconscious biosignal acquisition
CN112911989B Mobile monitoring measurement method, mobile monitoring device, system, and storage medium
US20240398358A1 (en) Wearable device for collecting physiological health data
US20210321886A1 (en) Portable monitoring apparatus, monitoring device, monitoring system and patient status monitoring method
JP2010188159A Body motion sensing device
TWM485701U Wearable article with medical care functions
KR20160108967A Biosignal measuring apparatus and biosignal measuring method using the same
WO2023074823A1 Heart sound acquisition device, heart sound acquisition system, heart sound acquisition method, and program
US11232866B1 (en) Vein thromboembolism (VTE) risk assessment system
CN113015479B Mobile monitoring apparatus, monitoring device, monitoring system, and patient status monitoring method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 22887148
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 22887148
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: JP