
WO2021044150A1 - Systems and methods for analysing breathing - Google Patents


Info

Publication number
WO2021044150A1
WO2021044150A1 (PCT/GB2020/052112)
Authority
WO
WIPO (PCT)
Prior art keywords
breathing
user
classified
data
sensor data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/GB2020/052112
Other languages
French (fr)
Inventor
George Edward WINFIELD
Yasin COTUR
Francesco GUAGLIARDO
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Spyras Ltd
Original Assignee
Spyras Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Spyras Ltd filed Critical Spyras Ltd
Publication of WO2021044150A1


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/08 Measuring devices for evaluating the respiratory organs
    • A61B 5/0816 Measuring devices for examining respiratory frequency
    • A61B 5/0803 Recording apparatus specially adapted therefor
    • A61B 5/0002 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7203 Signal processing for noise prevention, reduction or removal
    • A61B 5/7235 Details of waveform analysis
    • A61B 5/725 Details of waveform analysis using specific filters therefor, e.g. Kalman or adaptive filters

Definitions

  • the present techniques generally relate to systems, apparatus and methods for monitoring health, and in particular relate to monitoring health by sensing and analysing breathing.
  • breathing rate has been described as one of the most sensitive and important indicators of the deterioration of patient health.
  • breathing rate is monitored by occasional visual assessment, e.g. by observing the rise and fall of a patient's chest for 30 seconds every 12 hours.
  • some medical cases where breathing rate, and changes in breathing rate, have not been observed have led to avoidable patient death.
  • GB2550833 which describes techniques for analysing breathing data of the type obtained using structured light plethysmography (i.e. by projecting a pattern of light onto a patient) to identify breaths representing a breathing pattern over time
  • WO2012/007719 which describes a method of identifying a breath by sensing a signal generated by a human
  • EP0956820 which describes breath analysis apparatus that comprises a spirometer and breath tube
  • US2013/079656 which describes extracting respiratory information from a photoplethysmograph signal
  • JP2019141597 which describes a signal processing device
  • WO2014/128090 which describes a respiration monitoring system that can be adhered to a patient's torso.
  • a health monitoring system comprising: an apparatus comprising: a sensor for sensing breathing of a user using the apparatus, and a communication module for transmitting sensor data; and at least one remote processor for: receiving sensor data from the apparatus; and smoothing the received sensor data to generate a breathing pattern.
  • the sensor data collected by the sensor may be noisy and may need to be processed (i.e. smoothed) in order to generate a breathing pattern that accurately reflects the user's breathing.
  • a method for health monitoring comprising: sensing, using an apparatus, breathing of a user wearing the apparatus; generating a breathing pattern from the sensed data; and determining from the breathing pattern at least one breathing characteristic.
  • At least one breathing characteristic may be determined or derived from the breathing pattern.
  • the at least one remote processor may be located in any component in the system that is remote to the apparatus used/worn by the user. Multiple remote processors may be used to perform the smoothing of the sensor data (and to thereby generate a breathing pattern from the sensor data, and perform any other processing such as determining breathing characteristics), which may be located in the same or different components in the system.
  • the at least one remote processor may be in a user device and/or in a remote server.
  • one or more of the steps to smooth the sensor data may be performed by the processor(s) in the user device, and one or more steps may be performed by the processor(s) of the remote server.
  • the at least one remote processor may determine an indication of the health of the user from the at least one breathing characteristic.
  • the health monitoring system may be used in a variety of contexts.
  • the health monitoring system may be used by a user to monitor their own health.
  • a user may monitor their breathing during exercise, in which case the indication of the health of the user may comprise information on the user's fitness.
  • the indication of the health of the user may comprise information on whether the user's fitness has improved since the apparatus was last worn/used, or over a predetermined time (e.g. over the last 3 months). This may help the user to determine whether a new exercise regime is effective, for example.
  • the fitness information could be used by a personal trainer to devise or modify an exercise regime for the user.
  • a user may monitor their breathing while resting or stationary to determine their health, lung capacity or lung health.
  • the indication of the health of the user may comprise information on their lung capacity or lung health, and/or whether their lung capacity/health has improved since the apparatus was last worn/used, or over a predetermined time (e.g. over the last 3 months).
  • This information may help the user or a doctor/physician to determine if the user's lung health is improving or deteriorating following an illness, a respiratory illness (such as that caused by COVID-19), disease or surgery, or following the user quitting smoking (or switching from cigarettes to e-cigarettes), or to monitor the health of a user with a chronic condition such as cystic fibrosis or chronic obstructive pulmonary disease (COPD).
  • a user admitted to a hospital may wear/use the apparatus so that doctors and nurses in the hospital may monitor the user's breathing more regularly, remotely and without human involvement. This advantageously increases the chances that changes in the user's breathing are identified and actioned early, and reduces the risk of human error.
  • the system may further comprise at least one user interface for displaying the indication of the health of the user.
  • the user interface may be provided in any suitable manner or on any suitable device.
  • the user interface may be the display screen of an electronic device used by the user, such as a computer or smartphone.
  • the user interface may be the display screen on hospital equipment, such as a computer at a nurses' station or a tablet holding an electronic patient record.
  • the raw sensor data and/or the processed data (e.g. the breathing pattern) may be anonymised.
  • the anonymised data may be associated with a Unique Identifier (UID), which may link the processed data to a patient's personal health records, which may be stored by the hospital in the hospital's own secure servers for patient data privacy.
  • the at least one user interface may be on a user device, and the indication of the health of the user may comprise information on the user's fitness. Additionally or alternatively, the at least one user interface may be on a device in a hospital, and the indication of the health of the user may comprise a warning that the user's health is deteriorating. This may advantageously enable hospital staff to take action sooner than if breathing is monitored by infrequent observation.
  • breathing rate may be monitored through visual assessment every 12 hours, while in intensive care units (ICUs), specialist capnography devices may be used to monitor the concentration or volume of carbon dioxide exhaled by a patient.
  • Electronic techniques for measuring breathing are largely limited to piezoelectric sensors that measure chest wall movement, impedance measurements that measure changes in chest conductivity during inhalation and exhalation, and microphones that detect the sound of the lungs expanding and contracting.
  • these techniques suffer from a low signal to noise ratio and may require significant filtering to accurately determine breathing rate.
  • the remote processor(s) may smooth the received sensor data to generate a breathing pattern by: identifying a plurality of inflection points (i.e. turning points on a continuous plane curve) in the sensor data, and classifying each inflection point as a local maximum or a local minimum. This process may be called "peak detection". All of the inflection points in the data are detected/identified by determining if a data point N is greater or lesser in value than the data point that is immediately before data point N (i.e. N-1) and the data point that is immediately after data point N (i.e. N+1).
  • In the case where data point N is greater than both data points N-1 and N+1, the data point N is identified as an inflection point and is classified as a local maximum (or peak). In the case where data point N is lesser than both data points N-1 and N+1, the data point N is identified as an inflection point and is classified as a local minimum (or trough). Each inflection point that has been identified and classified may be saved in storage.
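The neighbour-comparison rule just described can be sketched in Python as follows. This is a minimal illustration only; the function name and the (index, label) output format are choices made here, not taken from the patent.

```python
def find_inflection_points(samples):
    """Classify each interior data point N as a peak or trough by comparing
    it with the points immediately before (N-1) and after (N+1)."""
    points = []  # list of (index, "peak") or (index, "trough") tuples
    for n in range(1, len(samples) - 1):
        if samples[n] > samples[n - 1] and samples[n] > samples[n + 1]:
            points.append((n, "peak"))    # local maximum
        elif samples[n] < samples[n - 1] and samples[n] < samples[n + 1]:
            points.append((n, "trough"))  # local minimum
    return points
```

Monotonic runs produce no inflection points, so flat or steadily rising data yields an empty list.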
  • the at least one remote processor may determine whether each identified inflection point is indicative of a breathing pattern or of noise. If an inflection point is indicative of noise, it may need to be removed or ignored in order to smooth the sensor data and generate a breathing pattern that accurately reflects the user's breathing. (Preferably, the data point is simply ignored in subsequent processing). For example, if a consecutive peak and trough have a low amplitude, they may represent noise rather than breathing data and therefore need to be ignored when generating a breathing pattern from the sensor data. In other words, peaks and troughs that have low prominence may be removed. This process may be called “peak prominence detection” or "peak prominence threshold”.
  • the remote processor(s) may determine whether a distance between two adjacent inflection points, where one of the two inflection points is classified as a local maximum and another of the two inflection points is classified as a local minimum (i.e. the distance between a peak and a trough), is above a threshold distance. If the distance is below the threshold distance, the remote processor(s) may remove both of the two adjacent inflection points.
  • the threshold distance may be any suitable value which indicates that the amplitude or distance between a peak and trough is not representative of a breath. For example, in cases where analogue sensor values (e.g. voltages) have been converted into digital values, the threshold distance may be 1000.
  • the digital values may range from 0 to 65535 if a 16-bit ADC has been used for the conversion, and the threshold distance may be 1000.
  • the distance between a successive peak and trough must be more than 1000. This may be a reasonable threshold distance, since in a normal breathing pattern successive peaks and troughs are usually separated by 10,000, and even if a user takes shallow breaths, the distance is more than 1000.
  • the threshold distance may vary depending on, for example, the sensor and external/environmental conditions.
  • the threshold distance value may be calculated based on an initial calibration of the sensor (to account for variability between sensors), and then modified/adjusted based on sensor data to account for environmental changes. That is, the threshold distance may be based on a calibration performed for each sensor, such that the threshold distance may vary for individual sensors.
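Peak prominence detection might be implemented along these lines, using the example threshold of 1000 digital units from above. The iterative pair-removal strategy is one plausible reading of the description, not a definitive implementation.

```python
def apply_prominence_threshold(samples, points, threshold=1000):
    """Repeatedly remove adjacent peak/trough pairs whose amplitude
    difference is below the threshold (1000 digital units in the 16-bit
    ADC example above), treating them as noise rather than breaths."""
    kept = list(points)
    changed = True
    while changed:
        changed = False
        for i in range(len(kept) - 1):
            (a, kind_a), (b, kind_b) = kept[i], kept[i + 1]
            if kind_a != kind_b and abs(samples[a] - samples[b]) < threshold:
                del kept[i:i + 2]  # drop both low-prominence points
                changed = True
                break
    return kept
```

Re-scanning after each removal means newly adjacent peak/trough pairs are also checked against the threshold.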
  • the remote processor(s) may also look at whether two breaths are too close together in time to be representative of a real breathing pattern. For example, if two peaks or two troughs are close together, they may not represent a real breathing pattern as the short time between the successive peaks/troughs means there is not enough time for inhalation or exhalation. Thus, the remote processor(s) may determine whether a time between two successive inflection points each classified as a local maximum (i.e. two adjacent peaks), or between two successive inflection points each classified as a local minimum (i.e. two adjacent troughs), is less than a predetermined time.
  • one of the two inflection points may be removed by the remote processor(s) so that it is not used to generate the breathing pattern.
  • This process may be called “peak separation analysis” or “peak distance analysis”.
  • the predetermined time may be 0.6 seconds, which would equate to 100 breaths in a minute, or 0.7 seconds, which would equate to 86 breaths per minute.
  • Such high breathing rates do not occur in humans, and these thresholds may therefore be sufficient to remove peaks that are not representative of a real breathing pattern while still capturing rapid breathing/hyperventilation/tachypnoea.
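The peak separation analysis could be sketched as below, using the 0.6-second interval from above (which corresponds to the 100 breaths-per-minute bound). Discarding the later of the two too-close points is an assumption; the patent only says one of the two is removed.

```python
def apply_peak_separation(points, timestamps, min_interval=0.6):
    """Discard an inflection point that follows a previous point of the
    same type (peak-peak or trough-trough) by less than min_interval
    seconds, since such short intervals leave no time for a real
    inhalation or exhalation."""
    kept = []
    last_time = {}  # most recent retained timestamp per point type
    for idx, kind in points:
        t = timestamps[idx]
        if kind in last_time and t - last_time[kind] < min_interval:
            continue  # too soon after the previous peak/trough: discard
        last_time[kind] = t
        kept.append((idx, kind))
    return kept
```

As noted above, min_interval could instead be chosen per activity (sitting, walking, running) if activity information is available.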
  • the peak separation analysis may take into account information on the activity that the user is performing/undertaking while using the apparatus. That is, knowledge of this activity (e.g. sitting, running, walking, etc.) could be used to determine the peak separation analysis, as different activities may be associated with different peak separations (or number of breaths per minute). This means knowledge of different breathing rates associated with different activities could be used to perform the peak separation analysis.
  • a breathing pattern should be an alternating sequence of peaks and troughs, i.e. an alternating sequence of inhalation and exhalation.
  • the remote processor(s) may generate a breathing pattern by identifying consecutive inflection points that are both classified as a local maximum or as a local minimum, and removing one of the two consecutive inflection points. This process may be called "consecutive peaks or troughs detection or elimination".
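Consecutive peaks or troughs elimination might look like the sketch below. The patent does not specify which of the two consecutive same-type points is removed; this version assumes the more extreme one (higher peak, lower trough) is kept.

```python
def enforce_alternation(samples, points):
    """Reduce the classified inflection points to an alternating
    peak/trough sequence, i.e. an alternating sequence of inhalation
    and exhalation."""
    kept = []
    for idx, kind in points:
        if kept and kept[-1][1] == kind:
            prev_idx = kept[-1][0]
            if (kind == "peak" and samples[idx] > samples[prev_idx]) or \
               (kind == "trough" and samples[idx] < samples[prev_idx]):
                kept[-1] = (idx, kind)  # replace with the more extreme point
        else:
            kept.append((idx, kind))
    return kept
```

After this step the sequence strictly alternates, which the later speed, ratio and depth calculations rely on.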
  • the breathing pattern may be generated in real time. That is, the breathing pattern may be generated as soon as at least two peaks and a trough are received, or two troughs and one peak are received. In some cases, a single peak and trough (or trough and peak) may be sufficient to generate the breathing pattern. It will be understood that this enables real-time analysis to be performed. More data (e.g. more peaks and troughs) may enable a more accurate breathing pattern to be produced, and more accurate breathing characteristics (see below) to be determined. Thus, a breathing pattern and breathing characteristic may be generated/determined in real-time but may also change in real-time as more data is received.
  • the remote processor(s) may determine at least one breathing characteristic from the generated breathing pattern.
  • breathing characteristics that may be derived from the breathing pattern, and which may be used to provide feedback on the user's health or fitness.
  • the at least one breathing characteristic may be any one or more of: inhalation speed, exhalation speed, inhalation to exhalation ratio, number of breaths per minute (which could be used to detect hyperventilation, hypocapnia, hypoventilation, hypercapnia, etc.), average breathing rate when wearing a resistive sports mask or resistive respiratory muscle training device (which may depend on the restriction level of the resistive sports mask), exertion score, and depth or volume of inhalation or exhalation (which may be indicative of lung capacity or fitness).
  • These include inhalation/exhalation speed, inhalation time and exhalation time, and flow rate, and therefore the depth or volume of inhalation/exhalation.
  • the remote processor(s) may determine the inhalation speed from the breathing pattern by: measuring a time between an inflection point classified as a local maximum and a subsequent inflection point classified as a local minimum; and dividing a distance between the inflection point classified as a local maximum and the subsequent inflection point classified as a local minimum by the measured time.
  • the sensor measures conductivity as a function of time.
  • the conductivity may represent, in some cases, the changes in humidity as a user of the apparatus breathes in and out.
  • the decrease in conductivity over time between a peak and a trough may be indicative of a decrease in humidity over time as the user takes a breath (i.e. inhales).
  • the conductivity may represent, in some cases, the changes in temperature and/or change in pressure as a user breathes in and out. For example, an increase in pressure or temperature may occur on exhalation, and a decrease in pressure or temperature may occur on inhalation. Exactly what the conductivity represents may depend on the type of sensor used.
  • the remote processor(s) may determine the exhalation speed from the breathing pattern by: measuring a time between an inflection point classified as a local minimum and a subsequent inflection point classified as a local maximum; and dividing a distance between the inflection point classified as a local minimum and the subsequent inflection point classified as a local maximum by the measured time.
  • the sensor measures conductivity as a function of time.
  • the conductivity may represent, in some cases, the changes in humidity as a user of the apparatus breathes in and out.
  • the increase in conductivity over time between a trough and a peak may be indicative of an increase in humidity over time as the user exhales (i.e. breathes out).
  • the conductivity may represent, in some cases, the changes in temperature and/or change in pressure as a user breathes in and out. Exactly what the conductivity represents may depend on the type of sensor used.
  • the remote processor(s) may determine the ratio from the breathing pattern by: dividing the inhalation speed by the exhalation speed, or preferably by dividing the inhalation time by the exhalation time.
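Combining the inhalation-speed, exhalation-speed and ratio definitions above into one sketch (the function name and tuple return are illustrative; amplitudes are in raw digital units and timestamps in seconds, and the ratio uses the time-based variant described as preferred):

```python
def breath_speeds(samples, timestamps, peak_idx, trough_idx, next_peak_idx):
    """Inhalation speed: peak-to-trough amplitude divided by peak-to-trough
    time. Exhalation speed: trough-to-peak amplitude divided by
    trough-to-peak time. Ratio: inhalation time over exhalation time."""
    inhale_time = timestamps[trough_idx] - timestamps[peak_idx]
    exhale_time = timestamps[next_peak_idx] - timestamps[trough_idx]
    inhalation_speed = (samples[peak_idx] - samples[trough_idx]) / inhale_time
    exhalation_speed = (samples[next_peak_idx] - samples[trough_idx]) / exhale_time
    ie_ratio = inhale_time / exhale_time  # preferred time-based ratio
    return inhalation_speed, exhalation_speed, ie_ratio
```

This assumes the conductivity convention above, where a peak-to-trough fall corresponds to an inhalation and a trough-to-peak rise to an exhalation.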
  • the remote processor(s) may determine the breathing rate from the breathing pattern by: determining the number of inflection points classified as a local maximum in a minute. Determining the number of inflection points classified as a local maximum in a minute may comprise determining, when the breathing pattern is longer than a minute, an average number of inflection points classified as a local maximum in a minute. Alternatively, determining the number of inflection points classified as a local maximum in a minute may comprise extrapolating, when the breathing pattern is less than a minute, the breathing rate based on the number of inflection points in a duration of the breathing pattern.
  • the breathing rate may be extrapolated at a resolution of half-breath.
  • a half-breath resolution may not be useful for calculating breathing rate since inhalation and exhalation time are often different.
  • a half breath resolution may be useful to detect rapid changes in breathing rate that may be clinically relevant (e.g. coughing or a panic attack). These rapid changes would be averaged out if using a BPM calculated over a one-minute window. For example, if a person is coughing, their BPM may be 60 or more using a half breath resolution, but that is not indicative of their average BPM - rather, it is indicative of a specific, short-time event.
  • a resolution of one breath may be the smallest resolution from which a breathing rate can be accurately calculated.
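Counting peak-to-peak intervals over the recorded duration is one way to realise both the averaging (for recordings longer than a minute) and the extrapolation (for shorter recordings) described above; a minimal sketch:

```python
def breathing_rate_bpm(peak_times):
    """Breaths per minute from the timestamps (in seconds) of inflection
    points classified as local maxima, treating one peak as one breath.
    The same formula averages over long recordings and extrapolates
    from recordings shorter than a minute."""
    if len(peak_times) < 2:
        return None  # not enough peaks to measure a breath interval
    duration = peak_times[-1] - peak_times[0]
    intervals = len(peak_times) - 1  # peak-to-peak intervals observed
    return intervals * 60.0 / duration
```

For example, four peaks spread over twelve seconds extrapolate to 15 breaths per minute.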
  • the remote processor(s) may determine the inhalation depth from the breathing pattern by: averaging a distance between an inflection point classified as a local maximum and a subsequent inflection point classified as a local minimum.
  • the remote processor(s) may determine the exhalation depth from the breathing pattern by: averaging a distance between an inflection point classified as a local minimum and a subsequent inflection point classified as a local maximum. Exhalation depth may enable short breaths and long breaths to be identified, and may enable shallow breaths and deep breaths to be identified.
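The inhalation- and exhalation-depth averaging described in the two bullets above could be sketched together as follows (the pairing convention again assumes peak-to-trough = inhalation, trough-to-peak = exhalation, per the conductivity discussion earlier):

```python
def average_depths(samples, points):
    """Average peak-to-trough distance (inhalation depth) and
    trough-to-peak distance (exhalation depth) over an alternating
    sequence of classified inflection points."""
    def _avg(values):
        return sum(values) / len(values) if values else None

    inhale, exhale = [], []
    for (i, kind_a), (j, kind_b) in zip(points, points[1:]):
        depth = abs(samples[i] - samples[j])
        if kind_a == "peak" and kind_b == "trough":
            inhale.append(depth)  # inhalation: peak followed by trough
        elif kind_a == "trough" and kind_b == "peak":
            exhale.append(depth)  # exhalation: trough followed by peak
    return _avg(inhale), _avg(exhale)
```

Comparing individual depths against these averages is what later allows shallow and deep breaths to be distinguished.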
  • the at least one user interface may display information in addition to the breathing characteristic or indication of the health or fitness of the user.
  • the user interface may display one or more of: a total number of hours the user has worn the apparatus, an exertion score, an indication of the user's lung function, and information on whether the sensor needs to be replaced.
  • the remote processor(s) may be arranged to: compare sensor data received from the apparatus over a predetermined time period; and determine whether the accuracy of the sensor has changed over the predetermined time period.
  • the remote server may be able to identify any changes in the sensitivity or accuracy of the sensor over time, by considering whether, for example, the maximum and minimum values sensed by the sensor have changed over time.
  • the remote server may be able to send a message to a user device to indicate to the user (or to a hospital staff member or administrator) that the sensor needs to be replaced.
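One way the described accuracy check could work is to compare the sensed range (maximum minus minimum) across two time windows of readings; the 10% tolerance here is an assumed value, not one given in the patent.

```python
def sensor_needs_replacing(early_window, late_window, tolerance=0.1):
    """Flag the sensor if its sensed range has shrunk by more than the
    tolerance fraction between an early and a late window of readings,
    suggesting reduced sensitivity over time."""
    early_range = max(early_window) - min(early_window)
    late_range = max(late_window) - min(late_window)
    return (early_range - late_range) / early_range > tolerance
```

If this returns True, the remote server could send the replacement message to the user device as described above.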
  • the apparatus may further comprise an accelerometer to sense movement of the user while the user is wearing the apparatus.
  • the accelerometer data may be transmitted to the remote server along with the sensor data.
  • the accelerometer data may contribute to the generation of a breathing pattern and/or the determination of the at least one breathing characteristic.
  • the accelerometer data may be mapped to or matched to the generated breathing pattern, which may enable analysis of the user's health or fitness while sedentary, walking and/or exercising to be determined. This may enable information to be provided to the user on how their exercise regime could be changed to improve their health, fitness or performance.
  • the remote processor(s) may use data from the accelerometer to generate the breathing pattern and determine the at least one breathing characteristic.
  • the remote server may use additional input data to generate the breathing pattern and determine the at least one breathing characteristic.
  • the additional input data may comprise one or more of: geographical location of the user, altitude data, weather data, humidity data, air quality index, pollution data, pollen data, and oxygen level.
  • the additional input data may be obtained by the remote server from external sources or third party sources.
  • the weather data may be obtained from a national weather service provider (such as the Met Office in the UK), while altitude data may be obtained from a map provider (such as via open APIs).
  • the remote processor(s) may determine a baseline humidity using the humidity data and may use the baseline humidity to generate the breathing pattern. Knowing the humidity of the environment in which the user is located may enable the breathing pattern to be generated more accurately. Similarly, other data such as pressure and external/environmental temperature may be used.
  • the geographical location or altitude data may enable the user's breathing pattern to be analysed in the context of the air pressure in their environment. For example, if a user who normally lives close to sea level is wearing the apparatus in the mountains or at a higher altitude where the air pressure is lower, the user's breathing pattern may indicate that they are breathing at a higher rate or taking deeper breaths. However, by knowing that the user is at a higher altitude, the change in their breathing rate may be considered acceptable and not a cause for concern. However, if the user's breathing rate increased while they were at home/close to sea level, then the change may be considered a cause for concern.
  • the sensor of the apparatus may be any one of: a thermistor, a humidity sensor, a gas sensor, a pressure sensor, a microphone, a sound sensor/detector, and a sensor comprising a porous material. It will be understood that this is an example, non-exhaustive list of possible sensors that are suitable for sensing breathing. It will also be understood that an apparatus may comprise more than one sensor, and that the sensors may be the same or different.
  • the apparatus may be any one of: a wearable apparatus, a resistive sports mask, an oxygen deprivation mask, an apparatus worn over a user's mouth and/or nose, a medical breath monitoring apparatus, a face mask, a disposable face mask, a personal protection equipment face mask, a surgical mask, an oxygen mask, an inhaler, an asthma inhaler, an e-cigarette, a heat moisture exchanger, and a nasal cannula. It will be understood that this is an example, non-exhaustive list of possible types of apparatus that could be used to sense breathing and monitor user health.
  • the apparatus may be any device which is able to be placed in the proximity of exhaled air or which can receive exhaled air (e.g. via tubes that direct exhaled air from the user to the apparatus).
  • the sensor and communication module may be removably attached to the apparatus. This may be useful if the apparatus is a disposable device such as a disposable mask, or a reusable device that is washed for reuse, such as a washable mask.
  • the sensor and communication module may be removed before the apparatus is disposed of (enabling reuse of the sensor and communication module), or before the apparatus is washed.
  • the sensor and communication module may be irremovably attached to the apparatus. This may be achieved by integrating the sensor and communication module into the apparatus, by any suitable means.
  • the remote processor may: determine an indication of the health of the user from the at least one breathing characteristic; and transmit the indication of the health of the user to any one or more of: a user device, a third party device, and a third party server.
  • the remote processor may be configured to: transmit the received sensor data to a third party device or third party server.
  • the remote processor may be configured to: transmit the generated breathing pattern to a third party device or third party server.
  • the transmission in any case to any device may be in real-time. This may advantageously enable the user or a third party to see real-time data for a user, which could be particularly useful for clinicians or in a hospital.
  • the remote processor may be configured to: receive sensor data from a third party server for analysis; smooth the received sensor data from the third party server to generate a breathing pattern; and transmit the generated breathing pattern to the third party server. That is, third parties may send sensor data to be processed by the remote processor.
  • the method may further comprise: providing, to a user device, breathing exercises for the user to follow while using the apparatus; and analysing, using the received sensor data, user performance while undertaking the breathing exercises.
  • the method may further comprise determining an exertion score by: calculating, using the breathing characteristic, a distribution profile of the breathing characteristic; determining a scaling factor; scaling the distribution profile using the scaling factor to generate a scaled distribution profile; and determining the exertion score using the scaled distribution profile.
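The exertion-score steps are described only at a high level, so the following is a very rough sketch under stated assumptions: the 15 BPM resting baseline, the use of the mean as the distribution-profile summary, and the final ×10 reduction are all invented here for illustration.

```python
def exertion_score(bpm_samples, scaling_factor=None):
    """Rough sketch of the listed steps: (1) summarise a distribution
    profile of the breathing characteristic (here, the mean of BPM
    samples), (2) determine a scaling factor (here, relative to an
    assumed resting rate of 15 BPM), (3) scale the profile, and
    (4) reduce the scaled profile to a single exertion score."""
    mean_bpm = sum(bpm_samples) / len(bpm_samples)
    if scaling_factor is None:
        scaling_factor = 1.0 / 15.0  # assumed resting baseline of 15 BPM
    return round(mean_bpm * scaling_factor * 10)  # ~10 at rest, higher under load
```

A real implementation would likely use a richer distribution profile (e.g. a histogram of rates over a session) rather than a single mean.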
  • the breathing characteristic may be breathing rate (i.e. breaths per minute or BPM).
  • the breathing characteristic may be a breath depth.
  • the method may comprise determining the breath depth from the breathing pattern by: determining an average breath depth for a predetermined number of breaths; comparing a depth of a subsequent breath to the average breath depth; and classifying the subsequent breath as a shallow breath when the depth of the subsequent breath is below the average breath depth by a threshold value, or as a deep breath when the depth of the subsequent breath is above the average breath depth by a threshold value.
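The shallow/deep classification just described could be sketched as follows; the threshold of 500 digital units and the "normal" fallback category are assumptions added for illustration.

```python
def classify_breath(previous_depths, new_depth, threshold=500):
    """Compare a new breath's depth against the average depth of a
    predetermined number of previous breaths, classifying it as shallow
    or deep when it deviates from the average by more than the threshold."""
    average = sum(previous_depths) / len(previous_depths)
    if new_depth < average - threshold:
        return "shallow"
    if new_depth > average + threshold:
        return "deep"
    return "normal"
```

The rolling window of previous depths would typically be updated as each new breath is classified.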
  • the method may further comprise providing the user, using a total number of breaths recorded using the apparatus, with an indication of when to replace the sensor of the apparatus. This may be advantageous as it may indicate, ahead of the end of the lifetime of the sensor, when the sensor needs to be replaced (if it is replaceable), or when a new apparatus needs to be used.
  • a (non-transitory) computer readable medium carrying processor control code which when implemented in a system causes the system to carry out any of the methods, processes and techniques described herein.
  • present techniques may be embodied as a system, method or computer program product. Accordingly, present techniques may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects.
  • the present techniques may take the form of a computer program product embodied in a computer readable medium having computer readable program code embodied thereon.
  • the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
  • a computer readable medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • Computer program code for carrying out operations of the present techniques may be written in any combination of one or more programming languages, including object oriented programming languages and conventional procedural programming languages. Code components may be embodied as procedures, methods or the like, and may comprise sub-components which may take the form of instructions or sequences of instructions at any of the levels of abstraction, from the direct machine instructions of a native instruction set to high-level compiled or interpreted language constructs. Embodiments of the present techniques also provide a non-transitory data carrier carrying code which, when implemented on a processor, causes the processor to carry out any of the methods described herein.
  • the techniques further provide processor control code to implement the above-described methods, for example on a general purpose computer system or on a digital signal processor (DSP).
  • the techniques also provide a carrier carrying processor control code to, when running, implement any of the above methods, in particular on a non-transitory data carrier.
  • the code may be provided on a carrier such as a disk, a microprocessor, CD- or DVD-ROM, programmed memory such as non-volatile memory (e.g. Flash) or read-only memory (firmware), or on a data carrier such as an optical or electrical signal carrier.
  • Code (and/or data) to implement embodiments of the techniques described herein may comprise source, object or executable code in a conventional programming language (interpreted or compiled) such as C, or assembly code, code for setting up or controlling an ASIC (Application Specific Integrated Circuit) or FPGA (Field Programmable Gate Array), or code for a hardware description language such as Verilog (RTM) or VHDL (Very high speed integrated circuit Hardware Description Language).
  • a controller which includes a microprocessor, working memory and program memory coupled to one or more of the components of the system.
  • a logical method may suitably be embodied in a logic apparatus comprising logic elements to perform the steps of the above-described methods, and that such logic elements may comprise components such as logic gates in, for example a programmable logic array or application-specific integrated circuit.
  • Such a logic arrangement may further be embodied in enabling elements for temporarily or permanently establishing logic structures in such an array or circuit using, for example, a virtual hardware descriptor language, which may be stored and transmitted using fixed or transmittable carrier media.
  • the present techniques may be implemented using multiple processors or control circuits. The present techniques may be adapted to run on, or integrated into, the operating system of an apparatus.
  • the present techniques may be realised in the form of a data carrier having functional data thereon, said functional data comprising functional computer data structures to, when loaded into a computer system or network and operated upon thereby, enable said computer system to perform all the steps of the above-described method.
  • Figure 1 shows a schematic diagram of a health monitoring system
  • Figure 2 shows a flowchart of example steps performed by the health monitoring system of Figure 1;
  • Figure 3A shows example sensor data sensed by the health monitoring system of Figure 1 after peak detection has been performed
  • Figure 3B shows the sensor data of Figure 3A after peak prominence detection has been performed
  • Figure 3C shows the sensor data of Figure 3B after peak separation analysis has been performed
  • Figure 3D shows an example generated breathing pattern
  • Figures 4A and 4B show, respectively, data used to determine a technique for generating an exertion score from sensor data in real-time (synchronously) or asynchronously;
  • Figure 4C shows data used to determine another technique for generating an exertion score from sensor data asynchronously; and
  • Figure 5 shows a flowchart of example steps to determine an exertion score.
  • embodiments of the present techniques provide a health monitoring system which uses user breathing data to determine a user's or patient's health.
  • FIG. 1 shows a schematic diagram of a health monitoring system 100.
  • the system 100 comprises an apparatus 102 and a remote processor(s) 120.
  • the apparatus 102 may be a wireless apparatus or a wired apparatus. In other words, the apparatus 102 may be capable of wirelessly transmitting data to another device, or may need a wired connection to transmit data to another device.
  • the apparatus 102 may comprise at least one sensor 104 for sensing breathing of a user wearing the apparatus 102.
  • the at least one sensor 104 may be any one of: a thermistor, a humidity sensor, a gas sensor, a pressure sensor, a microphone, a sound sensor, and a sensor comprising a porous material.
  • An example of a sensor comprising a porous material can be found in International Patent Publication No.
  • the apparatus 102 may comprise a communication module 106 for transmitting sensor data.
  • the data collected by the sensor 104 may be transmitted to an external device or server for storage and analysis. This may be advantageous because the apparatus 102 may not have the processing power or capacity to analyse the data, and/or the storage capacity to store large quantities of data.
  • the data collected by the sensor 104 may be transmitted periodically to an external device/server, such as every second, or every few seconds, or at any suitable frequency such as, but not limited to, 12.5 Hz or 20 Hz. Alternatively, data collected by the sensor 104 may be transmitted at longer intervals or at irregular times in certain circumstances. For example, if the apparatus 102 is not within range to be able to communicate with an external device (e.g.
  • the apparatus 102 may have storage or memory 132 to temporarily store sensor data collected by the sensor 104 when real-time communication to an external device is not possible.
  • the storage 132 may comprise a volatile memory, such as random access memory (RAM), for use as temporary memory.
  • the storage 132 may comprise non-volatile memory such as Flash, read only memory (ROM), or electrically erasable programmable ROM (EEPROM), for storing data, programs, or instructions, for example.
  • the data collected by the sensor 104 may be communicated/transmitted to the remote server 120 directly, or via an intermediate device. Communicating the sensor data to an intermediate device may be preferable due to the communication capability of the communication module 106. For example, if the communication module 106 is only capable of short range communication, then the sensor data may need to be transmitted to an intermediate device which is capable of transmitting the sensor data to the remote server. In some cases, the communication module 106 may be able to communicate directly with the remote server. In some cases, the intermediate device may process the sensor data into a format or into processed data that the remote server can handle.
  • the communication module 106 may be able to communicate using a wireless communication protocol, such as WiFi, hypertext transfer protocol (HTTP), a wireless mobile telecommunication protocol, short range communication such as radio frequency communication (RFID) or near field communication (NFC), or by using the communication protocols specified by ZigBee, Thread, Bluetooth, Bluetooth LE, Z-Wave, IPv6 over Low Power Wireless Standard (6L0WPAN), Long Range Wide Area Network (LoRaWAN), Low-power Wide-area network (LPWAN), Constrained Application Protocol (CoAP), SigFox, or WiFi-HaLow.
  • the communication module 106 may use a wireless mobile (cellular) telecommunication protocol to communicate with remote machines, e.g. 3G, 4G, 5G, etc.
  • the communication module 106 may use a wired communication technique to transfer sensor data to an intermediate/external device, such as via metal cables (e.g. a USB cable) or fibre optic cables.
  • the communication module 106 may use more than one communication technique to communicate with other components in the system 100.
  • the apparatus 102 may comprise a processor or processing circuitry 108.
  • the processor 108 may control various processing operations performed by the apparatus 102, such as communicating with other components in system 100.
  • the processor 108 of the apparatus 102 may simply control the operation of the sensor 104, communication module 106 and storage 132.
  • the processor 108 may have some further processing capability.
  • the processor 108 may comprise processing logic to process data (e.g. the sensor data collected by sensor 104), and generate output data/signals/messages in response to the processing.
  • the processor 108 may be able to compress the sensor data for example, to reduce the size of the data that is being transmitted to another device.
  • the processor 108 may comprise one or more of: a microprocessor, a microcontroller, and an integrated circuit.
  • the apparatus 102 may optionally comprise an accelerometer 133 to sense movement of the user while the user is wearing the apparatus 102.
  • the accelerometer data may be transmitted to an external device along with the sensor data.
  • Apparatus 102 may optionally comprise an interface 138 for providing feedback on a user's breathing.
  • the interface 138 may be one or more LEDs or other lights which may turn on and off according to the generated breathing pattern. This may provide a visual indicator to a user of the apparatus or a third party (e.g. a doctor or personal trainer) of the generated breathing pattern.
  • the system 100 may comprise a remote server 120 for performing one or more of the steps involved in smoothing the sensor data received from the apparatus 102.
  • the apparatus 102 may transmit sensor data to the remote server 120.
  • the remote server 120 may then generate a breathing pattern from the sensor data, and determine from the breathing pattern at least one breathing characteristic.
  • the remote server 120 may comprise at least one processor 123 and storage 122.
  • Storage 122 may comprise a volatile memory, such as random access memory (RAM), for use as temporary memory.
  • the storage 122 may comprise non-volatile memory such as Flash, read only memory (ROM), or electrically erasable programmable ROM (EEPROM), for storing data (such as the sensor data received from the apparatus 102), programs, or instructions, for example.
  • the remote server 120 may create visualisations and plots of sensor data, and may send these to a user device or any third party device for visualisation purposes.
  • the system 100 may comprise a user device 110.
  • the user device 110 may be any type of electronic device, such as, for example, a smartphone, a mobile computing device, a laptop, tablet or desktop computer, or a mobile or portable electronic device.
  • the user device 110 may be a dedicated user device that is specifically for use with the apparatus 102.
  • the user device 110 may be a non-dedicated user device, such as a smartphone.
  • the user device 110 may comprise a software application ('app') 112 which is associated with the system 100.
  • the app 112 may be launched or run when the user puts on the apparatus 102. For example, when the user is about to begin exercising, the user may put on the apparatus 102 and run the app 112.
  • the app 112 may comprise a 'record' or 'start' function, which the user may press/engage when they want to start measuring their breathing using the apparatus 102.
  • the app 112 may communicate with the apparatus 102 to instruct the sensor 104 to begin sensing and/or to instruct the communication module 106 to begin transmitting sensor data. Additionally or alternatively, when the user presses 'record' or 'start' on the app 112, the app 112 may prepare to receive sensor data from the apparatus 102.
  • the app 112 may display the sensor data as it is received from the apparatus 102. Additionally or alternatively, the app 112 may display the generated breathing pattern produced by the remote server 120.
  • the user device 110 may comprise a user interface 114 to display, for example, the app 112, sensor data, generated breathing pattern, and/or any other information.
  • the user interface 114 may be the display screen of a smartphone for example.
  • the user device 110 may comprise a processor 116 to control various processing operations performed by the user device, such as communicating with other components in system 100.
  • the processor 116 may comprise one or more of: a microprocessor, a microcontroller, and an integrated circuit.
  • the user device 110 may comprise a communication module 118.
  • the communication module 118 may receive the sensor data from the communication module 106 of the apparatus 102.
  • the communication module 118 may be able to communicate with the remote server 120, i.e. to transmit the received sensor data to the remote server 120 for processing/analysis.
  • the communication module 118 may be able to receive data from the remote server 120.
  • the communication module 118 may receive a generated breathing pattern in real-time, near real-time or after the sensing performed by sensor 104 has ended.
  • the generated breathing pattern may be displayed on the user interface 114 (e.g. via app 112). That is, in some cases, sensor data may be transmitted from the apparatus 102 in real-time (e.g.
  • the user device 110 may transmit the sensor data to the remote server, and receive a generated breathing pattern back from the remote server 120, which the user device 110 may display.
  • the user device 110 may also receive, for example, the at least one breathing characteristic from the remote server 120 as it is determined, and may also display the at least one breathing characteristic. It may be possible for the user of user device 110 to see the raw sensed data on the user device and see how, in real-time, the remote server is generating the breathing pattern from the raw data.
  • the communication module 118 may have at least the same communication capability as the communication module 106 of the apparatus 102, and the remote server 120.
  • the communication module 118 may use the same or different communication techniques or protocols to communicate with the communication module 106 and remote server 120.
  • the communication module 118 may be able to communicate using a wireless communication protocol, such as WiFi, hypertext transfer protocol (HTTP), a wireless mobile telecommunication protocol, short range communication such as radio frequency communication (RFID) or near field communication (NFC), or by using the communication protocols specified by ZigBee, Thread, Bluetooth, Bluetooth LE, Z-Wave, IPv6 over Low Power Wireless Standard (6L0WPAN), Long Range Wide Area Network (LoRaWAN), Low-power Wide-area network (LPWAN), Constrained Application Protocol (CoAP), SigFox, or WiFi-HaLow.
  • the communication module 118 may use a wireless mobile (cellular) telecommunication protocol to communicate with remote machines, e.g. 3G, 4G, 5G, etc.
  • the communication module 118 may use a wired communication technique to receive sensor data from the apparatus 102, such as via metal cables (e.g. a USB cable) or fibre optic cables.
  • the communication module 118 may use more than one communication technique to communicate with other components in the system 100.
  • Figure 1 shows system 100 as having a single remote server 120. It will be understood that the system 100 may have multiple servers 120. One or more of the servers 120 may be used to collect, process and store data collected from multiple apparatuses 102. One or more of the servers 120 may be private servers or dedicated servers to ensure the sensor data is stored securely. For example, if apparatuses 102 are used in a hospital, it may be preferable for the sensor data to be collected, processed and stored by a dedicated server within the hospital, to ensure patient privacy and data confidentiality. In this case, the system 100 may comprise a router 128 for receiving sensor data from each apparatus 102 within the hospital and transmitting this to the dedicated server 120.
  • the system 100 may comprise a user interface 126 (e.g. a display screen) on hospital equipment or a third party device 124, such as a computer at a nurses' station or a tablet holding an electronic patient record, or a device belonging to a clinician or physiotherapist, or a device belonging to a personal trainer.
  • This may again ensure that the patient data is kept secure within the hospital itself.
  • System 100 may be used by a personal trainer to monitor patient health, and in this case, the third party device 124 may belong to a personal trainer.
  • the personal trainer may be able to see the user's breathing pattern during a personal training session.
  • the third party device 124 may be able to display, in real-time, group workout data, i.e. breathing patterns for multiple users in a group exercise session.
  • the group workout data could be used to provide live dashboards showing a ranking, based on the breathing patterns, of each user in the group.
  • raw sensor data collected by sensor 104 may be transmitted to the user device 110, and the user device 110 may transmit the sensor data to the remote server 120 for processing.
  • Algorithms, code, routines or similar for smoothing the raw sensor data to generate a breathing pattern may be stored in the remote server 120 (e.g. in storage 122) and run by processor 123.
  • the remote server 120 may also use the generated breathing pattern to determine one or more breathing characteristics, and the algorithms or techniques to determine the breathing characteristics may also be stored on the remote server 120.
  • the results of the analysis (e.g. the breathing pattern and/or the breathing characteristics) may be transmitted by the remote server 120 back to the user device 110 for display via a user interface 114 (e.g. via an app on a display screen).
  • the remote server 120 may use additional input data 130 to generate the breathing pattern and determine the at least one breathing characteristic.
  • the additional input data may comprise one or more of: geographical location of the user, altitude data, weather data, humidity data, air quality index, pollution data, pollen data, and oxygen level.
  • the additional input data 130 may be received or pulled in from public sources or websites, such as openweathermap.org.
  • the remote server 120 may determine a baseline humidity using the humidity data and may use the baseline humidity to generate the breathing pattern. Knowing the humidity of the environment in which the user is located may enable the breathing pattern to be generated more accurately.
  • the geographical location or altitude data may enable the user's breathing pattern to be analysed in the context of the air pressure in their environment. For example, if a user who normally lives close to sea level is wearing the apparatus in the mountains or at a higher altitude where the air pressure is lower, the user's breathing pattern may indicate that they are breathing at a higher rate. However, by knowing that the user is at a higher altitude, the change in their breathing rate may be considered acceptable and not a cause for concern. However, if the user's breathing rate increased while they were at home/close to sea level, then the change may be considered a cause for concern.
  • the accelerometer data collected by accelerometer 133 in the apparatus 102 may contribute to the generation of a breathing pattern and/or the determination of the at least one breathing characteristic.
  • the accelerometer data may be mapped to or matched to the generated breathing pattern by the remote server 120, which may enable analysis of the user's health or fitness while sedentary, walking and/or exercising to be determined. This may enable information to be provided to the user on how their exercise regime could be changed to improve their health, fitness or performance.
  • the remote server 120 may use data from the accelerometer 133 to generate the breathing pattern and determine the at least one breathing characteristic.
  • Figure 2 shows a flowchart of example steps performed by the at least one remote processor of the health monitoring system 100.
  • the method performed by the at least one remote processor may comprise receiving sensor data from an apparatus 102, the sensor data being the sensed breathing of a user wearing the apparatus 102 (step S100).
  • the remote processor(s) may smooth the sensor data to generate a breathing pattern.
  • the remote processor may determine at least one breathing characteristic from the breathing pattern.
  • the remote processor may use the at least one breathing characteristic to determine an indication of user health (step S106).
  • the indication of the health of the user may comprise information on the user's fitness.
  • the indication of the health of the user may comprise information on whether the user's fitness has improved since the apparatus was last worn, or over a predetermined time (e.g. over the last 3 months). This may help the user to determine whether a new exercise regime is effective, for example.
  • the fitness information could be used by a personal trainer to devise or modify an exercise regime for the user.
  • the indication of the health of the user may comprise information on their lung capacity or lung health, and/or whether their lung capacity/health has improved since the apparatus was last worn, or over a predetermined time (e.g. over the last 3 months).
  • This information may help the user or a doctor/physician to determine if the user's lung health is improving or deteriorating following a respiratory illness, disease or surgery, or following the user quitting smoking (or switching from cigarettes to e-cigarettes).
  • the indication of the health of the user may comprise information on whether the user's breathing has changed suddenly or unexpectedly (e.g. increase or decrease in breathing rate, or an increase or decrease in inhalation/exhalation depth or volume - e.g. deeper or shallower breaths). This may be useful in a hospital, as it may enable changes in the user's breathing to be identified and actioned early.
  • the remote processor may transmit data on the user's fitness (step S108).
  • the remote processor may transmit a message to a hospital device 124 warning of the deteriorating health or condition of the user (step S110) if the user's breathing has changed suddenly.
  • FIG 3A shows example sensor data 300 sensed by the health monitoring system of Figure 1 after peak detection has been performed.
  • the sensor data may be conductivity changes over time, but this may depend on the type of sensor 104 in the apparatus 102.
  • the remote server may generate a breathing pattern by: identifying a plurality of inflection points (i.e. turning points on a continuous plane curve) in the sensor data, and classifying each inflection point as a local maximum 302 or a local minimum 304. This process may be called "peak detection". All of the inflection points in the data are detected/identified by determining whether a data point N is greater or lesser in value than the data points immediately before and after it (i.e. data points N-1 and N+1).
  • each inflection point 302, 304 that has been identified and classified may be saved in storage 122.
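  • The peak detection step described above could be sketched as follows. This is a hypothetical Python illustration (the function name and data representation are assumptions, not taken from the source): each sample is compared with its immediate neighbours, and turning points are classified as local maxima or minima.

```python
def detect_inflection_points(samples):
    """Return a list of (index, value, kind) tuples, where kind is
    'max' for a local maximum (peak) or 'min' for a local minimum
    (trough), by comparing each data point N with points N-1 and N+1."""
    points = []
    for n in range(1, len(samples) - 1):
        prev, cur, nxt = samples[n - 1], samples[n], samples[n + 1]
        if cur > prev and cur > nxt:      # greater than both neighbours
            points.append((n, cur, 'max'))
        elif cur < prev and cur < nxt:    # lesser than both neighbours
            points.append((n, cur, 'min'))
    return points
```

In practice a library routine such as SciPy's `find_peaks` could serve the same purpose; the loop above simply makes the neighbour comparison explicit.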
  • the remote server 120 may determine whether each identified inflection point 302, 304 is indicative of a breathing pattern or of noise.
  • if an inflection point is indicative of noise, it needs to be removed or ignored in order to generate a breathing pattern that accurately reflects the user's breathing. For example, if a consecutive peak and trough have a low amplitude, they may represent noise rather than breathing data and therefore need to be ignored when generating a breathing pattern from the sensor data. In other words, peaks and troughs that have low prominence may be removed. This process may be called "peak prominence detection".
  • Figure 3B shows the sensor data of Figure 3A after peak prominence detection has been performed.
  • the remote server 120 may determine whether a distance 312 between two adjacent inflection points, where one of the two inflection points is classified as a local maximum and another of the two inflection points is classified as a local minimum (i.e. a consecutive peak and trough), is less than a threshold distance.
  • the threshold distance may be any suitable value which indicates that the amplitude or distance between a peak and trough is not representative of a breath.
  • where analogue sensor values (e.g. voltages) have been converted to digital values, the threshold distance may be 1000. More specifically, if a sensor's values range from 0V to 3.3V, the digital values may range from 0 to 65535 if a 16-bit ADC has been used for the conversion, and the threshold distance may be 1000. In other words, the distance between a successive peak and trough must be more than 1000.
  • the peaks and troughs in region 314 of the sensor data have been determined to not be representative of breathing. As shown in Figure 3B, the peaks and troughs in region 314 are no longer marked/tagged as inflection points, and so will not be used to generate a breathing pattern.
  • the threshold distance may vary depending on, for example, the sensor and external/environmental conditions.
  • the threshold distance value may be calculated based on an initial calibration of the sensor (to account for variability between sensors), and then modified/adjusted based on sensor data to account for environmental changes. That is, the threshold distance may be based on a calibration performed for each sensor, such that the threshold distance may vary for individual sensors.
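  • The peak prominence detection step could be sketched as follows. This is a hypothetical Python illustration building on the (index, value, kind) representation above; the function name and removal strategy are assumptions. Adjacent peak/trough pairs whose amplitude difference falls below the threshold distance are treated as noise and discarded.

```python
def filter_low_prominence(points, threshold=1000):
    """Drop adjacent max/min pairs whose amplitude difference is below
    the threshold distance (e.g. 1000 on a 16-bit ADC scale, as in the
    example in the text). Repeats until no low-prominence pair remains."""
    kept = list(points)
    changed = True
    while changed:
        changed = False
        for i in range(len(kept) - 1):
            (_, v1, k1), (_, v2, k2) = kept[i], kept[i + 1]
            if k1 != k2 and abs(v1 - v2) < threshold:
                # Low-prominence peak/trough pair: treat as noise.
                del kept[i:i + 2]
                changed = True
                break
    return kept
```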
  • the remote server may also look at whether two breaths are too close together in time to be representative of a real breathing pattern. For example, if two peaks or two troughs are close together, they may not represent a real breathing pattern as the short time between the successive peaks/troughs means there is not enough time for inhalation or exhalation. Thus, the remote server may determine whether a time between two successive inflection points each classified as a local maximum (i.e. two adjacent peaks), or between two successive inflection points each classified as a local minimum (i.e. two adjacent troughs), is less than a predetermined time.
  • FIG. 3C shows the sensor data of Figure 3B after peak separation analysis has been performed.
  • the two adjacent troughs or local minima 304a, 304b are considered to be too close together and not representative of a real breathing pattern. Consequently, as shown in Figure 3C, point 304b is no longer marked/tagged as an inflection point, and so will not be used to generate a breathing pattern.
  • it may be important to choose an appropriate predetermined time so as to not lose real breathing pattern data. For example, if a user is breathing rapidly or hyperventilating, then the user's breaths may be naturally close together.
  • the predetermined time may be 0.6 seconds, which would equate to 100 breaths in a minute, or 0.7 seconds, which would equate to 86 breaths per minute.
  • Such high breathing rates do not occur in humans, and therefore these predetermined times may be sufficient to remove peaks that are not representative of a real breathing pattern while still capturing rapid breathing/hyperventilation.
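  • The peak separation analysis could be sketched as follows. This is a hypothetical Python illustration (function name, sampling rate and data layout are assumptions): a same-type inflection point (a second peak or second trough) arriving sooner than the predetermined time after the previous one is discarded.

```python
def filter_close_same_type(points, sample_rate_hz=20.0, min_separation_s=0.6):
    """Remove a peak (or trough) that follows the previous peak (or
    trough) by less than min_separation_s seconds. 0.6 s corresponds
    to 100 breaths per minute, as in the example in the text."""
    kept = []
    last_index = {}  # kind ('max' or 'min') -> sample index of last kept point
    min_gap = min_separation_s * sample_rate_hz  # gap in samples
    for idx, val, kind in points:
        if kind in last_index and idx - last_index[kind] < min_gap:
            continue  # too soon after the previous point of this kind
        kept.append((idx, val, kind))
        last_index[kind] = idx
    return kept
```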
  • a breathing pattern should be an alternating sequence of peaks and troughs, i.e. an alternating sequence of inhalation and exhalation.
  • the remote server may generate a breathing pattern by identifying consecutive inflection points that are both classified as a local maximum or as a local minimum, and removing one of the two consecutive inflection points.
  • Figure 3D shows an example generated breathing pattern 350 after this process has been performed.
  • two adjacent inflection points that were classified as local maxima 302a, 302b do not have a local minimum between them. Accordingly, as shown in Figure 3D, inflection point 302b is no longer marked/tagged as an inflection point.
  • the resulting data 350 is the generated breathing pattern which shows a series of peaks and troughs that are representative of breathing (i.e. noise has been removed).
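  • The final step, enforcing an alternating sequence of peaks and troughs, could be sketched as follows. This is a hypothetical Python illustration; as in the Figure 3D example (where point 302b is dropped), the later of two consecutive same-type inflection points is removed here, though the source only requires that one of the two be removed.

```python
def enforce_alternation(points):
    """Keep only an alternating max/min sequence: where two consecutive
    inflection points share the same classification, drop the later one."""
    pattern = []
    for point in points:
        if pattern and pattern[-1][2] == point[2]:
            continue  # two peaks (or two troughs) in a row: skip the later one
        pattern.append(point)
    return pattern
```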
  • one or more breathing characteristics may be derived or determined.
  • a number of breathing characteristics may be determined, such as, for example:
  • Inhalation Speed e.g. the amplitude of a peak minus the amplitude of a neighbouring/consecutive trough, divided by inhalation time
  • Inhalation Time e.g. the time between a consecutive peak and a trough in the breathing pattern
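  • From the generated breathing pattern, the two characteristics above could be computed as follows. This is a hypothetical Python illustration (the function name and the 20 Hz sampling rate are assumptions) operating on one trough-to-peak segment of the pattern.

```python
def inhalation_metrics(trough, peak, sample_rate_hz=20.0):
    """Compute inhalation time (time between a consecutive trough and
    peak) and inhalation speed (peak amplitude minus trough amplitude,
    divided by inhalation time). Points are (index, value, kind) tuples."""
    t_idx, t_val, _ = trough
    p_idx, p_val, _ = peak
    inhalation_time = (p_idx - t_idx) / sample_rate_hz     # seconds
    inhalation_speed = (p_val - t_val) / inhalation_time   # signal units/s
    return inhalation_time, inhalation_speed
```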
  • Other characteristics relating to either the user activity or the sensor itself may be determined from the original sensor data and/or the breathing pattern, and/or other data may be collected from the apparatus 102 or user device 110, such as, for example:
  • Resistance training levels of a resistance training mask worn by the user (where the level may be obtained directly from the apparatus or via a user input on an app on the user device 110). For example, it may be determined that a user takes more breaths when using a training mask at one resistance level compared to when they use the training mask set at another resistance level.
  • Signal depth - which may be indicative of shallow breathing, deep breaths, etc.
  • One way to determine shallow or deep breaths based on signal depths is as follows: Since signal depth is sensor- and environment-dependent, after a period of sensor stabilisation (e.g. a minute), an average of the previous N breaths (e.g. 20 breaths) is taken. Each new breath is compared with this average. If the new breath depth is above or below the average by a certain percentage or amount, then this new breath may be classified as a deep or a shallow breath, respectively. This value is then added to the average, while the oldest breath is removed from the average, creating a moving window of N breaths. The next breath is then analysed in the same way, and the average modified accordingly. In the case where the depth is much higher or lower than the average value, then the depth value in this case would not be included in the average to ensure the average is not biased by such outliers.
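  • The moving-window classification described above could be sketched as follows. This is a hypothetical Python illustration: the window size, tolerance and outlier thresholds are assumptions chosen for the sketch, not values from the source.

```python
from collections import deque

def classify_depths(depths, window=20, tolerance=0.25, outlier=1.0):
    """Classify each breath depth against a moving average of the
    previous `window` breaths: 'deep' if more than `tolerance`
    (fraction) above the average, 'shallow' if more than `tolerance`
    below, else 'normal'. Depths further than `outlier` (fraction)
    from the average are excluded from the window so that such
    outliers do not bias the average."""
    labels = []
    # Stabilisation period: seed the window with the first N breaths.
    recent = deque(depths[:window], maxlen=window)
    for depth in depths[window:]:
        avg = sum(recent) / len(recent)
        if depth > avg * (1 + tolerance):
            labels.append('deep')
        elif depth < avg * (1 - tolerance):
            labels.append('shallow')
        else:
            labels.append('normal')
        if abs(depth - avg) <= avg * outlier:
            recent.append(depth)  # oldest breath drops out automatically
    return labels
```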
  • Exertion score e.g. how hard is the user exercising or breathing based on the generated breathing pattern (and relative to e.g. historical breathing patterns/characteristics collected during past exercise sessions)
  • the Exertion Score may be used to measure the intensity of a user's exercise activity.
  • the Exertion Score may be a numerical score or value ranging from 0 to 10.
  • the table below provides example scores and their meaning:
  • the first example technique in Table 2 above for generating an exertion score can be performed in real-time, i.e. synchronously with the sensor data collection.
  • the first example technique determines the exertion score (ES) by multiplying the breaths per minute (BPM), as determined from the sensor data, with a rate of increase of the signal from the sensor when a user exhales on the sensor (OutSpeed). This is multiplied by a constant (Scaling) to generate an exertion score that is in a range between, for example, 0 to 100 or 0 to 10 (as in Table 1 above).
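  • The first example technique could be sketched as follows. This is a hypothetical Python illustration; the `scaling` constant shown is illustrative only (the source states only that a constant maps the product into a 0-10 or 0-100 range).

```python
def exertion_score_synchronous(bpm, out_speed, scaling=0.01):
    """First example technique: ES = BPM * OutSpeed * Scaling,
    clamped here to the 0-10 range used in Table 1."""
    return max(0.0, min(10.0, bpm * out_speed * scaling))
```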
  • the second example technique in Table 2 above for generating an exertion score can be performed both synchronously and asynchronously.
  • the second example is an extension of the first example technique in the table above.
  • the second technique determines the exertion score (ES) by applying to the breaths per minute (BPM) as determined from the sensor data, a function f(BPM) to generate an exertion score that is in a range between 0 to 10.
  • Figures 4A and 4B show, respectively, data used to determine a technique for generating an exertion score from sensor data in real-time (synchronously) or asynchronously.
  • Figure 4A shows breaths per minute against exertion score, as the f(BPM) scales the BPM between 0 and 10.
  • a breaths per minute value of 20 corresponds to an exertion score of 1 (little exertion), while a breaths per minute value of 50 corresponds to an exertion score of 8. From this, various models were trialled to determine which model best fit the data in the graph.
  • a Gaussian model was chosen and thus, the constants in Table 2 above (for technique 2) were obtained by fitting the Gaussian model to the data.
  • the function is termed f(BPM).
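Assuming a Gaussian model of the form a * exp(-((BPM - b)/c)^2), a sketch of f(BPM) fitted to the two anchor points quoted above (20 BPM giving a score of about 1, and 50 BPM a score of about 8) might look like the following. The constants below are illustrative fits to those two points only; the actual fitted constants of Table 2 are not reproduced here:

```python
import math

def f_bpm(bpm, a=10.0, b=63.6, c=28.7):
    """Gaussian model mapping breaths per minute onto a 0-10 exertion score.
    The constants a, b, c were fitted to the two anchor points quoted in the
    text; the device-specific Table 2 constants are assumed, not known.
    """
    return a * math.exp(-((bpm - b) / c) ** 2)
```

With these assumed constants, f_bpm(20) is close to 1 and f_bpm(50) is close to 8, matching the behaviour shown in Figure 4A.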
  • the third technique in Table 2 above generates an exertion score by normalising breaths per minute data obtained throughout a workout using the highest BPM value.
  • the normalised value is then inputted into the equation shown in the table above to calculate the exertion score. This process is described by Nicolo et al in “Respiratory Frequency during Exercise: The Neglected Physiological Measure” (Frontiers in Physiology, December 2017, Vol. 8, Article 922).
  • the fourth technique in Table 2 above can be used to generate an exertion score from sensor data asynchronously.
  • Figure 4C shows data relating to this technique. Specifically, the top graph shows a distribution plot of how much time a user spends at each BPM. A reason why high BPMs of 70 and 80 appear on the distribution plot is that a window of 10 seconds is being used: a person can easily breathe at what would be 70 BPM for 10 seconds, but maintaining this rate for 60 seconds or more would be much harder.
  • This means the scaling function, which is a function of the duration at each BPM, is also a function of the window size for which BPMs have been calculated.
  • the scaling coefficients may be calculated manually, or preferably, may be calculated using a machine learning model which has been trained on breathing data from many people who also provide their perceived exertion score.
  • the BPM distribution is then scaled, where higher BPMs are scaled-up and lower BPMs are scaled-down (as described above).
  • the distribution profile is replotted, as shown in Figure 4C (bottom figures).
  • the scaling function may be called f(BPM_duration, windowSize).
  • the area under the scaled distribution profile is then calculated as the exertion score. This area is determined by calculating the sum(f(BPM_duration, windowSize)*BPM_duration). For intense workouts, the area may have values reaching 90-100 (which is then scaled by a further 1/10 to be in the range of 0 to 10 as per Table 1 above), while for non-workouts the exertion score may be below 0 due to the negative scaling.
  • Negative scaling means that a user's resting periods during a workout, for example at around 10 to 15 BPM for some people, would be subtracted from the total exertion score. This means that if someone rests too long during exercise the exertion score will be lowered. If the overall exertion score is negative, it may be capped to zero. This may occur if an activity was recorded while the user was simply walking or sitting at a desk.
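The fourth technique above (building a BPM distribution, scaling it, taking the area under the scaled profile, with negative scaling for resting rates and a cap at zero) may be sketched as follows. The piecewise scaling function and its coefficients are illustrative assumptions standing in for f(BPM_duration, windowSize), which the source derives manually or via a trained model:

```python
def exertion_score_distribution(bpm_windows, window_size=10):
    """Fourth example technique (asynchronous): score a workout from the
    distribution of time spent at each BPM value.
    """
    # duration (seconds) spent at each BPM, from per-window BPM readings
    duration = {}
    for bpm in bpm_windows:
        duration[bpm] = duration.get(bpm, 0) + window_size

    def scale(bpm):
        # illustrative stand-in for f(BPM_duration, windowSize):
        # negative scaling for resting rates, positive for high rates
        if bpm <= 15:
            return -0.5               # resting periods subtract from the score
        if bpm <= 30:
            return 0.0
        return (bpm - 30) / 200.0     # higher BPMs are scaled up

    # area under the scaled distribution profile
    area = sum(scale(bpm) * secs for bpm, secs in duration.items())
    # further 1/10 scaling into the 0-10 range; negative totals capped at zero
    return max(area / 10.0, 0.0)
```

Under these assumed coefficients, an activity recorded while sitting at a desk (e.g. 12 BPM throughout) scores zero, while a sustained 50 BPM workout scores well above it.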
  • FIG. 5 shows a flowchart of example steps to determine an exertion score.
  • the process may begin by receiving breathing characteristic data after an exercise session has been completed (step S500).
  • a distribution profile may be calculated using the breathing characteristic data (step S502).
  • An algorithm or technique, such as one described above, may be used to determine a scaling factor (step S504).
  • the scaling factor may be used to scale the distribution profile, as described above and the scaled distribution profile may be used to determine an exertion score (S506).
  • the exertion score may be transmitted to the user or to a clinician or personal trainer, for example.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Physiology (AREA)
  • Signal Processing (AREA)
  • Pulmonology (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Psychiatry (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

Broadly speaking, embodiments of the present techniques provide a health monitoring system which comprises a sensor for sensing breathing of a user and a processor for smoothing the received sensor data to generate a breathing pattern.

Description

Systems and Methods for Analysing Breathing
The present techniques generally relate to systems, apparatus and methods for monitoring health, and in particular relate to monitoring health by sensing and analysing breathing.
Breathing is difficult to measure, but breathing rate has been described as one of the most sensitive and important indicators of the deterioration of patient health. However, generally speaking, in hospitals breathing rate is monitored by occasional visual assessment, e.g. by observing the rise and fall of a patient's chest for 30 seconds every 12 hours. As well as being time-consuming, qualitative and highly prone to human error, some medical cases where breathing rate, and changes in breathing rate, have not been observed have led to avoidable patient death.
Therefore, there is a desire to provide an improved system and method for monitoring health by sensing and analysing breathing.
Background information can be found in the following patent literature: GB2550833 which describes techniques for analysing breathing data of the type obtained using structured light plethysmography (i.e. by projecting a pattern of light onto a patient) to identify breaths representing a breathing pattern over time; WO2012/007719 which describes a method of identifying a breath by sensing a signal generated by a human; EP0956820 which describes breath analysis apparatus that comprises a spirometer and breath tube; US2013/079656 which describes extracting respiratory information from a photoplethysmograph signal; JP2019141597 which describes a signal processing device; and WO2014/128090 which describes a respiration monitoring system that can be adhered to a patient's torso.
In a first approach of the present techniques, there is provided a health monitoring system comprising: an apparatus comprising: a sensor for sensing breathing of a user using the apparatus, and a communication module for transmitting sensor data; and at least one remote processor for: receiving sensor data from the apparatus; and smoothing the received sensor data to generate a breathing pattern. The sensor data collected by the sensor may be noisy and may need to be processed (i.e. smoothed) in order to generate a breathing pattern that accurately reflects the user's breathing.
In a second approach of the present techniques, there is a method for health monitoring, comprising: sensing, using an apparatus, breathing of a user wearing the apparatus; generating a breathing pattern from the sensed data; and determining from the breathing pattern at least one breathing characteristic.
Preferred features are set out in the dependent claims and described below. Preferred features described below apply equally to both the first and second approach.
Once the breathing pattern has been generated, at least one breathing characteristic may be determined or derived from the breathing pattern.
The at least one remote processor may be located in any component in the system that is remote to the apparatus used/worn by the user. Multiple remote processors may be used to perform the smoothing of the sensor data (and to thereby generate a breathing pattern from the sensor data, and perform any other processing such as determining breathing characteristics), which may be located in the same or different components in the system. For example, the at least one remote processor may be in a user device and/or in a remote server. In some cases, one or more of the steps to smooth the sensor data may be performed by the processor(s) in the user device, and one or more steps may be performed by the processor(s) of the remote server.
The at least one remote processor may determine an indication of the health of the user from the at least one breathing characteristic.
The health monitoring system may be used in a variety of contexts. For example, the health monitoring system may be used by a user to monitor their own health. A user may monitor their breathing during exercise, in which case the indication of the health of the user may comprise information on the user's fitness. The indication of the health of the user may comprise information on whether the user's fitness has improved since the apparatus was last worn/used, or over a predetermined time (e.g. over the last 3 months). This may help the user to determine whether a new exercise regime is effective, for example. The fitness information could be used by a personal trainer to devise or modify an exercise regime for the user.
In another example, a user may monitor their breathing while resting or stationary to determine their health, lung capacity or lung health. In this case, the indication of the health of the user may comprise information on their lung capacity or lung health, and/or whether their lung capacity/health has improved since the apparatus was last worn/used, or over a predetermined time (e.g. over the last 3 months). This information may help the user or a doctor/physician to determine if the user's lung health is improving or deteriorating following an illness, a respiratory illness (such as that caused by COVID-19), disease or surgery, or following the user quitting smoking (or switching from cigarettes to e-cigarettes), or to monitor the health of a user with a chronic condition such as cystic fibrosis or chronic obstructive pulmonary disease (COPD).
In another example, a user admitted to a hospital may wear/use the apparatus so that doctors and nurses in the hospital may monitor the user's breathing more regularly, remotely and without human involvement. This advantageously increases the chances that changes in the user's breathing are identified and actioned early, and reduces the risk of human error.
The system may further comprise at least one user interface for displaying the indication of the health of the user. The user interface may be provided in any suitable manner or on any suitable device. For example, if the system is being used by a user to monitor themselves, the user interface may be the display screen of an electronic device used by the user, such as a computer or smartphone. If the system is being used by a hospital to monitor patient health, the user interface may be the display screen on hospital equipment, such as a computer at a nurses' station or a tablet holding an electronic patient record. In this case, the raw sensor data and/or the processed data (e.g. the breathing pattern) may be anonymised. The anonymised data may be associated with a Unique Identifier (UID), which may link the processed data to a patient's personal health records, which may be stored by the hospital in the hospital's own secure servers for patient data privacy.
Thus, the at least one user interface may be on a user device, and the indication of the health of the user may comprise information on the user's fitness. Additionally or alternatively, the at least one user interface may be on a device in a hospital, and the indication of the health of the user may comprise a warning that the user's health is deteriorating. This may advantageously enable hospital staff to take action sooner than if breathing is monitored by infrequent observation.
As mentioned above, on a general ward in a hospital, breathing rate may be monitored through visual assessment every 12 hours, while in intensive care units (ICUs), specialist capnography devices may be used to monitor the concentration or volume of carbon dioxide exhaled by a patient. Electronic techniques for measuring breathing are limited to piezoelectric sensors that measure chest wall movement, impedance measurements that measure changes in chest conductivity during inhalation and exhalation, and microphones to detect the sound of the lungs expanding and contracting. However, these techniques suffer from a low signal to noise ratio and may require significant filtering to accurately determine breathing rate.
In the present techniques, the remote processor(s) may smooth the received sensor data to generate a breathing pattern by: identifying a plurality of inflection points (i.e. turning points on a continuous plane curve) in the sensor data, and classifying each inflection point as a local maximum or a local minimum. This process may be called "peak detection". All of the inflection points in the data are detected/identified by determining if a data point N is greater or lesser in value than the data point that is immediately before data point N (i.e. N-1) and the data point that is immediately after data point N (i.e. N+1). In the case where data point N is greater than both data points N-1 and N+1, the data point N is identified as an inflection point and is classified as a local maximum (or peak). In the case where data point N is lesser than both data points N-1 and N+1, the data point N is identified as an inflection point and is classified as a local minimum (or trough). Each inflection point that has been identified and classified may be saved in storage.
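The neighbour-comparison rule described above may be sketched as follows (plateau points, which are neither strictly greater nor strictly lesser than both neighbours, are simply skipped in this sketch):

```python
def find_inflections(data):
    """Classify each interior point of a signal as a peak or trough by
    comparing it with its immediate neighbours (points N-1 and N+1).
    Returns a list of (index, value, kind) tuples, where kind is
    'peak' (local maximum) or 'trough' (local minimum).
    """
    inflections = []
    for n in range(1, len(data) - 1):
        if data[n] > data[n - 1] and data[n] > data[n + 1]:
            inflections.append((n, data[n], "peak"))       # local maximum
        elif data[n] < data[n - 1] and data[n] < data[n + 1]:
            inflections.append((n, data[n], "trough"))     # local minimum
    return inflections
```

For example, the sequence [0, 5, 1, 6, 2] yields a peak at index 1, a trough at index 2 and a peak at index 3.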
The at least one remote processor may determine whether each identified inflection point is indicative of a breathing pattern or of noise. If an inflection point is indicative of noise, it may need to be removed or ignored in order to smooth the sensor data and generate a breathing pattern that accurately reflects the user's breathing. (Preferably, the data point is simply ignored in subsequent processing). For example, if a consecutive peak and trough have a low amplitude, they may represent noise rather than breathing data and therefore need to be ignored when generating a breathing pattern from the sensor data. In other words, peaks and troughs that have low prominence may be removed. This process may be called "peak prominence detection" or "peak prominence threshold". Thus, the remote processor(s) may determine whether a distance between two adjacent inflection points, where one of the two inflection points is classified as a local maximum and another of the two inflection points is classified as a local minimum (i.e. the distance between a peak and a trough), is above a threshold distance. If the distance is below the threshold distance, the remote processor(s) may remove both of the two adjacent inflection points. The threshold distance may be any suitable value which indicates that the amplitude or distance between a peak and trough is not representative of a breath. For example, in cases where analogue sensor values (e.g. voltages) have been converted into digital values, the threshold distance may be 1000. More specifically, if a sensor's values range from 0V to 3.3V, the digital values may range from 0 to 65535 if a 16-bit ADC has been used for the conversion, and the threshold distance may be 1000. In other words, the distance between a successive peak and trough must be more than 1000. 
This may be a reasonable threshold distance, since in a normal breathing pattern successive peaks and troughs are usually separated by 10,000, and even if a user takes shallow breaths, the distance is more than 1000.
It will be understood that this is an example, non-limiting threshold distance. It will be understood that the threshold distance may vary depending on, for example, the sensor and external/environmental conditions. Thus, more generally, the threshold distance value may be calculated based on an initial calibration of the sensor (to account for variability between sensors), and then modified/adjusted based on sensor data to account for environmental changes. That is, the threshold distance may be based on a calibration performed for each sensor, such that the threshold distance may vary for individual sensors.
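The peak prominence threshold described above may be sketched as follows, using the example threshold of 1000 digital units from the 16-bit ADC example. Removing a pair can leave two same-kind neighbours adjacent; those are handled by the consecutive peaks/troughs elimination also described:

```python
def prominence_filter(inflections, threshold=1000):
    """Remove adjacent peak/trough pairs whose amplitude difference is below
    `threshold`; such low-prominence pairs are treated as noise rather than
    breaths. `inflections` is a list of (index, value, kind) tuples.
    """
    points = list(inflections)
    changed = True
    while changed:
        changed = False
        for i in range(len(points) - 1):
            (_, v1, k1), (_, v2, k2) = points[i], points[i + 1]
            if k1 != k2 and abs(v1 - v2) < threshold:
                del points[i:i + 2]  # remove both members of the noisy pair
                changed = True
                break                # restart the scan over the shorter list
    return points
```

For example, a peak at 20000 followed by a trough at 19500 (a separation of only 500) would be removed as noise, while pairs separated by around 10000, as in a normal breathing pattern, survive.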
The remote processor(s) may also consider whether two breaths are too close together in time to be representative of a real breathing pattern. For example, if two peaks or two troughs are close together, they may not represent a real breathing pattern, as the short time between the successive peaks/troughs means there is not enough time for inhalation or exhalation. Thus, the remote processor(s) may determine whether a time between two successive inflection points each classified as a local maximum (i.e. two adjacent peaks), or between two successive inflection points each classified as a local minimum (i.e. two adjacent troughs), is less than a predetermined time. When the time is less than the predetermined time, one of the two inflection points may be removed by the remote processor(s) so that it is not used to generate the breathing pattern. This process may be called "peak separation analysis" or "peak distance analysis". However, it is important to choose an appropriate predetermined time so as not to lose real breathing pattern data. For example, if a user is breathing rapidly or hyperventilating, then the user's breaths may be naturally close together. The predetermined time may be 0.6 seconds, which would equate to 100 breaths in a minute, or 0.7 seconds, which would equate to 86 breaths per minute. Such high breathing rates never occur in humans, and these thresholds may therefore be sufficient to remove peaks that are not representative of a real breathing pattern while also catching rapid breathing/hyperventilation/tachypnoea.
Additionally or alternatively, the peak separation analysis may take into account information on the activity that the user is performing/undertaking while using the apparatus. That is, knowledge of this activity (e.g. sitting, running, walking, etc.) could be used to inform the peak separation analysis, as different activities may be associated with different peak separations (or numbers of breaths per minute). This means knowledge of the different breathing rates associated with different activities could be used to perform the peak separation analysis. A breathing pattern should be an alternating sequence of peaks and troughs, i.e. an alternating sequence of inhalation and exhalation. Thus, if two adjacent inflection points are both classified as a local maximum or as a local minimum, such that there are two peaks next to each other without a trough in-between, or two troughs without a peak in-between, then the sensor data does not represent an alternating sequence of inhalation and exhalation. Accordingly, the remote processor(s) may generate a breathing pattern by identifying consecutive inflection points that are both classified as a local maximum or as a local minimum, and removing one of the two consecutive inflection points. This process may be called "consecutive peaks or troughs detection or elimination".
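The peak separation analysis and consecutive peaks/troughs elimination described above may be sketched together as follows. Which of the two offending points to drop is a design choice left open by the source; this sketch keeps the more extreme of two consecutive same-kind points, and drops a point arriving too soon after the previous point of the same kind:

```python
def clean_sequence(inflections, min_separation=0.6):
    """Enforce an alternating peak/trough sequence and a minimum time
    between successive same-kind inflection points. `inflections` is a
    time-ordered list of (time_seconds, value, kind) tuples; 0.6 s
    corresponds to the 100 breaths-per-minute bound discussed above.
    """
    cleaned = []
    for point in inflections:
        t, v, kind = point
        if cleaned and cleaned[-1][2] == kind:
            # two peaks (or troughs) in a row: keep the more extreme one
            if (kind == "peak" and v > cleaned[-1][1]) or \
               (kind == "trough" and v < cleaned[-1][1]):
                cleaned[-1] = point
            continue
        if len(cleaned) >= 2 and t - cleaned[-2][0] < min_separation:
            # previous same-kind point is too close in time: implausibly
            # fast breathing, so drop this point as noise
            continue
        cleaned.append(point)
    return cleaned
```

In the cleaned output, peaks and troughs strictly alternate, and same-kind points are at least `min_separation` seconds apart.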
It will be understood that the breathing pattern may be generated in real time. That is, the breathing pattern may be generated as soon as at least two peaks and a trough are received, or two troughs and one peak are received. In some cases, a single peak and trough (or trough and peak) may be sufficient to generate the breathing pattern. It will be understood that this enables real-time analysis to be performed. More data (e.g. more peaks and troughs) may enable a more accurate breathing pattern to be produced, and more accurate breathing characteristics (see below) to be determined. Thus, a breathing pattern and breathing characteristic may be generated/determined in real-time but may also change in real-time as more data is received.
Once a breathing pattern has been generated, the remote processor(s) may determine at least one breathing characteristic from the generated breathing pattern. There are a number of breathing characteristics that may be derived from the breathing pattern, and which may be used to provide feedback on the user's health or fitness. For example, the at least one breathing characteristic may be any one or more of: inhalation speed, exhalation speed, inhalation to exhalation ratio, number of breaths per minute (which could be used to detect hyperventilation, hypocapnia, hypoventilation, hypercapnia, etc.), average breathing rate when wearing a resistive sports mask or resistive respiratory muscle training device (which may depend on the restriction level of the resistive sports mask), exertion score, and depth or volume of inhalation or exhalation (which may be indicative of lung capacity or fitness). There may be a correlation between inhalation/exhalation speed, inhalation time and exhalation time, and flow rate, and therefore the depth or volume of inhalation/exhalation.
When the breathing characteristic is inhalation speed, the remote processor(s) may determine the inhalation speed from the breathing pattern by: measuring a time between an inflection point classified as a local maximum and a subsequent inflection point classified as a local minimum; and dividing a distance between the inflection point classified as a local maximum and the subsequent inflection point classified as a local minimum by the measured time. In the example data shown in Figures 3A to 3D, the sensor measures conductivity as a function of time. The conductivity may represent, in some cases, the changes in humidity as a user of the apparatus breathes in and out. Thus, the decrease in conductivity over time between a peak and a trough may be indicative of a decrease in humidity over time as the user takes a breath (i.e. the inhalation speed). The conductivity may represent, in some cases, the changes in temperature and/or change in pressure as a user breathes in and out. For example, an increase in pressure or temperature may occur on exhalation, and a decrease in pressure or temperature may occur on inhalation. Exactly what the conductivity represents may depend on the type of sensor used.
When the breathing characteristic is exhalation speed, the remote processor(s) may determine the exhalation speed from the breathing pattern by: measuring a time between an inflection point classified as a local minimum and a subsequent inflection point classified as a local maximum; and dividing a distance between the inflection point classified as a local minimum and the subsequent inflection point classified as a local maximum by the measured time. In the example data shown in Figures 3A to 3D, the sensor measures conductivity as a function of time. The conductivity may represent, in some cases, the changes in humidity as a user of the apparatus breathes in and out. Thus, the increase in conductivity over time between a trough and a peak may be indicative of an increase in humidity over time as the user exhales (i.e. the exhalation speed). The conductivity may represent, in some cases, the changes in temperature and/or change in pressure as a user breathes in and out. Exactly what the conductivity represents may depend on the type of sensor used. When the breathing characteristic is a ratio of inhalation to exhalation, the remote processor(s) may determine the ratio from the breathing pattern by: dividing the inhalation speed by the exhalation speed, or preferably by dividing the inhalation time by the exhalation time.
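The inhalation speed, exhalation speed and inhalation-to-exhalation ratio computations described above may be sketched as follows, taking three consecutive inflection points. As in the conductivity traces of Figures 3A to 3D, inhalation is assumed to run peak to trough (falling signal) and exhalation trough to peak (rising signal):

```python
def breath_speeds(peak, trough, next_peak):
    """Compute inhalation speed, exhalation speed and the inhalation to
    exhalation ratio from three consecutive inflection points, each given
    as a (time_seconds, value) pair. The ratio is computed from times,
    the preferred approach noted above.
    """
    (t_peak, v_peak), (t_trough, v_trough), (t_next, v_next) = peak, trough, next_peak
    inhal_time = t_trough - t_peak        # time from peak to trough
    exhal_time = t_next - t_trough        # time from trough to next peak
    inhal_speed = (v_peak - v_trough) / inhal_time   # distance / time
    exhal_speed = (v_next - v_trough) / exhal_time
    ie_ratio = inhal_time / exhal_time
    return inhal_speed, exhal_speed, ie_ratio
```

For example, a fall of 20000 units over 2 seconds gives an inhalation speed of 10000 units/second, and a subsequent rise of 18000 units over 1 second gives an exhalation speed of 18000 units/second, for an inhalation-to-exhalation time ratio of 2.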
When the breathing characteristic is a breathing rate, the remote processor(s) may determine the breathing rate from the breathing pattern by: determining the number of inflection points classified as a local maximum in a minute. Determining the number of inflection points classified as a local maximum in a minute may comprise determining, when the breathing pattern is longer than a minute, an average number of inflection points classified as a local maximum in a minute. Alternatively, determining the number of inflection points classified as a local maximum in a minute may comprise extrapolating, when the breathing pattern is less than a minute, the breathing rate based on the number of inflection points in a duration of the breathing pattern.
Additionally or alternatively, the breathing rate may be extrapolated at a resolution of half a breath. However, a half-breath resolution may not be useful for calculating breathing rate, since inhalation and exhalation times are often different. Nevertheless, a half-breath resolution may be useful to detect rapid changes in breathing rate that may be clinically relevant (e.g. coughing or a panic attack). These rapid changes would be averaged out if using a BPM calculated over a one-minute window. For example, if a person is coughing, their BPM may be 60 or more using a half-breath resolution, but that is not indicative of their average BPM - rather, it is indicative of a specific, short-time event. A resolution of one breath may be the smallest resolution from which a breathing rate can be accurately calculated. This is calculated using the time between two peaks or the time between two troughs. A breathing rate calculated using a one-breath resolution may also need to be used carefully, as it cannot simply be averaged to obtain the average breathing rate over a specific time interval - this usually requires the above-described technique (i.e. counting the number of breaths at a resolution of half a breath, and then dividing by the time interval to get the breathing rate). When the breathing characteristic is inhalation depth, the remote processor(s) may determine the inhalation depth from the breathing pattern by: averaging a distance between an inflection point classified as a local maximum and a subsequent inflection point classified as a local minimum.
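The one-breath-resolution rate calculation described above, with extrapolation for recordings shorter than a minute, may be sketched as follows from the timestamps of inflection points classified as peaks:

```python
def breaths_per_minute(peak_times):
    """Breathing rate (BPM) from peak timestamps in seconds, counting
    breaths peak-to-peak (one full breath per pair of successive peaks)
    and extrapolating from the recording duration.
    """
    if len(peak_times) < 2:
        raise ValueError("need at least two peaks for one full breath")
    duration = peak_times[-1] - peak_times[0]
    breaths = len(peak_times) - 1  # breaths counted between successive peaks
    return breaths * 60.0 / duration
```

For example, peaks at 0, 4, 8 and 12 seconds give three breaths in 12 seconds, extrapolating to 15 breaths per minute.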
When the breathing characteristic is exhalation depth, the remote processor(s) may determine the exhalation depth from the breathing pattern by: averaging a distance between an inflection point classified as a local minimum and a subsequent inflection point classified as a local maximum. Exhalation depth may enable short breaths and long breaths to be identified, and may enable shallow breaths and deep breaths to be identified.
The at least one user interface may display information in addition to the breathing characteristic or indication of the health or fitness of the user. For example, the user interface may display one or more of: a total number of hours the user has worn the apparatus, an exertion score, an indication of the user's lung function, and information on whether the sensor needs to be replaced.
The remote processor(s) may be arranged to: compare sensor data received from the apparatus over a predetermined time period; and determine whether the accuracy of the sensor has changed over the predetermined time period. Thus, the remote server may be able to identify any changes in the sensitivity or accuracy of the sensor over time, by considering whether, for example, the maximum and minimum values sensed by the sensor have changed over time. The remote server may be able to send a message to a user device to indicate to the user (or to a hospital staff member or administrator) that the sensor needs to be replaced.
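One way to compare sensor data over a predetermined time period, as described above, is to compare the sensor's dynamic range (maximum minus minimum sensed value) between an early and a recent window of readings. The 20% shrinkage threshold below is an illustrative assumption, not a value specified in the source:

```python
def sensor_drifted(early_window, late_window, tolerance=0.2):
    """Flag a sensor for replacement when its dynamic range (max - min)
    has shrunk by more than `tolerance` between an early window of
    readings and a recent one, suggesting a loss of sensitivity.
    """
    early_range = max(early_window) - min(early_window)
    late_range = max(late_window) - min(late_window)
    return late_range < early_range * (1 - tolerance)
```

If the sensor is flagged, a message may then be sent to the user device indicating that the sensor needs to be replaced.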
The apparatus may further comprise an accelerometer to sense movement of the user while the user is wearing the apparatus. The accelerometer data may be transmitted to the remote server along with the sensor data. The accelerometer data may contribute to the generation of a breathing pattern and/or the determination of the at least one breathing characteristic. For example, the accelerometer data may be mapped to or matched to the generated breathing pattern, which may enable analysis of the user's health or fitness while sedentary, walking and/or exercising to be determined. This may enable information to be provided to the user on how their exercise regime could be changed to improve their health, fitness or performance. Thus, the remote processor(s) may use data from the accelerometer to generate the breathing pattern and determine the at least one breathing characteristic.
The remote server may use additional input data to generate the breathing pattern and determine the at least one breathing characteristic. For example, the additional input data may comprise one or more of: geographical location of the user, altitude data, weather data, humidity data, air quality index, pollution data, pollen data, and oxygen level. The additional input data may be obtained by the remote server from external sources or third party sources. For example, the weather data may be obtained from a national weather service provider (such as the Met Office in the UK), while altitude data may be obtained from a map provider (such as via open APIs).
The remote processor(s) may determine a baseline humidity using the humidity data and may use the baseline humidity to generate the breathing pattern. Knowing the humidity of the environment in which the user is located may enable the breathing pattern to be generated more accurately. Similarly, other data such as pressure and external/environmental temperature may be used.
The geographical location or altitude data may enable the user's breathing pattern to be analysed in the context of the air pressure in their environment. For example, if a user who normally lives close to sea level is wearing the apparatus in the mountains or at a higher altitude where the air pressure is lower, the user's breathing pattern may indicate that they are breathing at a higher rate or taking deeper breaths. However, by knowing that the user is at a higher altitude, the change in their breathing rate may be considered acceptable and not a cause for concern. However, if the user's breathing rate increased while they were at home/close to sea level, then the change may be considered a cause for concern.
The sensor of the apparatus may be any one of: a thermistor, a humidity sensor, a gas sensor, a pressure sensor, a microphone, a sound sensor/detector, and a sensor comprising a porous material. It will be understood that this is an example, non-exhaustive list of possible sensors that are suitable for sensing breathing. It will also be understood that an apparatus may comprise more than one sensor, and that the sensors may be the same or different.
The apparatus may be any one of: a wearable apparatus, a resistive sports mask, an oxygen deprivation mask, an apparatus worn over a user's mouth and/or nose, a medical breath monitoring apparatus, a face mask, a disposable face mask, a personal protection equipment face mask, a surgical mask, an oxygen mask, an inhaler, an asthma inhaler, an e-cigarette, a heat moisture exchanger, and a nasal cannula. It will be understood that this is an example, non-exhaustive list of possible types of apparatus that could be used to sense breathing and monitor user health. Generally speaking, the apparatus may be any device which is able to be placed in the proximity of exhaled air or which can receive exhaled air (e.g. via tubes that direct exhaled air from the user to the apparatus).
The sensor and communication module may be removably attached to the apparatus. This may be useful if the apparatus is a disposable device such as a disposable mask, or a reusable device that is washed for reuse, such as a washable mask. The sensor and communication module may be removed before the apparatus is disposed of (enabling reuse of the sensor and communication module), or before the apparatus is washed. Alternatively, the sensor and communication module may be irremovably attached to the apparatus. This may be achieved by integrating the sensor and communication module into the apparatus, by any suitable means.
The remote processor may: determine an indication of the health of the user from the at least one breathing characteristic; and transmit the indication of the health of the user to any one or more of: a user device, a third party device, and a third party server.
The remote processor may be configured to: transmit the received sensor data to a third party device or third party server. The remote processor may be configured to: transmit the generated breathing pattern to a third party device or third party server.
The transmission in any case to any device may be in real-time. This may advantageously enable the user or a third party to see real-time data for a user, which could be particularly useful for clinicians or in a hospital.
The remote processor may be configured to: receive sensor data from a third party server for analysis; smooth the received sensor data from the third party server to generate a breathing pattern; and transmit the generated breathing pattern to the third party server. That is, third parties may send sensor data to be processed by the remote processor.
The method (e.g. performed by the remote processor) may further comprise: providing, to a user device, breathing exercises for the user to follow while using the apparatus; and analysing, using the received sensor data, user performance while undertaking the breathing exercises.
The method (e.g. performed by the remote processor) may further comprise determining an exertion score by: calculating, using the breathing characteristic, a distribution profile of the breathing characteristic; determining a scaling factor; scaling the distribution profile using the scaling factor to generate a scaled distribution profile; and determining the exertion score using the scaled distribution profile. The breathing characteristic may be breathing rate (i.e. breaths per minute or BPM).
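By way of illustration only, the exertion-score steps above might be sketched as follows. The distribution profile is reduced here to a mean and spread, and the resting/maximal breathing-rate range used to derive the scaling factor is an assumed parameterisation for illustration, not part of the disclosure:

```python
def exertion_score(bpm_samples, rest_bpm=12.0, max_bpm=40.0):
    """Hypothetical sketch: build a simple distribution profile (mean and
    spread) of the breathing rate, scale it using an assumed resting/maximal
    BPM range, and map the scaled value onto a 0-100 exertion score."""
    n = len(bpm_samples)
    mean = sum(bpm_samples) / n
    var = sum((x - mean) ** 2 for x in bpm_samples) / n
    span = max_bpm - rest_bpm                 # 100/span acts as the scaling factor
    score = (mean - rest_bpm) / span * 100.0  # scaled distribution centre
    return max(0.0, min(100.0, score)), var ** 0.5
```

A breathing rate held steadily midway between the assumed resting and maximal rates would therefore score 50, with zero spread.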
The breathing characteristic may be a breath depth. The method may comprise determining the breath depth from the breathing pattern by: determining an average breath depth for a predetermined number of breaths; comparing a depth of a subsequent breath to the average breath depth; and classifying the subsequent breath as a shallow breath when the depth of the subsequent breath is below the average breath depth by a threshold value, or as a deep breath when the depth of the subsequent breath is above the average breath depth by a threshold value. The method may further comprise providing the user, using a total number of breaths recorded using the apparatus, with an indication of when to replace the sensor of the apparatus. This may be advantageous as it may indicate, ahead of the end of the lifetime of the sensor, when the sensor needs to be replaced (if it is replaceable), or when a new apparatus needs to be used.
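By way of illustration only, the breath-depth classification above might be sketched as follows; the use of a fractional margin for the threshold value is an assumption for illustration:

```python
def classify_breath_depth(depths, window=5, threshold=0.2):
    """Hypothetical sketch: compare each breath against the average depth of
    the previous `window` breaths. Breaths more than `threshold` (a fractional
    margin, an assumed parameterisation) below the average are 'shallow',
    those above it by the same margin are 'deep', otherwise 'normal'."""
    labels = []
    for i in range(window, len(depths)):
        avg = sum(depths[i - window:i]) / window  # rolling average breath depth
        if depths[i] < avg * (1 - threshold):
            labels.append('shallow')
        elif depths[i] > avg * (1 + threshold):
            labels.append('deep')
        else:
            labels.append('normal')
    return labels
```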
In a related approach of the present techniques, there is provided a (non-transitory) computer readable medium carrying processor control code which, when implemented in a system, causes the system to carry out any of the methods, processes and techniques described herein.
As will be appreciated by one skilled in the art, the present techniques may be embodied as a system, method or computer program product. Accordingly, present techniques may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects.
Furthermore, the present techniques may take the form of a computer program product embodied in a computer readable medium having computer readable program code embodied thereon. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present techniques may be written in any combination of one or more programming languages, including object oriented programming languages and conventional procedural programming languages. Code components may be embodied as procedures, methods or the like, and may comprise sub-components which may take the form of instructions or sequences of instructions at any of the levels of abstraction, from the direct machine instructions of a native instruction set to high-level compiled or interpreted language constructs. Embodiments of the present techniques also provide a non-transitory data carrier carrying code which, when implemented on a processor, causes the processor to carry out any of the methods described herein.
The techniques further provide processor control code to implement the above-described methods, for example on a general purpose computer system or on a digital signal processor (DSP). The techniques also provide a carrier carrying processor control code to, when running, implement any of the above methods, in particular on a non-transitory data carrier. The code may be provided on a carrier such as a disk, a microprocessor, CD- or DVD-ROM, programmed memory such as non-volatile memory (e.g. Flash) or read-only memory (firmware), or on a data carrier such as an optical or electrical signal carrier. Code (and/or data) to implement embodiments of the techniques described herein may comprise source, object or executable code in a conventional programming language (interpreted or compiled) such as C, or assembly code, code for setting up or controlling an ASIC (Application Specific Integrated Circuit) or FPGA (Field Programmable Gate Array), or code for a hardware description language such as Verilog (RTM) or VHDL (Very high speed integrated circuit Hardware Description Language). As the skilled person will appreciate, such code and/or data may be distributed between a plurality of coupled components in communication with one another. The techniques may comprise a controller which includes a microprocessor, working memory and program memory coupled to one or more of the components of the system.
It will also be clear to one of skill in the art that all or part of a logical method according to embodiments of the present techniques may suitably be embodied in a logic apparatus comprising logic elements to perform the steps of the above-described methods, and that such logic elements may comprise components such as logic gates in, for example a programmable logic array or application-specific integrated circuit. Such a logic arrangement may further be embodied in enabling elements for temporarily or permanently establishing logic structures in such an array or circuit using, for example, a virtual hardware descriptor language, which may be stored and transmitted using fixed or transmittable carrier media. In an embodiment, the present techniques may be implemented using multiple processors or control circuits. The present techniques may be adapted to run on, or integrated into, the operating system of an apparatus.
In an embodiment, the present techniques may be realised in the form of a data carrier having functional data thereon, said functional data comprising functional computer data structures to, when loaded into a computer system or network and operated upon thereby, enable said computer system to perform all the steps of the above-described method.
Implementations of the present techniques will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 shows a schematic diagram of a health monitoring system;
Figure 2 shows a flowchart of example steps performed by the health monitoring system of Figure 1;
Figure 3A shows example sensor data sensed by the health monitoring system of Figure 1 after peak detection has been performed;
Figure 3B shows the sensor data of Figure 3A after peak prominence detection has been performed;
Figure 3C shows the sensor data of Figure 3B after peak separation analysis has been performed;
Figure 3D shows an example generated breathing pattern;
Figures 4A and 4B show, respectively, data used to determine a technique for generating an exertion score from sensor data in real-time (synchronously) or asynchronously;
Figure 4C shows data used to determine another technique for generating an exertion score from sensor data asynchronously; and
Figure 5 shows a flowchart of example steps to determine an exertion score.
Broadly speaking, embodiments of the present techniques provide a health monitoring system which uses user breathing data to determine a user's or patient's health.
Figure 1 shows a schematic diagram of a health monitoring system 100. The system 100 comprises an apparatus 102 and a remote processor(s) 120. The apparatus 102 may be a wireless apparatus or a wired apparatus. In other words, the apparatus 102 may be capable of wirelessly transmitting data to another device, or may need a wired connection to transmit data to another device. The apparatus 102 may comprise at least one sensor 104 for sensing breathing of a user wearing the apparatus 102. The at least one sensor 104 may be any one of: a thermistor, a humidity sensor, a gas sensor, a pressure sensor, a microphone, a sound sensor, and a sensor comprising a porous material. An example of a sensor comprising a porous material can be found in International Patent Publication No. WO2016/065180 and US Patent No. US10712337. It would be understood that this is an example, non-exhaustive list of possible sensors that are suitable for sensing breathing. It will also be understood that an apparatus may comprise more than one sensor, and that the sensors may be the same or different.
The apparatus 102 may comprise a communication module 106 for transmitting sensor data. The data collected by the sensor 104 may be transmitted to an external device or server for storage and analysis. This may be advantageous because the apparatus 102 may not have the processing power or capacity to analyse the data, and/or the storage capacity to store large quantities of data. The data collected by the sensor 104 may be transmitted periodically to an external device/server, such as every second, or every few seconds, or at any suitable frequency such as, but not limited to, 12.5Hz or 20Hz. Alternatively, data collected by the sensor 104 may be transmitted at longer intervals or at irregular times in certain circumstances. For example, if the apparatus 102 is not within range to be able to communicate with an external device (e.g. is not within the range for Bluetooth (RTM) or WiFi communication), the communication module 106 may not transmit any sensor data until the external device is determined to be within range. In certain embodiments therefore, the apparatus 102 may have storage or memory 132 to temporarily store sensor data collected by the sensor 104 when real-time communication to an external device is not possible. The storage 132 may comprise a volatile memory, such as random access memory (RAM), for use as temporary memory. The storage 132 may comprise non-volatile memory such as Flash, read only memory (ROM), or electrically erasable programmable ROM (EEPROM), for storing data, programs, or instructions, for example.
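By way of illustration only, the temporary buffering of sensor data while an external device is out of range might be modelled as follows; this is a hypothetical host-side sketch, whereas a real communication module 106 and storage 132 would implement the behaviour in firmware:

```python
from collections import deque

class BufferedSender:
    """Hypothetical sketch: buffer samples while the link is down and flush
    them, in order, once the external device is back within range."""

    def __init__(self, capacity=1024):
        # Bounded buffer: oldest samples are dropped if capacity is exceeded.
        self.buffer = deque(maxlen=capacity)

    def on_sample(self, sample, link_up, transmit):
        if link_up:
            # Flush any samples stored while communication was not possible.
            while self.buffer:
                transmit(self.buffer.popleft())
            transmit(sample)
        else:
            self.buffer.append(sample)
```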
The data collected by the sensor 104 may be communicated/transmitted to the remote server 120 directly, or via an intermediate device. Communicating the sensor data to an intermediate device may be preferable due to the communication capability of the communication module 106. For example, if the communication module 106 is only capable of short range communication, then the sensor data may need to be transmitted to an intermediate device which is capable of transmitting the sensor data to the remote server. In some cases, the communication module 106 may be able to communicate directly with the remote server. In some cases, the intermediate device may process the sensor data into a format or into processed data that the remote server can handle.
The communication module 106 may be able to communicate using a wireless communication protocol, such as WiFi, hypertext transfer protocol (HTTP), a wireless mobile telecommunication protocol, short range communication such as radio frequency communication (RFID) or near field communication (NFC), or by using the communication protocols specified by ZigBee, Thread, Bluetooth, Bluetooth LE, Z-Wave, IPv6 over Low Power Wireless Standard (6LoWPAN), Long Range Wide Area Network (LoRaWAN), Low-power Wide-area network (LPWAN), Constrained Application Protocol (CoAP), SigFox, or WiFi-HaLow. The communication module 106 may use a wireless mobile (cellular) telecommunication protocol to communicate with remote machines, e.g. 3G, 4G, 5G, etc. The communication module 106 may use a wired communication technique to transfer sensor data to an intermediate/external device, such as via metal cables (e.g. a USB cable) or fibre optic cables. The communication module 106 may use more than one communication technique to communicate with other components in the system 100.
The apparatus 102 may comprise a processor or processing circuitry 108. The processor 108 may control various processing operations performed by the apparatus 102, such as communicating with other components in system 100. In some cases, the processor 108 of the apparatus 102 may simply control the operation of the sensor 104, communication module 106 and storage 132. In other cases, the processor 108 may have some further processing capability. For example, the processor 108 may comprise processing logic to process data (e.g. the sensor data collected by sensor 104), and generate output data/signals/messages in response to the processing. The processor 108 may be able to compress the sensor data, for example, to reduce the size of the data that is being transmitted to another device. The processor 108 may comprise one or more of: a microprocessor, a microcontroller, and an integrated circuit.
The apparatus 102 may optionally comprise an accelerometer 133 to sense movement of the user while the user is wearing the apparatus 102. The accelerometer data may be transmitted to an external device along with the sensor data.
Apparatus 102 may optionally comprise an interface 138 for providing feedback on a user's breathing. For example, the interface 138 may be one or more LEDs or other lights which may turn on and off according to the generated breathing pattern. This may provide a visual indicator to a user of the apparatus or a third party (e.g. a doctor or personal trainer) of the generated breathing pattern.
As mentioned above, the system 100 may comprise a remote server 120 for performing one or more of the steps involved in smoothing the sensor data received from the apparatus 102. Thus, the apparatus 102 may transmit sensor data to the remote server 120. The remote server 120 may then generate a breathing pattern from the sensor data, and determine from the breathing pattern at least one breathing characteristic. The remote server 120 may comprise at least one processor 123 and storage 122. Storage 122 may comprise a volatile memory, such as random access memory (RAM), for use as temporary memory. The storage 122 may comprise non-volatile memory such as Flash, read only memory (ROM), or electrically erasable programmable ROM (EEPROM), for storing data (such as the sensor data received from the apparatus 102), programs, or instructions, for example.
The remote server 120 may create visualisations and plots of sensor data, and may send these to a user device or any third party device for visualisation purposes.
The system 100 may comprise a user device 110. The user device 110 may be any type of electronic device, such as, for example, a smartphone, a mobile computing device, a laptop, tablet or desktop computer, or a mobile or portable electronic device. The user device 110 may be a dedicated user device that is specifically for use with the apparatus 102. Alternatively, the user device 110 may be a non-dedicated user device, such as a smartphone. In either case, the user device 110 may comprise a software application ('app') 112 which is associated with the system 100. The app 112 may be launched or run when the user puts on the apparatus 102. For example, when the user is about to begin exercising, the user may put on the apparatus 102 and run the app 112. The app 112 may comprise a 'record' or 'start' function, which the user may press/engage when they want to start measuring their breathing using the apparatus 102. The app 112 may communicate with the apparatus 102 to instruct the sensor 104 to begin sensing and/or to instruct the communication module 106 to begin transmitting sensor data. Additionally or alternatively, when the user presses 'record' or 'start' on the app 112, the app 112 may prepare to receive sensor data from the apparatus 102. The app 112 may display the sensor data as it is received from the apparatus 102. Additionally or alternatively, the app 112 may display the generated breathing pattern produced by the remote server 120.
The user device 110 may comprise a user interface 114 to display, for example, the app 112, sensor data, generated breathing pattern, and/or any other information. The user interface 114 may be the display screen of a smartphone for example. The user device 110 may comprise a processor 116 to control various processing operations performed by the user device, such as communicating with other components in system 100. The processor 116 may comprise one or more of: a microprocessor, a microcontroller, and an integrated circuit.
The user device 110 may comprise a communication module 118. The communication module 118 may receive the sensor data from the communication module 106 of the apparatus 102. The communication module 118 may be able to communicate with the remote server 120, i.e. to transmit the received sensor data to the remote server 120 for processing/analysis. The communication module 118 may be able to receive data from the remote server 120. For example, the communication module 118 may receive a generated breathing pattern in real-time, near real-time or after the sensing performed by sensor 104 has ended. The generated breathing pattern may be displayed on the user interface 114 (e.g. via app 112). That is, in some cases, sensor data may be transmitted from the apparatus 102 in real-time (e.g. every second as it is being sensed or at a frequency of 12.5Hz) to the user device 110. The user device 110 may transmit the sensor data to the remote server, and receive a generated breathing pattern back from the remote server 120, which the user device 110 may display. The user device 110 may also receive, for example, the at least one breathing characteristic from the remote server 120 as the characteristic is determined, and may also display the at least one breathing characteristic. It may be possible for the user of user device 110 to see the raw sensed data on the user device and see how, in real-time, the remote server is generating the breathing pattern from the raw data.
Thus, the communication module 118 may have at least the same communication capability as the communication module 106 of the apparatus 102, and the remote server 120. The communication module 118 may use the same or different communication techniques or protocols to communicate with the communication module 106 and remote server 120. The communication module 118 may be able to communicate using a wireless communication protocol, such as WiFi, hypertext transfer protocol (HTTP), a wireless mobile telecommunication protocol, short range communication such as radio frequency communication (RFID) or near field communication (NFC), or by using the communication protocols specified by ZigBee, Thread, Bluetooth, Bluetooth LE, Z-Wave, IPv6 over Low Power Wireless Standard (6LoWPAN), Long Range Wide Area Network (LoRaWAN), Low-power Wide-area network (LPWAN), Constrained Application Protocol (CoAP), SigFox, or WiFi-HaLow. The communication module 118 may use a wireless mobile (cellular) telecommunication protocol to communicate with remote machines, e.g. 3G, 4G, 5G, etc. The communication module 118 may use a wired communication technique to receive sensor data from the apparatus 102, such as via metal cables (e.g. a USB cable) or fibre optic cables. The communication module 118 may use more than one communication technique to communicate with other components in the system 100.
Figure 1 shows system 100 as having a single remote server 120. It will be understood that the system 100 may have multiple servers 120. One or more of the servers 120 may be used to collect, process and store data collected from multiple apparatuses 102. One or more of the servers 120 may be private servers or dedicated servers to ensure the sensor data is stored securely. For example, if apparatuses 102 are used in a hospital, it may be preferable for the sensor data to be collected, processed and stored by a dedicated server within the hospital, to ensure patient privacy and data confidentiality. In this case, the system 100 may comprise a router 128 for receiving sensor data from each apparatus 102 within the hospital and transmitting this to the dedicated server 120.
If the system 100 is being used by a hospital to monitor patient health, the system 100 may comprise a user interface 126 (e.g. a display screen) on hospital equipment or a third party device 124, such as a computer at a nurses' station or a tablet holding an electronic patient record, or a device belonging to a clinician or physiotherapist, or a device belonging to a personal trainer. This enables the generated breathing pattern to be displayed or recorded on, for example, hospital device 124, rather than on a user device or on another device which may leave the hospital. This may again ensure that the patient data is kept secure within the hospital itself.
System 100 may be used by a personal trainer to monitor user health, and in this case, the third party device 124 may belong to the personal trainer. The personal trainer may be able to see the user's breathing pattern during a personal training session. Similarly, the third party device 124 may be able to display, in real-time, group workout data, i.e. breathing patterns for multiple users in a group exercise session. The group workout data could be used to provide live dashboards showing a ranking, based on the breathing patterns, of each user in the group.
In some cases, raw sensor data collected by sensor 104 may be transmitted to the user device 110, and the user device 110 may transmit the sensor data to the remote server 120 for processing. Algorithms, code, routines or similar for smoothing the raw sensor data to generate a breathing pattern may be stored in the remote server 120 (e.g. in storage 122) and run by processor 123. The remote server 120 may also use the generated breathing pattern to determine one or more breathing characteristics, and the algorithms or techniques to determine the breathing characteristics may also be stored on the remote server 120. The results of the analysis (e.g. the breathing pattern and/or the breathing characteristics) may be transmitted by the remote server 120 back to the user device 110 for display via a user interface 114 (e.g. via an app on a display screen).
The remote server 120 may use additional input data 130 to generate the breathing pattern and determine the at least one breathing characteristic. For example, the additional input data may comprise one or more of: geographical location of the user, altitude data, weather data, humidity data, air quality index, pollution data, pollen data, and oxygen level. The additional input data 130 may be received or pulled in from public sources or websites, such as openweathermap.org.
The remote server 120 may determine a baseline humidity using the humidity data and may use the baseline humidity to generate the breathing pattern. Knowing the humidity of the environment in which the user is located may enable the breathing pattern to be generated more accurately. The geographical location or altitude data may enable the user's breathing pattern to be analysed in the context of the air pressure in their environment. For example, if a user who normally lives close to sea level is wearing the apparatus in the mountains or at a higher altitude where the air pressure is lower, the user's breathing pattern may indicate that they are breathing at a higher rate. However, by knowing that the user is at a higher altitude, the change in their breathing rate may be considered acceptable and not a cause for concern. In contrast, if the user's breathing rate increased while they were at home/close to sea level, then the change may be considered a cause for concern.
The accelerometer data collected by accelerometer 133 in the apparatus 102 may contribute to the generation of a breathing pattern and/or the determination of the at least one breathing characteristic. For example, the accelerometer data may be mapped to or matched to the generated breathing pattern by the remote server 120, which may enable analysis of the user's health or fitness while sedentary, walking and/or exercising to be determined. This may enable information to be provided to the user on how their exercise regime could be changed to improve their health, fitness or performance. Thus, the remote server 120 may use data from the accelerometer 133 to generate the breathing pattern and determine the at least one breathing characteristic.
Figure 2 shows a flowchart of example steps performed by the at least one remote processor of the health monitoring system 100. The method performed by the at least one remote processor may comprise receiving sensor data from an apparatus 102, the sensor data being the sensed breathing of a user wearing the apparatus 102 (step S100). At step S102, the remote processor(s) may smooth the sensor data to generate a breathing pattern.
Optionally, at step S104, the remote processor may determine at least one breathing characteristic from the breathing pattern.
Optionally, the remote processor may use the at least one breathing characteristic to determine an indication of user health (step S106). The indication of the health of the user may comprise information on the user's fitness. The indication of the health of the user may comprise information on whether the user's fitness has improved since the apparatus was last worn, or over a predetermined time (e.g. over the last 3 months). This may help the user to determine whether a new exercise regime is effective, for example. The fitness information could be used by a personal trainer to devise or modify an exercise regime for the user. The indication of the health of the user may comprise information on their lung capacity or lung health, and/or whether their lung capacity/health has improved since the apparatus was last worn, or over a predetermined time (e.g. over the last 3 months). This information may help the user or a doctor/physician to determine if the user's lung health is improving or deteriorating following a respiratory illness, disease or surgery, or following the user quitting smoking (or switching from cigarettes to e-cigarettes). The indication of the health of the user may comprise information on whether the user's breathing has changed suddenly or unexpectedly (e.g. increase or decrease in breathing rate, or an increase or decrease in inhalation/exhalation depth or volume - e.g. deeper or shallower breaths). This may be useful in a hospital, as it may enable changes in the user's breathing to be identified and actioned early.
Thus, the remote processor may transmit data on the user's fitness (step S108). Alternatively, the remote processor may transmit a message to a hospital device 124 warning of the deteriorating health or condition of the user (step S110) if the user's breathing has changed suddenly.
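By way of illustration only, the decision between steps S108 and S110 might be sketched as follows; the comparison against a per-user baseline breathing rate and the relative margin used are assumptions for illustration:

```python
def breathing_alert(recent_bpm, baseline_bpm, rel_change=0.3):
    """Hypothetical sketch: if the user's recent breathing rate deviates from
    their baseline by more than an assumed relative margin, flag the sudden
    change for a warning message; otherwise report routine fitness data."""
    avg = sum(recent_bpm) / len(recent_bpm)
    if abs(avg - baseline_bpm) > rel_change * baseline_bpm:
        return 'warn'    # cf. step S110: warn of deteriorating condition
    return 'report'      # cf. step S108: transmit routine fitness data
```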
Figure 3A shows example sensor data 300 sensed by the health monitoring system of Figure 1 after peak detection has been performed. The sensor data may be conductivity changes over time, but this may depend on the type of sensor 104 in the apparatus 102. The remote server may generate a breathing pattern by: identifying a plurality of inflection points (i.e. turning points on a continuous plane curve) in the sensor data, and classifying each inflection point as a local maximum 302 or a local minimum 304. This process may be called "peak detection". All of the inflection points in the data are detected/identified by determining if a data point N is greater or lesser in value than the data point that is immediately before data point N (i.e. N-1) and the data point that is immediately after data point N (i.e. N+1). In the case where data point N is greater than both data points N-1 and N+1, the data point N is identified as an inflection point and is classified as a local maximum (or peak). In the case where data point N is lesser than both data points N-1 and N+1, the data point N is identified as an inflection point and is classified as a local minimum (or trough). Each inflection point 302, 304 that has been identified and classified may be saved in storage 122. The remote server 120 may determine whether each identified inflection point 302, 304 is indicative of a breathing pattern or of noise. If an inflection point is indicative of noise, it needs to be removed or ignored in order to generate a breathing pattern that accurately reflects the user's breathing. For example, if a consecutive peak and trough have a low amplitude, they may represent noise rather than breathing data and therefore need to be ignored when generating a breathing pattern from the sensor data. In other words, peaks and troughs that have low prominence may be removed. This process may be called "peak prominence detection".
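By way of illustration only, the peak detection described above may be sketched as follows (a hypothetical Python implementation; the function name and data layout are assumptions, not part of the disclosure):

```python
def classify_inflection_points(samples):
    """Hypothetical sketch: label each interior data point N as a local
    maximum (peak) or local minimum (trough) by comparing it with its
    immediate neighbours N-1 and N+1, as described above."""
    peaks, troughs = [], []
    for n in range(1, len(samples) - 1):
        if samples[n] > samples[n - 1] and samples[n] > samples[n + 1]:
            peaks.append(n)      # local maximum
        elif samples[n] < samples[n - 1] and samples[n] < samples[n + 1]:
            troughs.append(n)    # local minimum
    return peaks, troughs
```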
Figure 3B shows the sensor data of Figure 3A after peak prominence detection has been performed. Thus, the remote server 120 may determine whether a distance 312 between two adjacent inflection points, where one of the two inflection points is classified as a local maximum and another of the two inflection points is classified as a local minimum (i.e. the distance between a peak and a trough), is above a threshold distance. If the distance 312 is below the threshold distance, the remote server may remove both of the two adjacent inflection points. The threshold distance may be any suitable value which indicates that the amplitude or distance between a peak and trough is not representative of a breath. For example, in cases where analogue sensor values (e.g. voltages) have been converted into digital values, the threshold distance may be 1000. More specifically, if a sensor's values range from 0V to 3.3V, the digital values may range from 0 to 65535 if a 16-bit ADC has been used for the conversion, and the threshold distance may be 1000. In other words, the distance between a successive peak and trough must be more than 1000. This may be a reasonable threshold distance, since in a normal breathing pattern successive peaks and troughs are usually separated by 10,000, and even if a user takes shallow breaths, the distance is more than 1000. Thus, after performing peak prominence detection on sensor data 300, the peaks and troughs in region 314 of the sensor data have been determined to not be representative of breathing. As shown in Figure 3B, the peaks and troughs in region 314 are no longer marked/tagged as inflection points, and so will not be used to generate a breathing pattern.
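By way of illustration only, the peak prominence detection described above may be sketched as follows; the default threshold of 1000 follows the digitised-value example above, and the (index, kind) representation of inflection points is an assumption:

```python
def prune_low_prominence(points, values, threshold=1000):
    """Hypothetical sketch: `points` is a time-ordered list of (index, kind)
    tuples, with kind 'peak' or 'trough'. Any adjacent peak/trough pair whose
    amplitude difference is below the threshold distance is treated as noise,
    and both inflection points are removed."""
    kept = list(points)
    changed = True
    while changed:
        changed = False
        for i in range(len(kept) - 1):
            a, b = kept[i], kept[i + 1]
            if a[1] != b[1] and abs(values[a[0]] - values[b[0]]) < threshold:
                del kept[i:i + 2]  # drop the whole low-amplitude pair
                changed = True
                break
    return kept
```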
It will be understood that this is an example, non-limiting threshold distance, and that the threshold distance may vary depending on, for example, the sensor and external/environmental conditions. Thus, more generally, the threshold distance value may be calculated based on an initial calibration of the sensor (to account for variability between sensors), and then modified/adjusted based on sensor data to account for environmental changes. That is, the threshold distance may be based on a calibration performed for each sensor, such that the threshold distance may vary for individual sensors.
The remote server may also check whether two breaths are too close together in time to be representative of a real breathing pattern. For example, if two peaks or two troughs are close together, they may not represent a real breathing pattern, as the short time between the successive peaks/troughs means there is not enough time for inhalation or exhalation. Thus, the remote server may determine whether a time between two successive inflection points each classified as a local maximum (i.e. two adjacent peaks), or between two successive inflection points each classified as a local minimum (i.e. two adjacent troughs), is less than a predetermined time. When the time is less than the predetermined time, one of the two inflection points may be removed by the remote server so that it is not used to generate the breathing pattern. This process may be called "peak separation analysis" or "peak distance analysis". Figure 3C shows the sensor data of Figure 3B after peak separation analysis has been performed. After performing the peak separation analysis on the data in Figure 3B, the two adjacent troughs or local minima 304a, 304b are considered to be too close together and not representative of a real breathing pattern. Consequently, as shown in Figure 3C, point 304b is no longer marked/tagged as an inflection point, and so will not be used to generate a breathing pattern.
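Peak separation analysis may be sketched as follows (a non-limiting Python illustration; the `(time_s, value, kind)` tuple representation and the example minimum interval of 0.6 seconds are assumptions for illustration):

```python
# Peak separation analysis: if two successive peaks (or two successive
# troughs) occur closer together than min_interval seconds, the later
# one is dropped, since a real inhalation/exhalation cannot complete in
# so short a time. 'extrema' holds (time_s, value, kind) tuples in
# chronological order, with kind being "max" or "min".
def peak_separation_filter(extrema, min_interval=0.6):
    kept = []
    for point in extrema:
        t, _, kind = point
        # find previously kept points of the same kind (peak or trough)
        same = [p for p in kept if p[2] == kind]
        if same and t - same[-1][0] < min_interval:
            continue  # too close to its predecessor: discard it
        kept.append(point)
    return kept
```

Which of the two offending points is removed (the earlier or the later) is a design choice; this sketch keeps the earlier one.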
As mentioned above, it is important to choose an appropriate predetermined time so as not to lose real breathing pattern data. For example, if a user is breathing rapidly or hyperventilating, then the user's breaths may be naturally close together. The predetermined time may be 0.6 seconds, which would equate to 100 breaths per minute, or 0.7 seconds, which would equate to 86 breaths per minute. Such high breathing rates never occur in humans, and therefore such a predetermined time may be sufficient to remove peaks that are not representative of a real breathing pattern while also capturing rapid breathing/hyperventilation.

A breathing pattern should be an alternating sequence of peaks and troughs, i.e. an alternating sequence of inhalation and exhalation. Thus, if two adjacent inflection points are both classified as a local maximum or as a local minimum, such that there are two peaks next to each other without a trough in-between, or two troughs without a peak in-between, then the sensor data does not represent an alternating sequence of inhalation and exhalation. Accordingly, the remote server may generate a breathing pattern by identifying consecutive inflection points that are both classified as a local maximum or as a local minimum, and removing one of the two consecutive inflection points. For example, if the two consecutive inflection points are both classified as local maxima (peaks), the higher peak may be kept and the lower peak removed, whereas if the two consecutive inflection points are both classified as local minima (troughs), the larger trough may be kept and the smaller trough removed. Figure 3D shows an example generated breathing pattern 350 after this process has been performed. In Figure 3C, two adjacent inflection points that were classified as local maxima 302a, 302b do not have a local minimum between them. Accordingly, as shown in Figure 3D, inflection point 302b is no longer marked/tagged as an inflection point.
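The alternating-sequence check described above may be sketched as follows (a non-limiting Python illustration using the same `(index, value, kind)` tuple representation; interpreting the "larger" trough as the deeper, i.e. lower-valued, one is an assumption):

```python
# Enforce an alternating peak/trough sequence: where two consecutive
# extrema share the same classification, keep the higher of two peaks
# (or the deeper of two troughs) and remove the other.
def enforce_alternation(extrema):
    kept = []
    for point in extrema:
        if kept and kept[-1][2] == point[2]:
            prev = kept[-1]
            if point[2] == "max":
                # two peaks in a row: keep the higher one
                kept[-1] = max(prev, point, key=lambda p: p[1])
            else:
                # two troughs in a row: keep the deeper (lower) one
                kept[-1] = min(prev, point, key=lambda p: p[1])
        else:
            kept.append(point)
    return kept
```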
The resulting data 350 is the generated breathing pattern which shows a series of peaks and troughs that are representative of breathing (i.e. noise has been removed).
As mentioned above, once a breathing pattern has been generated by smoothing the original sensor data, one or more breathing characteristics may be derived or determined. A number of breathing characteristics may be determined, such as, for example:
• Inhalation Rate
• Inhalation Speed (e.g. the amplitude of a peak minus the amplitude of a neighbouring/consecutive trough, divided by inhalation time)
• Inhalation Time (e.g. the time between a consecutive peak and a trough in the breathing pattern)
• Exhalation Rate
• Exhalation Speed
• Exhalation Time
• Breaths per minute
• Highest Breathing Rate
• Lowest Breathing Rate
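Some of the characteristics listed above follow directly from the generated breathing pattern. For instance, inhalation time and inhalation speed may be computed as sketched below (a non-limiting Python illustration; taking inhalation as the trough-to-peak segment is an assumption, and the mapping may be reversed depending on how the sensor responds to breath):

```python
# Illustrative computation of inhalation time and inhalation speed from
# an alternating list of (time_s, value, kind) extrema ("max"/"min").
# Inhalation Time: time between a consecutive trough and peak.
# Inhalation Speed: (peak amplitude - trough amplitude) / inhalation time.
def inhalation_metrics(extrema):
    metrics = []
    for (t1, v1, k1), (t2, v2, k2) in zip(extrema, extrema[1:]):
        if k1 == "min" and k2 == "max":  # trough followed by peak
            inhale_time = t2 - t1
            inhale_speed = (v2 - v1) / inhale_time
            metrics.append((inhale_time, inhale_speed))
    return metrics
```

Exhalation time and speed follow symmetrically from the peak-to-trough segments.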
Other characteristics relating to either the user activity or the sensor itself may be determined from the original sensor data and/or the breathing pattern, and/or other data may be collected from the apparatus 102 or user device 110, such as, for example:
• Average breathing rate with respect to:
User activity level (via, for example, accelerometer 133)
Resistance training levels of a resistance training mask worn by the user (where the level may be obtained directly from the apparatus or via a user input on an app on the user device 110). For example, it may be determined that a user takes more breaths when using a training mask at one resistance level compared to when they use the training mask set at another resistance level.
• Signal depth - which may be indicative of shallow breathing, deep breaths, etc. One way to determine shallow or deep breaths based on signal depths is as follows: Since signal depth is sensor- and environment-dependent, after a period of sensor stabilisation (e.g. a minute), an average of the previous N breaths (e.g. 20 breaths) is taken. Each new breath is compared with this average. If the new breath depth is above or below the average by a certain percentage or amount, then this new breath may be classified as a deep or a shallow breath, respectively. This value is then added to the average, while the oldest breath is removed from the average, creating a moving window of N breaths. The next breath is then analysed in the same way, and the average modified accordingly. In the case where the depth is much higher or lower than the average value, then the depth value in this case would not be included in the average to ensure the average is not biased by such outliers.
• Exertion score - e.g. how hard is the user exercising or breathing based on the generated breathing pattern (and relative to e.g. historical breathing patterns/characteristics collected during past exercise sessions)
• Hours trained - e.g. hours of exercise while wearing a resistance training mask, based on the sensor data sensing breathing
• Sensor lifetime, which may be determined by:
Total number of breaths taken/ recorded while the apparatus 102 is used. This may also correlate with having to change filters in masks used for pollution protection and air filtration to maintain optimal air filtration.
External factors
Inactive time periods
• Inhalation/exhalation intervals
• Breathing power, which may be measured by the change in inhalation velocity over time
• Gradient of breathing pattern
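The moving-window shallow/deep breath classification described above (under "Signal depth") may be sketched as follows. This is a non-limiting Python illustration: the 20-breath window matches the example in the text, while the 25% band and the outlier cut-off are assumed values standing in for the unspecified "certain percentage or amount".

```python
from collections import deque

# Moving-window classification of breaths as shallow, deep, or normal:
# each new breath depth is compared with the average of the previous N
# breaths; extreme outliers are excluded from the moving average so
# they do not bias it.
def classify_depths(depths, window=20, band=0.25, outlier=1.0):
    recent = deque(depths[:window], maxlen=window)
    labels = []
    for d in depths[window:]:
        avg = sum(recent) / len(recent)
        if d > avg * (1 + band):
            labels.append("deep")
        elif d < avg * (1 - band):
            labels.append("shallow")
        else:
            labels.append("normal")
        # only fold the breath into the moving average if not an outlier
        if abs(d - avg) <= avg * outlier:
            recent.append(d)  # deque drops the oldest breath
    return labels
```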
Exertion Score
The Exertion Score (ES) may be used to measure the intensity of a user's exercise activity. There are a number of techniques that may be used for calculating or determining an Exertion Score for a user. Some techniques may enable a real-time or live Exertion Score to be determined using real-time data, while other techniques may enable an Exertion Score to be determined after all the data from a particular time period (e.g. from a particular training or exercise session) has been recorded.
The Exertion Score may be a numerical score or value ranging from 0 to 10. The table below provides example scores and their meaning:
[Table 1 is provided as an image in the original document and sets out the example exertion scores and their meanings.]
Table 1
It will be understood that this is merely an illustrative, non-limiting example, and that other ways of assigning a score to the user's exercise intensity may be used.
Four example techniques for calculating the exertion score are provided in the table below. Once again, it will be understood that these are merely provided as examples for illustrative purposes, and are non-limiting.
[Table 2 is provided as an image in the original document and sets out the four example techniques, their equations, and their fitted constants.]
Table 2
The first example technique in Table 2 above for generating an exertion score can be performed in real-time, i.e. synchronously with the sensor data collection. The first example technique determines the exertion score (ES) by multiplying the breaths per minute (BPM), as determined from the sensor data, with a rate of increase of the signal from the sensor when a user exhales on the sensor (OutSpeed). This is multiplied by a constant (Scaling) to generate an exertion score that is in a range between, for example, 0 to 100 or 0 to 10 (as in Table 1 above).
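The first technique is simply a scaled product, as sketched below (a non-limiting Python illustration; the value of the scaling constant is hypothetical, since the real constant is chosen to calibrate the score into the desired range):

```python
# Technique 1 (synchronous): exertion score as the product of breaths
# per minute (BPM), the rate of increase of the sensor signal on
# exhalation (OutSpeed), and a calibration constant (Scaling).
SCALING = 0.005  # hypothetical calibration constant for illustration

def exertion_score_live(bpm, out_speed, scaling=SCALING):
    return bpm * out_speed * scaling
```

For example, with this hypothetical constant, 40 breaths per minute at an exhalation rise rate of 50 units per second would yield a score of 10.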
The second example technique in Table 2 above for generating an exertion score can be performed both synchronously and asynchronously. The second example is an extension of the first example technique in the table above. Specifically, the second technique determines the exertion score (ES) by applying a function f(BPM) to the breaths per minute (BPM), as determined from the sensor data, to generate an exertion score that is in a range between 0 to 10. Figures 4A and 4B show, respectively, data used to determine a technique for generating an exertion score from sensor data in real-time (synchronously) or asynchronously. Specifically, Figure 4A shows breaths per minute against exertion score; f(BPM) scales the BPM into the range 0 to 10. For example, a breaths per minute value of 20 corresponds to an exertion score of 1 (little exertion), while a breaths per minute value of 50 corresponds to an exertion score of 8. From this, various models were trialled to determine which model best fit the data in the graph. A Gaussian model was chosen, and thus the constants in Table 2 above (for technique 2) were obtained by fitting the Gaussian model to the data. The function is termed f(BPM).
A similar method was used to generate an equation modelling the asynchronous (offline) data. A difference relative to the synchronous data modelling is that here, the average exertion score of the whole activity can be used to calculate f(BPM), and further scaling may be applied depending on the overall time spent in the activity using f(duration). However, in Figure 4B, the x-axis relates to duration (minutes) and the y-axis relates to a scaling element, because the offline analysis is based on the total duration of the exercise or activity. This function is termed f(duration). The duration of the exercise/activity applies a scaling to the previously calculated exertion score using f(BPM) with the average BPM for the whole recording, with a final equation providing a duration-scaled exertion score (given by f(BPM_average) * f(duration)). For example, if a user has completed a workout and their overall average BPM is 45, using f(BPM_average), as shown in Figure 4A, this would correspond to an exertion score of 6. If the total duration of this workout was 60 minutes, then a scaling would be applied using f(duration = 60), which is 1.2, as shown in Figure 4B, and therefore their exertion score would be 6*1.2 = 7.2. Shorter workouts will have the exertion score scaled down by f(duration), since a short workout at a given intensity is easier than a longer workout at the same intensity. The graph in Figure 4B was modelled as a third-degree polynomial, and the constants shown in Table 2 above were obtained by fitting the third-degree polynomial to the data.
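The structure of this duration-scaled calculation may be sketched as follows. Since the fitted Gaussian and polynomial constants appear only in the Table 2 image, this non-limiting Python sketch instead interpolates between the worked data points quoted in the text (BPM 20 → score 1, BPM 45 → 6, BPM 50 → 8; f(duration = 60) = 1.2); the remaining anchor values are invented purely for illustration.

```python
# Technique 2 (asynchronous) sketch: ES = f(BPM_average) * f(duration).
# Anchor tables approximate Figures 4A and 4B; only the points quoted
# in the text are grounded, the rest are hypothetical.
BPM_ANCHORS = [(20, 1.0), (45, 6.0), (50, 8.0)]
DUR_ANCHORS = [(15, 0.6), (30, 0.9), (60, 1.2)]  # 15/30-min values assumed

def interp(x, anchors):
    # piecewise-linear interpolation, clamped at both ends
    if x <= anchors[0][0]:
        return anchors[0][1]
    for (x0, y0), (x1, y1) in zip(anchors, anchors[1:]):
        if x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    return anchors[-1][1]

def exertion_score_offline(bpm_average, duration_min):
    return interp(bpm_average, BPM_ANCHORS) * interp(duration_min, DUR_ANCHORS)
```

With these anchors, the worked example from the text is reproduced: an average BPM of 45 over a 60-minute workout gives 6 * 1.2 = 7.2.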
The third technique in Table 2 above generates an exertion score by normalising breaths per minute data obtained throughout a workout using the highest BPM value. The normalised value is then inputted into the equation shown in the table above to calculate the exertion score. This process is described by Nicolo et al in "Respiratory Frequency during Exercise: The Neglected Physiological Measure" (Frontiers in Physiology, December 2017, Vol. 8, Article 922).
The fourth technique in Table 2 above can be used to generate an exertion score from sensor data asynchronously. Figure 4C shows data relating to this technique. Specifically, the top graph shows a distribution plot of how much time a user spends at each BPM. A reason why high BPMs of 70 and 80 appear on the distribution plot is that a window of 10 seconds is being used, and a person can easily breathe at what would be 70 BPM for 10 seconds, but maintaining this rate for 60 seconds or more would be much harder. This means the scaling function, which is a function of duration at each BPM, is also a function of the window size for which BPMs have been calculated. The scaling coefficients may be calculated manually, or preferably, may be calculated using a machine learning model which has been trained on breathing data from many people who also provide their perceived exertion score.
The BPM distribution is then scaled, where higher BPMs are scaled up and lower BPMs are scaled down (as described above). The distribution profile is replotted, as shown in Figure 4C (bottom graphs). The scaling function may be called f(BPM_duration, windowSize). The area under the scaled distribution profile is then calculated as the exertion score. This area is determined by calculating the sum(f(BPM_duration, windowSize)*BPM_duration). For intense workouts, the area may have values reaching 90-100 (which is then scaled by a further 1/10 to be in the range of 0 to 10 as per Table 1 above), while for non-workouts the exertion score may be below 0 due to the negative scaling.
Negative scaling means that a user's resting periods during a workout, for example at around 10 to 15 BPM for some people, would be subtracted from the total exertion score. This means that if someone rests too long during exercise, the exertion score will be lowered. If the overall exertion score is negative, it may be capped at zero. This may occur if an activity was recorded while doing a simple walk or sitting at a desk.
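The fourth technique may be sketched as follows (a non-limiting Python illustration; the piecewise weights are invented stand-ins for the fitted scaling coefficients, which in practice may come from a machine learning model trained on perceived-exertion labels):

```python
# Technique 4 (asynchronous) sketch: scale a distribution of time spent
# at each BPM (resting rates negatively weighted, high rates up-
# weighted), then take the area under the scaled distribution as the
# exertion score, capped at zero.
def scaling_weight(bpm):
    if bpm <= 15:
        return -0.2  # resting: negative scaling
    if bpm <= 30:
        return 0.1
    return 0.5       # hard breathing: scaled up

def exertion_score_distribution(bpm_durations):
    # bpm_durations maps BPM -> minutes spent at that BPM
    raw = sum(scaling_weight(b) * t for b, t in bpm_durations.items())
    return max(0.0, raw)  # negative overall scores are capped at zero
```

With these illustrative weights, long resting periods pull the score down, and an activity consisting only of rest (e.g. sitting at a desk) caps at zero.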
Figure 5 shows a flowchart of example steps to determine an exertion score. The process may begin by receiving breathing characteristic data after an exercise session has been completed (step S500). A distribution profile may be calculated using the breathing characteristic data (step S502). An algorithm or technique, such as one described above, may be used to determine a scaling factor (step S504). The scaling factor may be used to scale the distribution profile, as described above and the scaled distribution profile may be used to determine an exertion score (S506). The exertion score may be transmitted to the user or to a clinician or personal trainer, for example.
Those skilled in the art will appreciate that while the foregoing has described what is considered to be the best mode and where appropriate other modes of performing present techniques, the present techniques should not be limited to the specific configurations and methods disclosed in this description of the preferred embodiment. Those skilled in the art will recognise that present techniques have a broad range of applications, and that the embodiments may take a wide range of modifications without departing from any inventive concept as defined in the appended claims.

Claims

1. A health monitoring system comprising: an apparatus comprising: a sensor for sensing breathing of a user using the apparatus, and a communication module for transmitting sensor data; and at least one remote processor for: receiving sensor data from the apparatus; and smoothing the received sensor data to generate a breathing pattern.
2. The health monitoring system as claimed in claim 1 wherein the at least one remote processor smooths the received sensor data by: identifying a plurality of inflection points in the sensor data; classifying each inflection point as a local maximum or a local minimum; determining whether a distance between two adjacent inflection points, where one of the two inflection points is classified as a local maximum and another of the two inflection points is classified as a local minimum, is above a threshold distance; and removing both of the two adjacent inflection points where the distance is below the threshold distance.
3. The health monitoring system as claimed in claim 2 wherein the at least one remote processor smooths the received sensor data by: determining whether a time between two successive inflection points classified as a local maximum, or between two successive inflection points each classified as a local minimum, is less than a predetermined time; and removing, when the time is less than a predetermined time, one of the two inflection points.
4. The health monitoring system as claimed in claim 2 or 3 wherein the at least one remote processor smooths the received sensor data by: identifying consecutive inflection points that are both classified as a local maximum or as a local minimum; and removing one of the two consecutive inflection points.
5. The health monitoring system as claimed in any preceding claim wherein the at least one remote processor is further configured to determine, from the breathing pattern, at least one breathing characteristic.
6. The health monitoring system as claimed in claim 5 wherein the breathing characteristic is inhalation speed, and the at least one remote processor determines the inhalation speed from the breathing pattern by: measuring a time between an inflection point classified as a local maximum and a subsequent inflection point classified as a local minimum; and dividing a distance between the inflection point classified as a local maximum and the subsequent inflection point classified as a local minimum by the measured time.
7. The health monitoring system as claimed in claim 5 or 6 wherein the breathing characteristic is exhalation speed, and the at least one remote processor determines the exhalation speed from the breathing pattern by: measuring a time between an inflection point classified as a local minimum and a subsequent inflection point classified as a local maximum; and dividing a distance between the inflection point classified as a local minimum and the subsequent inflection point classified as a local maximum by the measured time.
8. The health monitoring system as claimed in claim 7 wherein the breathing characteristic is a ratio of inhalation to exhalation, and the at least one remote processor determines the ratio from the breathing pattern by: dividing the inhalation speed by the exhalation speed; or dividing the inhalation time by the exhalation time.
9. The health monitoring system as claimed in any one of claims 5 to 8 wherein the breathing characteristic is a breathing rate, and the at least one remote processor determines the breathing rate from the breathing pattern by: determining the number of inflection points classified as a local maximum in a minute.
10. The health monitoring system as claimed in claim 9 wherein determining the number of inflection points classified as a local maximum in a minute comprises: determining, when the breathing pattern is longer than a minute, an average number of inflection points classified as a local maximum in a minute.
11. The health monitoring system as claimed in claim 9 wherein determining the number of inflection points classified as a local maximum in a minute comprises: extrapolating, when the breathing pattern is less than a minute, the breathing rate based on the number of inflection points in a duration of the breathing pattern.
12. The health monitoring system as claimed in any one of claims 5 to 11 wherein the breathing characteristic is inhalation depth, and the at least one remote processor determines the inhalation depth from the breathing pattern by: averaging a distance between an inflection point classified as a local maximum and a subsequent inflection point classified as a local minimum.
13. The health monitoring system as claimed in any one of claims 5 to 12 wherein the breathing characteristic is exhalation depth, and the at least one remote processor determines the exhalation depth from the breathing pattern by: averaging a distance between an inflection point classified as a local minimum and a subsequent inflection point classified as a local maximum.
14. The health monitoring system as claimed in any preceding claim further comprising at least one user interface for displaying information on one or more of: a total number of hours the user has worn the apparatus, an exertion score, an indication of the user's lung function, and information on whether the sensor needs to be replaced.
15. The health monitoring system as claimed in any preceding claim wherein the at least one remote processor is arranged to: compare sensor data received from the apparatus over a predetermined time period; and determine whether the accuracy of the sensor has changed over the predetermined time period.
16. The health monitoring system as claimed in any preceding claim wherein the apparatus further comprises: an accelerometer to sense movement of the user while the user is using the apparatus.
17. The health monitoring system as claimed in claim 16 wherein the at least one remote processor uses data from the accelerometer to smooth the sensor data to generate the breathing pattern.
18. The health monitoring system as claimed in any preceding claim wherein the at least one remote processor uses additional input data to smooth the sensor data to generate the breathing pattern, where the additional input data comprises one or more of: geographical location of the user, altitude data, weather data, humidity data, air quality index, pollution data, pollen data, and oxygen level.
19. The health monitoring system as claimed in claim 18 wherein the at least one remote processor uses the humidity data as a baseline humidity measure when smoothing the sensor data to generate the breathing pattern.
20. The health monitoring system as claimed in any preceding claim wherein the sensor is any one of: a thermistor, a humidity sensor, a gas sensor, a pressure sensor, a microphone, a sound sensor or detector, and a sensor comprising a porous material.
21. The health monitoring system as claimed in any preceding claim wherein the apparatus is any one of: a wearable apparatus, a resistive sports mask, an oxygen deprivation mask, an apparatus worn over a user's mouth and/or nose, a medical breath monitoring apparatus, a face mask, a disposable face mask, a personal protection equipment face mask, a surgical mask, an oxygen mask, an inhaler, an asthma inhaler, an e-cigarette, a heat moisture exchanger, and a nasal cannula.
22. The health monitoring system as claimed in any of claims 1 to 21 wherein the sensor and communication module are removably attached to the apparatus.
23. The health monitoring system as claimed in any of claims 1 to 21 wherein the sensor and communication module are irremovably attached to the apparatus.
24. The health monitoring system as claimed in any one of claims 5 to 23 wherein the remote processor: determines an indication of the health of the user from the at least one breathing characteristic; and wherein the system further comprises: at least one user interface for displaying the indication of the health of the user.
25. The health monitoring system as claimed in any one of claims 5 to 23 wherein the remote processor: determines an indication of the health of the user from the at least one breathing characteristic; and transmits the indication of the health of the user to any one or more of: a user device, a third party device, and a third party server.
26. The health monitoring system as claimed in any preceding claim, wherein the remote processor is configured to: transmit the received sensor data to a third party device or third party server.
27. The health monitoring system as claimed in any preceding claim, wherein the remote processor is configured to: transmit the generated breathing pattern to a third party device or third party server.
28. The health monitoring system as claimed in any of claims 25 to 27, wherein the transmission is in real-time.
29. The health monitoring system as claimed in any preceding claim wherein the remote processor is configured to: receive sensor data from a third party server for analysis; smooth the received sensor data from the third party server to generate a breathing pattern; and transmit the generated breathing pattern to the third party server.
30. The health monitoring system as claimed in any preceding claim wherein the at least one remote processor performs the smoothing of the sensor data in real-time or in near real-time.
31. A method for health monitoring, comprising: receiving sensor data from an apparatus comprising a sensor for sensing breathing of a user using the apparatus; and smoothing the received sensor data to generate a breathing pattern.
32. The method as claimed in claim 31 wherein smoothing the received sensor data comprises: identifying a plurality of inflection points in the sensor data; classifying each inflection point as a local maximum or a local minimum; determining whether a distance between two adjacent inflection points, where one of the two inflection points is classified as a local maximum and another of the two inflection points is classified as a local minimum, is above a threshold distance; and removing both of the two adjacent inflection points where the distance is below the threshold distance.
33. The method as claimed in claim 32 wherein smoothing the received sensor data comprises: determining whether a time between two successive inflection points classified as a local maximum, or between two successive inflection points each classified as a local minimum, is less than a predetermined time; and removing, when the time is less than a predetermined time, one of the two inflection points.
34. The method as claimed in claim 32 or 33 wherein smoothing the received sensor data comprises: identifying consecutive inflection points that are both classified as a local maximum or as a local minimum; and removing one of the two consecutive inflection points.
35. The method as claimed in any of claims 31 to 34 further comprising determining, from the breathing pattern, at least one breathing characteristic.
36. The method as claimed in claim 35 wherein the breathing characteristic is inhalation speed, and the method comprises determining the inhalation speed from the breathing pattern by: measuring a time between an inflection point classified as a local maximum and a subsequent inflection point classified as a local minimum; and dividing a distance between the inflection point classified as a local maximum and the subsequent inflection point classified as a local minimum by the measured time.
37. The method as claimed in claim 35 or 36 wherein the breathing characteristic is exhalation speed, and the method comprises determining the exhalation speed from the breathing pattern by: measuring a time between an inflection point classified as a local minimum and a subsequent inflection point classified as a local maximum; and dividing a distance between the inflection point classified as a local minimum and the subsequent inflection point classified as a local maximum by the measured time.
38. The method as claimed in claim 37 wherein the breathing characteristic is a ratio of inhalation to exhalation, and the method comprises determining the ratio from the breathing pattern by: dividing the inhalation speed by the exhalation speed.
39. The method as claimed in any one of claims 35 to 38 wherein the breathing characteristic is a breathing rate, and the method comprises determining the breathing rate from the breathing pattern by: determining the number of inflection points classified as a local maximum in a minute.
40. The method as claimed in claim 39 wherein determining the number of inflection points classified as a local maximum in a minute comprises: determining, when the breathing pattern is longer than a minute, an average number of inflection points classified as a local maximum in a minute.
41. The method as claimed in claim 39 wherein determining the number of inflection points classified as a local maximum in a minute comprises: extrapolating, when the breathing pattern is less than a minute, the breathing rate based on the number of inflection points in a duration of the breathing pattern.
42. The method as claimed in any one of claims 35 to 41 wherein the breathing characteristic is inhalation depth, and the method comprises determining the inhalation depth from the breathing pattern by: averaging a distance between an inflection point classified as a local maximum and a subsequent inflection point classified as a local minimum.
43. The method as claimed in any of claims 35 to 42 wherein the breathing characteristic is exhalation depth, and the method comprises determining the exhalation depth from the breathing pattern by: averaging a distance between an inflection point classified as a local minimum and a subsequent inflection point classified as a local maximum.
44. The method as claimed in any of claims 31 to 43 further comprising: comparing sensor data received from the apparatus over a predetermined time period; and determining whether the accuracy of the sensor has changed over the predetermined time period.
45. The method as claimed in any of claims 31 to 44 further comprising: receiving accelerometer data corresponding to movement of a user wearing the apparatus.
46. The method as claimed in claim 45 further comprising using the accelerometer data to smooth the sensor data to generate the breathing pattern.
47. The method as claimed in any of claims 31 to 46 further comprising using additional input data to smooth the sensor data to generate the breathing pattern, where the additional input data comprises one or more of: geographical location of the user, altitude data, weather data, humidity data, air quality index, pollution data, pollen data, and oxygen level.
48. The method as claimed in claim 47 further comprising using the humidity data as a baseline humidity measure when smoothing the sensor data to generate the breathing pattern.
49. The method as claimed in any one of claims 35 to 48 further comprising: determining an indication of the health of the user from the at least one breathing characteristic.
50. The method as claimed in claim 49 further comprising: transmitting the indication of the health of the user to any one or more of: a user device, a third party device, and a third party server.
51. The method as claimed in any of claims 31 to 50, further comprising: transmitting the received sensor data to a third party device or third party server.
52. The method as claimed in any of claims 31 to 51 further comprising: transmitting the generated breathing pattern to a third party device or third party server.
53. The method as claimed in any of claims 50 to 52 wherein the transmitting is in real-time.
54. The method as claimed in any of claims 31 to 53 wherein receiving sensor data from an apparatus comprises receiving the sensor data via a third party server.
55. The method as claimed in any of claims 31 to 54 wherein the smoothing of the sensor data is performed in real-time or in near real-time.
56. The method as claimed in any of claims 31 to 55 further comprising: providing, to a user device, breathing exercises for the user to follow while using the apparatus; and analysing, using the received sensor data, user performance while undertaking the breathing exercises.
57. The method as claimed in any of claims 35 to 56 further comprising determining an exertion score by: calculating, using the breathing characteristic, a distribution profile of the breathing characteristic; determining a scaling factor; scaling the distribution profile using the scaling factor to generate a scaled distribution profile; and determining the exertion score using the scaled distribution profile.
58. The method as claimed in claim 57 wherein the breathing characteristic is breathing rate.
59. The method as claimed in claim 35 wherein the breathing characteristic is a breath depth, and the method comprises determining the breath depth from the breathing pattern by: determining an average breath depth for a predetermined number of breaths; comparing a depth of a subsequent breath to the average breath depth; and classifying the subsequent breath as a shallow breath when the depth of the subsequent breath is below the average breath depth by a threshold value, or as a deep breath when the depth of the subsequent breath is above the average breath depth by a threshold value.
60. The method as claimed in any of claims 31 to 59 further comprising providing the user, using a total number of breaths recorded using the apparatus, with an indication of when to replace the sensor of the apparatus.
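The replacement indication of claim 60 amounts to comparing the running breath count against a sensor lifetime. A minimal sketch, assuming a hypothetical lifetime figure and message format (neither is specified in the claims):

```python
def replacement_indication(total_breaths, sensor_lifetime_breaths=20000):
    """Indicate when to replace the sensor based on the total number of
    breaths recorded (claim 60). The lifetime is a placeholder value.
    """
    remaining = sensor_lifetime_breaths - total_breaths
    if remaining <= 0:
        return "replace sensor now"
    return f"{remaining} breaths until replacement"
```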
61. A computer readable medium carrying processor control code which when implemented in a system causes the system to carry out the method of any of claims 31 to 60.
PCT/GB2020/052112 2019-09-05 2020-09-04 Systems and methods for analysing breathing Ceased WO2021044150A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1912815.6 2019-09-05
GB1912815.6A GB2586848A (en) 2019-09-05 2019-09-05 Systems and methods for analysing breathing

Publications (1)

Publication Number Publication Date
WO2021044150A1 true WO2021044150A1 (en) 2021-03-11

Family

ID=68241188

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2020/052112 Ceased WO2021044150A1 (en) 2019-09-05 2020-09-04 Systems and methods for analysing breathing

Country Status (2)

Country Link
GB (1) GB2586848A (en)
WO (1) WO2021044150A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112120701A (en) * 2020-09-17 2020-12-25 江苏集萃有机光电技术研究所有限公司 Breathing monitoring mask and breathing monitoring method


Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0956820A1 (en) 1996-10-04 1999-11-17 Karmel Medical Acoustic Technologies Ltd. Apnea determination
US20050241639A1 (en) * 2003-01-30 2005-11-03 Compumedics, Inc. Algorithm for automatic positive air pressure titration
WO2012007719A1 (en) 2010-07-14 2012-01-19 Imperial Innovations Limited Feature characterization for breathing monitor
US20130331723A1 (en) * 2011-02-22 2013-12-12 Miguel Hernandez-Silveira Respiration monitoring method and system
US20130079656A1 (en) 2011-09-23 2013-03-28 Nellcor Puritan Bennett Ireland Systems and methods for determining respiration information from a photoplethysmograph
WO2014128090A1 (en) 2013-02-20 2014-08-28 Pmd Device Solutions Limited A method and device for respiratory monitoring
WO2016065180A1 (en) 2014-10-22 2016-04-28 President And Fellows Of Harvard College Detecting gases and respiration by the conductivity of water within a porous substrate sensor
US10712337B2 (en) 2014-10-22 2020-07-14 President And Fellows Of Harvard College Detecting gases and respiration by the conductivity of water within a porous substrate sensor
GB2550833A (en) 2016-02-26 2017-12-06 Pneumacare Ltd Breath identification and matching
US20180317846A1 (en) * 2017-05-08 2018-11-08 Intel Corporation Respiratory biological sensing
JP2019141597A (en) 2019-03-05 2019-08-29 Pioneer Corporation Signal processing device and method, computer program, and recording medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
NICOLO ET AL.: "Respiratory Frequency during Exercise: The Neglected Physiological Measure", FRONTIERS IN PHYSIOLOGY, vol. 8, December 2017 (2017-12-01)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022090702A2 (en) 2020-10-26 2022-05-05 Spyras Ltd Apparatus for sensing and analysing breathing
US20230199348A1 (en) * 2021-12-16 2023-06-22 3M Innovative Properties Company System and computer-implemented method for providing responder information
WO2023114494A1 (en) * 2021-12-16 2023-06-22 Breezee, Inc. Device and methods for monitoring and training breathing
US11722807B2 (en) * 2021-12-16 2023-08-08 3M Innovative Properties Company System and computer-implemented method for providing responder information
US11962958B2 (en) 2021-12-16 2024-04-16 3M Innovative Properties Company System and computer-implemented method for providing responder information
CN118588224A (en) * 2024-08-06 2024-09-03 浙江微红健康科技有限公司 Health data monitoring system, data collection device and equipment based on big data

Also Published As

Publication number Publication date
GB201912815D0 (en) 2019-10-23
GB2586848A (en) 2021-03-10

Similar Documents

Publication Publication Date Title
WO2021044150A1 (en) Systems and methods for analysing breathing
JP6200430B2 (en) Method and apparatus for monitoring and controlling pressure assist devices
CN105283127B (en) Systems and methods for monitoring respiration
CA2874447A1 (en) Spirometer system and methods of data analysis
CA2847412C (en) System and methods for estimating respiratory airflow
JP2005537068A5 (en)
US20140155774A1 (en) Non-invasively determining respiration rate using pressure sensors
JP6665179B2 (en) Method and device for determining the health of a subject
CN108366756B (en) Apparatus, system and method for determining breathing characteristics of a subject based on breathing gas
US20230000388A1 (en) Oxygen mask respirometer
Basra et al. Temperature sensor based ultra low cost respiration monitoring system
EP4069077B1 (en) Systems and methods for metabolic monitoring
JP7702552B2 Method and apparatus for measuring respiration
KR101696791B1 (en) Pulmonary function test apparatus using chest impedance and thereof method
JP6315576B2 (en) Sleep breathing sound analysis apparatus and method
CN116110585A (en) Respiratory rehabilitation evaluation system for chronic obstructive pneumonia
Nesar et al. Improving touchless respiratory monitoring via lidar orientation and thermal imaging
CN107205672B (en) Apparatus and method for evaluating respiratory data of a monitored subject
JP7109534B2 (en) Spirometer Flow Sensing Arrangement
CN114431853A (en) Portable metabolic energy examination equipment
CN114391809A (en) Non-reporting type olfactory function checking equipment and checking method thereof
JP6552158B2 (en) Analysis device, analysis method, and program
CN109316189B Non-contact respiratory dynamic detection method and device
JP2022122975A (en) Biological monitoring system and its program
Jin et al. Vtmonitor: Tidal volume estimation using earbuds

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20768669

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20768669

Country of ref document: EP

Kind code of ref document: A1