WO2016051379A1 - User state classification - Google Patents
User state classification
- Publication number
- WO2016051379A1 (PCT/IB2015/057532)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- vital sign
- heart rate
- information
- sign information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
- A61B5/024—Measuring pulse rate or heart rate
- A61B5/02416—Measuring pulse rate or heart rate using photoplethysmograph signals, e.g. generated by infrared radiation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1118—Determining activity level
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2562/00—Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
- A61B2562/02—Details of sensors specially adapted for in-vivo measurements
- A61B2562/0219—Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
Definitions
- FIG. 3 schematically illustrates a block diagram of an apparatus in accordance with one embodiment of the invention
- FIG. 4 schematically illustrates an embodiment of the apparatus and a user in accordance with one embodiment of the invention.
- FIG. 5 generally illustrates a flow chart of a method of classifying the activity of a user in accordance with one embodiment of the invention.
- the present disclosure also relates to an apparatus for performing the operations herein.
- This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer.
- a computer program may be stored in a computer readable storage medium, such as, but is not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, application specific integrated circuits (ASICs), or any type of media suitable for storing electronic instructions, and each may be coupled to a computer system bus.
- the computers referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
- FIG. 1 generally shows graphical illustrations of a user's movement information 110 and the user's heart rate information 120 in accordance with one embodiment of the invention.
- the movement information may be obtained from one or more accelerometers placed on at least one portion of the user's body (e.g., the user's wrist(s)).
- the user's heart rate information is in beats per minute (bpm) and may be obtained from a heart rate sensor device such as a photoplethysmogram (PPG) sensor or an electrocardiogram (ECG) sensor.
- the graphical illustrations 110 and 120 show information regarding the user's acceleration and heart rate in substantially real time over the same time period. As indicated on both illustrations 110 and 120, the user begins a non-sedentary activity, for example, sprinting, at roughly the 525-second mark (indicated by the "activity start" label). The performance of this activity is accompanied by a large amount of acceleration and deceleration (as shown in illustration 110) and an increase in the user's heart rate (as shown in illustration 120).
- the user performs this activity for roughly 50 seconds.
- the amount of acceleration and the user's heart rate then decrease to pre-activity levels; that is, substantially no acceleration and a heart rate at levels that suggest the user is in a sedentary state.
- embodiments of the present invention can more accurately determine when the user is in a sedentary state (or a non-sedentary state). If, on the other hand, only acceleration information was considered, it may appear the user is in a non-sedentary state when the user may simply be traveling in an automobile, for example. Alternatively, if only heart rate information was considered, it may appear the user is in a non-sedentary state when the user may simply be experiencing stress at work, having an argument, or watching a suspenseful movie (all typically sedentary activities).
- FIG. 2 generally illustrates a user 200 wearing sensor devices in accordance with one embodiment of the invention.
- the user 200 is wearing accelerometers 202 on his wrists, as well as a heart rate sensor 204 on his chest.
- the positions of the accelerometers 202 on the user 200 may of course vary as long as the accelerometers 202 can accurately obtain information regarding the user's movement.
- the position of the heart rate sensor 204 may vary as long as the heart rate sensor 204 can accurately obtain information regarding the user's heart rate.
- the accelerometer(s) 202, the heart rate sensor 204, and the related computing hardware performing user state classification are all contained in a single device, such as a wrist-worn monitor.
- the apparatus 300 includes a source of vital sign information 310, a source of movement information 312, and a detection module 314.
- the detection module 314 may also be configured to issue an alert regarding the user's activity to the user and/or medical personnel and/or other parties, as discussed below.
- the source of vital sign information 310 may include one or more vital sign sensors 316 and a vital sign feature estimator 318.
- the vital sign sensors 316 may include not only heart rate sensors, but also sensors to obtain information related to the user's respiratory rate, body temperature, blood pressure, and other physiological measures. Like heart rate, these vital signs may elevate when the user is in a non-sedentary state and then decrease when the user is in a sedentary state.
- the raw data obtained by the vital sign sensor(s) 316 may be communicated to the vital sign feature estimator 318.
- the vital sign feature estimator 318 may derive at least one vital sign feature based on the raw data regarding the vital sign(s). For example, the vital sign feature estimator 318 may calculate various derivatives of the measured physiological data, including but not limited to simple averages (means), weighted averages, standard deviations, etc.
- the feature estimator 318 may be implemented as a single pole filter, a multipole filter, a Kalman filter, etc.
- the vital sign feature estimator 318 may take into account resting or baseline values for the various vital signs measured.
- a user may, for example, input their resting heart rate into the apparatus (e.g., via a mobile device such as a smartphone or the like) and the vital sign feature estimator may take this value into account when estimating the vital sign feature.
- the apparatus may then subtract the user's resting heart rate from the user's mean heart rate over a certain period of time to more accurately estimate the vital sign feature.
- the vital sign feature estimator 318 may also estimate the user's resting heart rate based on, e.g., the information obtained by the vital sign sensors 316 during a sedentary period to more accurately estimate the vital sign feature.
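The feature-estimation behavior described above can be sketched in code. This is illustrative only: the single-pole (exponential) filter is one of the implementations the text names, but the smoothing coefficient `alpha` and all class and method names are assumptions, and the running-minimum resting-rate estimate is one plausible reading of the sedentary-period baseline estimation described above.

```python
class VitalSignFeatureEstimator:
    """Sketch of a single-pole (exponential) filter for heart-rate features.

    alpha is a hypothetical smoothing coefficient; the patent does not give
    a value. resting_hr may be user-supplied (e.g., entered via a mobile
    device) or refined from samples taken while the user appears sedentary.
    """

    def __init__(self, alpha=0.1, resting_hr=None):
        self.alpha = alpha
        self.mean_hr = None           # single-pole filtered heart rate
        self.resting_hr = resting_hr  # user-supplied baseline, if any

    def update(self, hr_sample, sedentary_hint=False):
        # Single-pole IIR filter: y[n] = (1 - a) * y[n-1] + a * x[n]
        if self.mean_hr is None:
            self.mean_hr = float(hr_sample)
        else:
            self.mean_hr = (1 - self.alpha) * self.mean_hr + self.alpha * hr_sample
        # Optionally refine the resting-rate estimate during sedentary periods.
        if sedentary_hint:
            if self.resting_hr is None:
                self.resting_hr = self.mean_hr
            else:
                self.resting_hr = min(self.resting_hr, self.mean_hr)
        return self.mean_hr

    def delta_feature(self):
        """Mean heart rate minus resting heart rate, per the text's example."""
        if self.resting_hr is None:
            return None
        return self.mean_hr - self.resting_hr
```

In use, the estimator would be fed heart-rate samples from the vital sign sensor(s) 316, and the detection module would consume `mean_hr` or `delta_feature()` rather than the raw samples.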
- the source of movement information 312 may include one or more movement sensors 320 and an activity feature estimator 322.
- the movement sensors 320 may be accelerometers positioned on various portions of the user's body. For example, there may be an accelerometer positioned on each of the user's wrists (as illustrated in Fig. 2) as well as on the user's legs. The positions of the movement sensor(s) 320 may of course vary as long as they can obtain accurate information regarding the user's movement.
- Other suitable movement sensors include, but are not limited to, magnetometers, gyroscopes, and combinations of movement sensors.
- the raw data obtained by the movement sensor(s) 320 may be communicated to the activity feature estimator 322.
- the activity feature estimator 322 may derive at least one activity feature based on the raw data regarding the user's movement. For example, the activity feature estimator 322 may calculate various derivatives of the measured movement data, including but not limited to simple averages (means), weighted averages, standard deviations, etc.
- the activity feature estimator 322 may be implemented as a single pole filter, a multipole filter, a Kalman filter, etc.
- the vital sign information and the activity information may be communicated to the detection module 314.
- the detection module 314 may analyze the vital sign information in conjunction with the activity information to classify the user's activity (e.g., to determine if the user is in a sedentary state or a non-sedentary state).
- the detection module 314 may apply thresholds to the activity and vital sign information to classify the user's activity. For example, if the user's mean heart rate was above a certain rate for a certain time period and the user's average movement was above a certain level for the same time period, the detection module 314 may determine the user is in a non-sedentary state. If we assume the accelerometer signals x[n], y[n] and z[n] per Eq. 1 are normalized in such a way that the norm of the accelerometer corresponds to 1 g (i.e., the gravitational constant), then a suitable threshold for the activity metric of Eq. 1 when measured on the wrist is 1e-3. A suitable threshold for the mean heart rate may be 85 bpm.
- the detection module 314 may determine the user is in a sedentary state. Similarly, if both the vital sign information and the activity information are below a certain level the detection module 314 may determine the user is in a sedentary state.
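A minimal threshold-based detector along the lines described above might look as follows. Eq. 1 itself is not reproduced in this excerpt, so `activity_metric` below substitutes a common choice, the variance of the gravity-normalized accelerometer magnitude, as a hypothetical stand-in; only the 1e-3 and 85 bpm thresholds come from the text, and the function names are assumptions.

```python
import statistics

ACTIVITY_THRESHOLD = 1e-3    # wrist activity-metric threshold from the text
HEART_RATE_THRESHOLD = 85.0  # mean heart-rate threshold (bpm) from the text

def activity_metric(x, y, z):
    # Hypothetical stand-in for Eq. 1 (not reproduced in this excerpt):
    # variance of the accelerometer magnitude, with signals normalized to 1 g.
    mags = [(xi**2 + yi**2 + zi**2) ** 0.5 for xi, yi, zi in zip(x, y, z)]
    return statistics.pvariance(mags)

def is_non_sedentary(x, y, z, mean_hr):
    # Both the motion condition and the vital-sign condition must hold,
    # mirroring the combined analysis performed by detection module 314.
    return activity_metric(x, y, z) > ACTIVITY_THRESHOLD and mean_hr > HEART_RATE_THRESHOLD
```

Note how the conjunction avoids the single-sensor failure modes discussed earlier: vibration in an automobile (motion without elevated heart rate) and stress while seated (elevated heart rate without motion) both classify as sedentary.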
- the detection module 314 may issue an alert. Typically, a sedentary bout on the order of 15 to 60 minutes may be used to trigger an alert, in line with recent health recommendations.
- the threshold value for an alert may be reduced. For example, if the initial threshold for a single sedentary bout was set to 60 minutes, after a first alert the alert threshold for subsequent bouts may be reduced to, e.g., 45 minutes or less. This alert may be issued to the user and suggest that he or she perform a physical activity (such as going for a quick walk or run). Additionally or alternatively, the alert may be issued to medical personnel who are monitoring the user's activity for various health reasons.
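The escalating-threshold behavior above can be sketched as a small policy object. The 60-minute initial and 45-minute reduced thresholds are the examples given in the text; the class and method names are assumptions.

```python
class SedentaryAlertPolicy:
    """Escalating alert thresholds: the first sedentary bout triggers an
    alert at 60 minutes; after that, subsequent bouts alert at a reduced
    threshold (45 minutes in the text's example)."""

    def __init__(self, initial_minutes=60, reduced_minutes=45):
        self.threshold = initial_minutes
        self.reduced = reduced_minutes

    def check(self, bout_minutes):
        # Return True (and reduce the future threshold) when the current
        # sedentary bout reaches the active threshold.
        if bout_minutes >= self.threshold:
            self.threshold = self.reduced
            return True
        return False
```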
- the detection module 314 may also directly or indirectly take additional follow up actions in response to a determination that the user is in a sedentary or non-sedentary state. Such actions include the transmission of a message, the creation of a calendar entry for performing physical activity, a command to put a computer, television or electronic device into a standby or powered down mode, etc.
- Fig. 4 generally illustrates a user 400 using the apparatus in accordance with one embodiment of the invention.
- the source of vital sign information 410 and the source of movement information 420 are in operable communication with the user 400 to obtain vital sign and movement information, respectively.
- the source of vital sign information 410 and the source of movement information 420 are in operable communication with the detection module 430.
- the detection module 430 may be an app executing on a smartphone, a dedicated desktop or server computer, a cloud-based computer or the like at a location remote from the user, etc.
- the sources of vital sign information 410 and movement information 420 as well as the detection module 430 may be contained in a single device, such as a wrist-worn monitor.
- Embodiments combining sources of vital sign information 410 and movement information 420 in the same device have the advantage of eliminating communications between disparate devices.
- One such embodiment would be a wrist-worn device using PPG technology for heart rate information and accelerometers for motion estimation.
- Another such embodiment would be a chest-worn patch incorporating, e.g., an ECG sensor and an accelerometer.
- Embodiments utilizing PPG sensors may utilize an intermittent sampling strategy to reduce battery drain. Estimating the user's state on, e.g., a minute-by-minute basis may provide sufficient resolution and still require vital sign information with a relatively low data rate. Since heart rate itself does not change that quickly, intermittently sampling the PPG sensor, e.g., for 10 seconds of every minute, after which it would be switched off for another 50 seconds, should supply sufficient information to estimate heart rate information.
- the duration of the intermittent sampling period could be reduced or increased in response to the user's measured motion.
- the PPG sensor could be activated for additional measurements if, e.g., motion sensors indicated that the user was not moving.
- the sampling period could be reduced at high motion levels and increased at low motion levels.
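The motion-adaptive duty cycling described in the last few bullets can be sketched as follows. Only the 10-seconds-on per 60-second cycle comes from the text; the motion cutoffs and the doubling/halving rule are illustrative assumptions, as are the names.

```python
def ppg_on_seconds(motion_level, base_on=10, cycle=60,
                   low_motion=1e-3, high_motion=1e-1):
    """Return how many seconds of each cycle to keep the PPG sensor on.

    The 10 s on / 60 s cycle default comes from the text; the motion
    cutoffs and the doubling/halving rule are hypothetical.
    """
    if motion_level < low_motion:
        # User appears still: heart rate carries the classification, so
        # take additional measurements at low motion levels.
        return min(cycle, base_on * 2)
    if motion_level > high_motion:
        # Vigorous motion: shorten the sampling period to save battery.
        return base_on // 2
    return base_on
```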
- the detection module 430 may analyze the vital sign information and the movement information to determine which type of activity the user 400 is performing as discussed above. If, as discussed above, the detection module 430 determines the user has been in a sedentary state for a certain period of time such that the user's metabolic factors are at risk, the detection module 430 may issue an alert or take other action.
- An alert may be issued to the user's mobile device 440, for example.
- the mobile device 440 may be a fitness tracker, smartphone, laptop or desktop computer, tablet or the like and configured to provide an audio alert, a written alert, and/or a haptic-based alert.
- the user 400 may interpret this alert as an instruction to perform a physical activity.
- all necessary processing and analysis may be done on the user's mobile device 440. That is, all necessary computations (e.g., estimating the vital sign feature(s), estimating the activity feature(s), determining whether the user is in a sedentary state from information, features, etc.) may be performed on the user's mobile device 440 rather than at a remote location.
- the detection module 430 may additionally or alternatively be configured to issue an alert to medical personnel such as those working at a healthcare institution 450. These medical personnel may simply be monitoring the user's health and may be interested in knowing if and when the user is in non-sedentary and sedentary states.
- FIG. 5 illustrates a flowchart of a method 500 of classifying a user's activity in accordance with the present invention.
- step 502 information regarding the user's vital signs is obtained. This information may be first gathered by vital sign sensors operably positioned with respect to the user's body. This information may be processed to estimate at least one vital sign feature as discussed above.
- step 504 information regarding the user's movement is obtained. This information may be gathered by movement sensor(s) operably positioned on at least one portion of the user's body. This information may then be processed to estimate at least one activity feature as discussed previously. Steps 502 and 504 may be performed simultaneously, contemporaneously, sequentially, etc.
- in step 506, the vital sign information, for example, the vital sign feature(s), is analyzed in conjunction with the acceleration information, for example, the activity feature(s), to more accurately classify the user's activity. In one embodiment, a classifier such as a naive Bayes classifier is used, which identifies a user's state as sedentary or non-sedentary based on probabilities.
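A minimal Gaussian naive Bayes classifier of the kind mentioned above could be sketched as follows, here over hypothetical (mean heart rate, activity metric) feature pairs. The class name, the feature choice, and the training data are all assumptions for illustration; the text does not specify them.

```python
import math

class GaussianNaiveBayes:
    """Two-class Gaussian naive Bayes over small feature vectors, e.g.
    (mean heart rate, activity metric). A sketch of the probabilistic
    classifier the text names, not the patented implementation."""

    def fit(self, X, y):
        self.stats = {}
        for label in set(y):
            rows = [x for x, lab in zip(X, y) if lab == label]
            n = len(rows)
            means = [sum(col) / n for col in zip(*rows)]
            # Per-feature variance, floored to avoid division by zero.
            varis = [max(sum((v - m) ** 2 for v in col) / n, 1e-9)
                     for col, m in zip(zip(*rows), means)]
            self.stats[label] = (n / len(y), means, varis)
        return self

    def predict(self, x):
        # Pick the label with the highest log-posterior under independent
        # Gaussian likelihoods (the "naive" independence assumption).
        def log_post(label):
            prior, means, varis = self.stats[label]
            ll = math.log(prior)
            for v, m, var in zip(x, means, varis):
                ll += -0.5 * math.log(2 * math.pi * var) - (v - m) ** 2 / (2 * var)
            return ll
        return max(self.stats, key=log_post)
```

Trained on labeled bouts, the classifier would replace the fixed thresholds with per-class probability models, which can be more robust when heart rate and motion levels vary between users.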
- an action may be taken if it is determined that the user is in a sedentary state.
- An alert may be issued if, for example, it is determined that the user has been in a sedentary state for a certain period of time. The alert may be communicated directly to the user or to anyone else such as medical personnel.
- Other actions include the transmission of a message, the creation of a calendar entry for performing physical activity, a command to put a computer, television or electronic device into a standby or powered down mode, etc.
- the user may be more inclined to perform a physical activity.
- increasing the number of interruptions of sedentary time with activities may lead to health benefits, such as those associated with metabolic risk factors.
- Embodiments of the present disclosure are described above with reference to block diagrams and/or operational illustrations of methods, systems, and computer program products according to embodiments of the present disclosure.
- the functions/acts noted in the blocks may occur out of the order shown in any flowchart.
- two blocks shown in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
- not all of the blocks shown in any flowchart need to be performed and/or executed. For example, if a given flowchart has five blocks containing functions/acts, it may be the case that only three of the five blocks are performed and/or executed. In this example, any three of the five blocks may be performed and/or executed.
- a statement that a value exceeds (or is more than) a first threshold value is equivalent to a statement that the value meets or exceeds a second threshold value that is slightly greater than the first threshold value, e.g., the second threshold value being one value higher than the first threshold value in the resolution of a relevant system.
- a statement that a value is less than (or is within) a first threshold value is equivalent to a statement that the value is less than or equal to a second threshold value that is slightly lower than the first threshold value, e.g., the second threshold value being one value lower than the first threshold value in the resolution of the relevant system.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Physiology (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Engineering & Computer Science (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Cardiology (AREA)
- Physics & Mathematics (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Dentistry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
Abstract
Methods and apparatus for determining when a user is in a sedentary state. By analyzing information related to a user's vital sign(s) and the user's movement, embodiments of the present invention can more accurately classify a user's activity (e.g., detect when the user is in a sedentary or a non-sedentary state). This information may be used to trigger a variety of actions, including communications to medical personnel or communications directly to the user so the user may be incited to perform a physical activity.
Description
USER STATE CLASSIFICATION
TECHNICAL FIELD
[001] This invention generally relates to methods and devices for classifying a user's state and, in particular, classifying a user's state by analyzing the user's vital sign information in conjunction with the user's movement.
BACKGROUND
[002] A sedentary state is a ubiquitous part of present-day lifestyles. Many people perform activities that typically require very low energy in a sitting or prone position. These activities may include reading, watching television, driving a vehicle, and doing office work.
[003] While moderate to vigorous intensity exercise provides numerous health benefits, recent studies suggest that simply interrupting sedentary periods with activity provides benefits related to metabolic risk variables. In other words, the interruptions of sedentary periods with physical activity may lead to benefits in addition to the benefits associated with the activity itself. This may be true regardless of the intensity of the activity, the duration of the activity, and the duration of the sedentary state.
[004] Wearable technologies have become an increasingly common way to monitor one's health. However, these devices are mainly used to monitor the wearer when engaged in moderate to vigorous intensity exercise and may also provide inaccurate feedback. For example, these types of devices may confuse wrist movements with exercise, even though the wearer may only be performing office work.
[005] Accordingly, there is a need for devices and methods that effectively distinguish between a user being in a sedentary or a non-sedentary state.
SUMMARY
[006] This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description section. This summary is not
intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
[007] Embodiments of the present invention relate to methods and apparatus for determining when the user is in a sedentary state. Making such determinations using, e.g., motion alone is difficult, as a wrist-worn accelerometer may make it seem that a user is active when they may just be performing desk work. Accordingly, embodiments of the present invention address this deficiency by considering a source of vital sign information in tandem with motion data. By analyzing information related to a user's vital sign(s) and the user's movement, the apparatus can more accurately determine when a user is active or in a sedentary state. This information may be communicated to medical personnel, or directly to the user so the user may be aware of their sedentary state and be incited to perform a physical activity.
[008] In one aspect, embodiments of the present invention relate to an apparatus for classifying user state having a source of user vital sign information, a source of user movement information, and a detection module that is configured to analyze the vital sign information in conjunction with the user movement information to determine whether the user is in a sedentary state.
[009] In some embodiments of the apparatus, the vital sign information is the user's heart rate. In some embodiments, the vital sign information includes the mean (average) heart rate of the user. In some embodiments, the vital sign information includes the difference between the mean heart rate of the user and the resting heart rate of the user. In some embodiments, the vital sign information includes an estimate of the resting heart rate of the user based on the user's heart rate.
[0010] In some embodiments, the source of the user vital sign information includes a PPG sensor. In some embodiments, the source of user movement information includes an accelerometer that is operably connected to at least one portion of the user. In some embodiments, the user vital sign information is one or more of the user's heart rate, respiratory rate, etc.
[0011] In some embodiments, the apparatus further includes an alert device in operable communication with the detection module for issuing an alert that the user is in a sedentary state.
This alert may be issued when the user is in the sedentary state for a predetermined period of time, for example.
[0012] In some embodiments, the source of vital sign information includes a vital sign feature estimator for processing the vital sign information before analysis by the detection module. In some embodiments, the source of user movement information may include an activity feature estimator for processing the movement information before analysis by the detection module.
[0013] In another aspect, embodiments of the present invention relate to a method of classifying a state of a user. The method includes obtaining information that is generally related to at least one vital sign of the user, obtaining information that is generally related to movement of at least one portion of the user, analyzing the vital sign information in conjunction with the user movement information to determine whether the user is in a sedentary state, and issuing an alert when it is determined that the user is in a sedentary state.
[0014] In some embodiments, the vital sign information is the user's heart rate. In some embodiments, the vital sign information includes the mean (average) heart rate of the user. In some embodiments, the vital sign information includes the difference between the mean heart rate of the user and a resting heart rate of the user. In some embodiments, the vital sign information includes an estimate of a resting heart rate of the user based on the user's heart rate.
[0015] In some embodiments, the vital sign information is obtained from a PPG sensor. In some embodiments, the user movement information is obtained from an accelerometer operably connected to at least one portion of the user. In some embodiments, the vital sign information is one or more of the user's heart rate, respiratory rate, body temperature, and blood pressure.
[0016] In yet another aspect, embodiments of the present invention relate to an apparatus for classifying an activity of a user having a first device for obtaining information that is generally related to the user's heart rate, a heart rate feature estimator for estimating at least one heart rate feature based on the heart rate information, a second device for obtaining information that is generally related to movement of at least one portion of the user, an activity feature estimator for estimating at least one activity feature based on the movement information, and a sedentary state detection module to analyze the heart rate feature in conjunction with the activity feature to determine whether the user is in a sedentary state.
[0017] These and other features and advantages, which characterize the present non-limiting embodiments, will be apparent from a reading of the following detailed description and a review of the associated drawings. It is to be understood that both the foregoing general description and the following detailed description are explanatory only and are not restrictive of the non-limiting embodiments as claimed.
BRIEF DESCRIPTION OF DRAWINGS
[0018] The invention and embodiments thereof will be better understood when the following detailed description is read in conjunction with the accompanying drawing figures:
[0019] FIG. 1 generally illustrates movement data and heart rate data of a user in accordance with one embodiment of the invention;
[0020] FIG. 2 generally illustrates a user wearing a vital sign sensor device and acceleration sensor devices in accordance with one embodiment of the invention;
[0021] FIG. 3 schematically illustrates a block diagram of an apparatus in accordance with one embodiment of the invention;
[0022] FIG. 4 schematically illustrates an embodiment of the apparatus and a user in accordance with one embodiment of the invention; and
[0023] FIG. 5 generally illustrates a flow chart of a method of classifying the activity of a user in accordance with one embodiment of the invention.
[0024] In the drawings, like reference characters generally refer to corresponding parts throughout the different views. Elements are not necessarily drawn to scale, emphasis instead being placed on the principles and concepts of operation.
DETAILED DESCRIPTION
[0025] Various embodiments are described more fully below with reference to the accompanying drawings, which form a part hereof, and which show specific exemplary embodiments. However, the concepts of the present disclosure may be implemented in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided as part of a thorough and complete disclosure, to fully
convey the scope of the concepts, techniques and implementations of the present disclosure to those skilled in the art. Embodiments may be practiced as methods, systems or devices. Accordingly, embodiments may take the form of a hardware implementation, an entirely software implementation or an implementation combining software and hardware aspects. The following detailed description is, therefore, not to be taken in a limiting sense.
[0026] Reference in the specification to "one embodiment" or to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiments is included in at least one example implementation or technique in accordance with the present disclosure. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment.
[0027] Some portions of the description that follow are presented in terms of symbolic representations of operations on non-transient signals stored within a computer memory. These descriptions and representations are used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. Such operations typically require physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical, magnetic or optical signals capable of being stored, transferred, combined, compared and otherwise manipulated. It is convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. Furthermore, it is also convenient at times, to refer to certain arrangements of steps requiring physical manipulations of physical quantities as modules or code devices, without loss of generality.
[0028] However, all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as "processing" or "computing" or "calculating" or "determining" or "displaying" or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system memories or registers or other such information storage, transmission or display devices. Portions of the present disclosure include processes and instructions that may be embodied in software, firmware or
hardware, and when embodied in software, may be downloaded to reside on and be operated from different platforms used by a variety of operating systems.
[0029] The present disclosure also relates to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, application specific integrated circuits (ASICs), or any type of media suitable for storing electronic instructions, and each may be coupled to a computer system bus. Furthermore, the computers referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
[0030] The processes and displays presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may also be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform one or more method steps. The structure for a variety of these systems is discussed in the description below. In addition, any particular programming language that is sufficient for achieving the techniques and implementations of the present disclosure may be used. A variety of programming languages may be used to implement the present disclosure as discussed herein.
[0031] In addition, the language used in the specification has been principally selected for readability and instructional purposes and may not have been selected to delineate or circumscribe the disclosed subject matter. Accordingly, the present disclosure is intended to be illustrative, and not limiting, of the scope of the concepts discussed herein.
[0032] FIG. 1 generally shows graphical illustrations of a user's movement information 110 and the user's heart rate information 120 in accordance with one embodiment of the invention. The movement information may be obtained from one or more accelerometers placed on at least one portion of the user's body (e.g., the user's wrist(s)). The user's heart rate information is in
beats per minute (bpm) and may be obtained from a heart rate sensor device such as a photoplethysmogram (PPG) sensor or an electrocardiogram (ECG) sensor.
[0033] The graphical illustrations 110 and 120 show information regarding the user's acceleration and heart rate in substantially real time over the same time period. As indicated on both illustrations 110 and 120, the user enters a non-sedentary state (for example, sprinting) at roughly the 525-second mark (indicated by the "activity start" label). The performance of this activity is accompanied by a large amount of acceleration and deceleration (as shown in illustration 110) and an increase in the user's heart rate (as shown in illustration 120).
[0034] Based on the illustrations 110 and 120, the user performs this activity for roughly 50 seconds. When the user stops performing this activity (indicated by the "activity stop" label), the amount of acceleration and the user's heart rate decrease to pre-activity levels: substantially no acceleration, and a heart rate at levels that suggest the user is again in a sedentary state.
[0035] By analyzing the heart rate information in conjunction with the movement information, embodiments of the present invention can more accurately determine when the user is in a sedentary state (or a non-sedentary state). If, on the other hand, only acceleration information were considered, it might appear the user is in a non-sedentary state when the user is simply traveling in an automobile, for example. Alternatively, if only heart rate information were considered, it might appear the user is in a non-sedentary state when the user is simply experiencing stress at work, having an argument, or watching a suspenseful movie (all typically sedentary activities).
[0036] FIG. 2 generally illustrates a user 200 wearing sensor devices in accordance with one embodiment of the invention. The user 200 is wearing accelerometers 202 on his wrists, as well as a heart rate sensor 204 on his chest. The positions of the accelerometers 202 on the user 200 may of course vary as long as the accelerometers 202 can accurately obtain information regarding the user's movement. The position of the heart rate sensor 204 may vary as long as the heart rate sensor 204 can accurately obtain information regarding the user's heart rate. In other embodiments, the accelerometer(s) 202, the heart rate sensor 204, and the related computing hardware performing user state classification are all contained in a single device, such as a wrist-worn monitor.
[0037] FIG. 3 schematically illustrates an apparatus 300 in accordance with one embodiment of the invention. The apparatus 300 includes a source of vital sign information 310, a source of movement information 312, and a detection module 314. The detection module 314 may also be configured to issue an alert regarding the user's activity to the user and/or medical personnel and/or other parties, as discussed below.
[0038] The source of vital sign information 310 may include one or more vital sign sensors 316 and a vital sign feature estimator 318. The vital sign sensors 316 may include not only heart rate sensors, but also sensors to obtain information related to the user's respiratory rate, body temperature, blood pressure, and other physiological measures. Like heart rate, these vital signs may rise when the user is in a non-sedentary state and then decrease when the user is in a sedentary state.
[0039] The raw data obtained by the vital sign sensor(s) 316 may be communicated to the vital sign feature estimator 318. The vital sign feature estimator 318 may derive at least one vital sign feature based on the raw data regarding the vital sign(s). For example, the vital sign feature estimator 318 may calculate various derivatives of the measured physiological data, including but not limited to simple averages (i.e., means), weighted averages, standard deviations, etc. In some embodiments, the feature estimator 318 may be implemented as a single-pole filter, a multi-pole filter, a Kalman filter, etc.
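As an illustrative sketch (not part of the specification), a single-pole filter of the kind mentioned above can be realized as a simple exponential moving average; the smoothing factor `alpha` below is an assumed parameter, not a value from the text:

```python
def single_pole_filter(samples, alpha=0.5, initial=None):
    """Exponentially smooth a stream of vital sign samples.

    A single-pole (first-order IIR) filter: each output is a weighted
    blend of the newest sample and the previous estimate.
    """
    estimate = initial
    out = []
    for s in samples:
        # Seed with the first sample, then blend subsequent ones.
        estimate = s if estimate is None else alpha * s + (1 - alpha) * estimate
        out.append(estimate)
    return out

# Smoothing a noisy heart rate stream (bpm)
smoothed = single_pole_filter([70, 90, 70, 72, 68], alpha=0.5)
```

A smaller `alpha` tracks the baseline more slowly but rejects more noise; the choice is a design trade-off, not something the specification fixes.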
[0040] Similarly, the vital sign feature estimator 318 may take into account resting or baseline values for the various vital signs measured. A user may, for example, input their resting heart rate into the apparatus (e.g., via a mobile device such as a smartphone or the like), and the vital sign feature estimator may take this value into account when estimating the vital sign feature. The apparatus may then subtract the user's resting heart rate from the user's mean heart rate over a certain period of time to more accurately estimate the vital sign feature. The vital sign feature estimator 318 may also estimate the user's resting heart rate based on, e.g., the information obtained by the vital sign sensors 316 during a sedentary period, again to more accurately estimate the vital sign feature.
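A minimal sketch of this baseline-relative feature (illustrative only; the fallback of using the window minimum as a stand-in for a sedentary-period resting rate is an assumption, not from the text):

```python
def heart_rate_feature(hr_samples, resting_hr=None):
    """Vital sign feature: mean heart rate minus a resting baseline.

    If the user has not supplied a resting rate, approximate it from the
    lowest rate observed in the window, as a crude stand-in for an
    estimate taken during a sedentary period.
    """
    mean_hr = sum(hr_samples) / len(hr_samples)
    if resting_hr is None:
        resting_hr = min(hr_samples)  # assumed proxy for a sedentary baseline
    return mean_hr - resting_hr

# User-supplied resting rate of 60 bpm; window mean is 78 bpm
feature = heart_rate_feature([62, 64, 90, 96], resting_hr=60)
```

The resulting feature is an elevation above baseline, so it transfers better across users with different resting rates than the raw mean would.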
[0041] The source of movement information 312 may include one or more movement sensors 320 and an activity feature estimator 322. The movement sensors 320 may be accelerometers positioned on various portions of the user's body. For example, there may be an
accelerometer positioned on each of the user's wrists (as illustrated in FIG. 2) as well as on the user's legs. The positions of the movement sensor(s) 320 may of course vary as long as they can obtain accurate information regarding the user's movement. Other suitable movement sensors include, but are not limited to, magnetometers, gyroscopes, combinations of movement sensors, etc.
[0042] The raw data obtained by the movement sensor(s) 320 may be communicated to the activity feature estimator 322. The activity feature estimator 322 may derive at least one activity feature based on the raw data regarding the user's movement. For example, the activity feature estimator 322 may calculate various derivatives of the measured movement data, including but not limited to simple averages (i.e., means), weighted averages, standard deviations, etc. In some embodiments, the activity feature estimator 322 may be implemented as a single-pole filter, a multi-pole filter, a Kalman filter, etc.
[0043] One example activity feature is the unbiased power of the accelerometer signal captured over an interval of time. Assume a three-dimensional accelerometer signal described by the three axes x[n], y[n] and z[n], where n indicates the sample index, sampled at a certain sampling frequency fs, e.g., fs = 100 Hz. Then, a measure of the amount of activity for each second of data, indicated by index k, may be calculated as:

A[k] = (1/fs) * sum over n = k*fs .. (k+1)*fs - 1 of [ (x[n] - mx[k])^2 + (y[n] - my[k])^2 + (z[n] - mz[k])^2 ]    (Eq. 1)

where mx[k], my[k] and mz[k] denote the per-axis means over the k-th one-second window. Subtracting the per-axis means removes the constant gravity offset, which is why the power estimate is referred to as unbiased.
For a wrist-worn source of motion data, the disadvantage of this metric is that it also produces relatively high values when a person is moving his or her wrist while remaining sedentary, e.g., during desk work. Using this type of metric alone, it can therefore be difficult to distinguish whether a user is sedentary.
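As an illustrative sketch (not part of the specification), the per-second unbiased power described above can be computed by removing each axis's one-second mean and averaging the squared residuals over the window:

```python
def activity_metric(x, y, z, fs=100):
    """Unbiased power of a three-axis accelerometer, per one-second window.

    x, y, z are equal-length sample sequences at sampling rate fs. The
    per-axis mean of each window is subtracted first, so a constant
    gravity offset contributes nothing to the result.
    """
    metrics = []
    for k in range(len(x) // fs):
        window = slice(k * fs, (k + 1) * fs)
        power = 0.0
        for axis in (x[window], y[window], z[window]):
            mean = sum(axis) / fs
            power += sum((s - mean) ** 2 for s in axis)
        metrics.append(power / fs)
    return metrics

# A perfectly still sensor (gravity only) yields zero activity
still = activity_metric([1.0] * 100, [0.0] * 100, [0.0] * 100)
```

Note that, as the paragraph above points out, wrist motion during desk work also drives this metric up, which is why it is combined with a vital sign feature.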
[0044] The vital sign information and the activity information may be communicated to the detection module 314. The detection module 314 may analyze the vital sign information in conjunction with the activity information to classify the user's activity (e.g., to determine if the user is in a sedentary state or a non-sedentary state).
[0045] In its simplest form, the detection module 314 may apply thresholds to the activity and vital sign information to classify the user's activity. For example, if the user's mean heart rate was above a certain rate for a certain time period and the user's average movement was above a certain level for the same time period, the detection module 314 may determine the user is in a non-sedentary state. If we assume the accelerometer signals x[n], y[n] and z[n] of Eq. 1 are normalized in such a way that the norm of the accelerometer output corresponds to 1 g (i.e., the gravitational constant), then a suitable threshold for the activity metric of Eq. 1 when measured on the wrist is 1e-3. A suitable threshold for the mean heart rate may be 85 bpm.
[0046] On the other hand, if only the user's activity information was above a certain value for a period of time (but the user's vital sign information was not), the detection module 314 may determine the user is in a sedentary state. Similarly, if both the vital sign information and the activity information are below a certain level the detection module 314 may determine the user is in a sedentary state.
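As a minimal sketch of the two-threshold rule in the preceding paragraphs (the default thresholds of 85 bpm and 1e-3 come from the text; everything else is illustrative):

```python
def classify_state(mean_hr, activity, hr_threshold=85.0, act_threshold=1e-3):
    """Classify an epoch as sedentary or non-sedentary by thresholding.

    Non-sedentary only when BOTH the mean heart rate and the activity
    metric exceed their thresholds over the evaluation window; a high
    value on only one channel (e.g., wrist motion at a desk, or an
    elevated heart rate while watching a suspenseful movie) still
    classifies as sedentary.
    """
    if mean_hr > hr_threshold and activity > act_threshold:
        return "non-sedentary"
    return "sedentary"
```

The conjunction is the point: requiring agreement between the two channels suppresses the false positives that either sensor produces on its own.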
[0047] If the detection module 314 determines the user is in a sedentary state (e.g., both the vital sign feature(s) and the activity feature are below a threshold for a certain period of time), the detection module 314 may issue an alert. Typically a sedentary bout on the order of 15 to 60 minutes may be used to trigger an alert, in line with recent health recommendations. In addition, if multiple sedentary bouts are detected, the threshold value for an alert may be reduced. For example, if the initial threshold for a single sedentary bout was set to 60 minutes, after a first alert the threshold for subsequent bouts may be reduced to, e.g., 45 minutes or less. This alert may be issued to the user himself and suggest that he perform a physical activity (such as going for a quick walk or run). Additionally or alternatively, the alert may be issued to medical personnel who are monitoring the user's activity for various health reasons.
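The bout-alert scheme above might be sketched as follows (the 60- and 45-minute values are the examples given in the text; the list-of-bouts interface is an assumption for illustration):

```python
def sedentary_alert_times(bout_minutes, initial_threshold=60, reduced_threshold=45):
    """Return the indices of sedentary bouts that should trigger an alert.

    The first alert requires a bout at least `initial_threshold` minutes
    long; once any alert has fired, later bouts alert at the shorter
    `reduced_threshold`, per the scheme described above.
    """
    alerts = []
    threshold = initial_threshold
    for i, minutes in enumerate(bout_minutes):
        if minutes >= threshold:
            alerts.append(i)
            threshold = reduced_threshold  # tighten after the first alert
    return alerts

# Bouts of 30, 65, 50 and 40 minutes: the 65- and 50-minute bouts alert
alerted = sedentary_alert_times([30, 65, 50, 40])
```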
[0048] The detection module 314 may also directly or indirectly take additional follow up actions in response to a determination that the user is in a sedentary or non-sedentary state. Such actions include the transmission of a message, the creation of a calendar entry for performing
physical activity, a command to put a computer, television or electronic device into a standby or powered down mode, etc.
[0049] Fig. 4 generally illustrates a user 400 using the apparatus in accordance with one embodiment of the invention. The source of vital sign information 410 and the source of movement information 420 are in operable communication with the user 400 to obtain vital sign and movement information, respectively.
[0050] The source of vital sign information 410 and the source of movement information 420 are in operable communication with the detection module 430. In various embodiments, the detection module 430 may be an app executing on a smartphone, a dedicated desktop or server computer, a cloud-based computer or the like at a location remote from the user, etc. In other embodiments, the sources of vital sign information 410 and movement information 420 as well as the detection module 430 may be contained in a single device, such as a wrist-worn monitor.
[0051] Embodiments combining sources of vital sign information 410 and movement information 420 in the same device have the advantage of eliminating communications between disparate devices. One such embodiment would be a wrist-worn device using PPG technology for heart rate information and accelerometers for motion estimation. Another such embodiment would be a chest-worn patch incorporating, e.g., an ECG sensor and an accelerometer.
[0052] Embodiments utilizing PPG sensors may utilize an intermittent sampling strategy to reduce battery drain. Estimating the user's state on, e.g., a minute-by-minute basis may provide sufficient resolution while requiring vital sign information at only a relatively low data rate. Since heart rate itself does not change that quickly, intermittently sampling the PPG sensor (e.g., on for 10 seconds of every minute and switched off for the remaining 50 seconds) should supply sufficient information to estimate the user's heart rate.
[0053] In various embodiments, the duration of the intermittent sampling period could be reduced or increased in response to the user's measured motion. For example, the PPG sensor could be activated for additional measurements if, e.g., motion sensors indicated that the user was not moving. Similarly, the sampling period could be reduced at high motion levels and increased at low motion levels.
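A sketch of the motion-adaptive duty cycle described above (the 10-seconds-per-minute baseline is from the text; the motion thresholds and the doubling/halving policy are assumptions for illustration):

```python
def ppg_on_seconds(activity_level, base_on=10, period=60,
                   low_motion=1e-4, high_motion=1e-2):
    """Choose the PPG on-time per sampling period from measured motion.

    Baseline: the sensor is on for `base_on` seconds of every `period`
    (e.g., 10 s per minute). At very low motion the on-time is lengthened
    to confirm a possible sedentary state; at high motion it is shortened
    to save battery, since the user is clearly active.
    """
    if activity_level < low_motion:
        return min(period, base_on * 2)  # user may be sedentary: sample more
    if activity_level > high_motion:
        return max(1, base_on // 2)      # clearly active: sample less
    return base_on
```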
[0054] The detection module 430 may analyze the vital sign information and the movement information to determine which type of activity the user 400 is performing as discussed above. If, as discussed above, the detection module 430 determines the user has been in a sedentary state for a certain period of time such that the user's metabolic factors are at risk, the detection module 430 may issue an alert or take other action.
[0055] An alert may be issued to the user's mobile device 440, for example. The mobile device 440 may be a fitness tracker, smartphone, laptop or desktop computer, tablet or the like and configured to provide an audio alert, a written alert, and/or a haptic-based alert. The user 400 may interpret this alert as an instruction to perform a physical activity.
[0056] It is also contemplated that all necessary processing and analysis may be done on the user's mobile device 440. That is, all necessary computations (e.g., estimating the vital sign feature(s), estimating the activity feature(s), and determining whether the user is in a sedentary state from that information and those features) may be performed on the user's mobile device 440 rather than at a remote location.
[0057] The detection module 430 may additionally or alternatively be configured to issue an alert to medical personnel such as those working at a healthcare institution 450. These medical personnel may simply be monitoring the user's health and may be interested in knowing if and when the user is in non-sedentary and sedentary states.
[0058] FIG. 5 illustrates a flowchart of a method 500 of classifying a user's activity in accordance with one embodiment of the invention. In step 502, information regarding the user's vital signs is obtained. This information may first be gathered by vital sign sensors operably positioned with respect to the user's body. This information may then be processed to estimate at least one vital sign feature as discussed above.
[0059] In step 504, information regarding the user's movement is obtained. This information may be gathered by movement sensor(s) operably positioned on at least one portion of the user's body. This information may then be processed to estimate at least one activity feature as discussed previously. Steps 502 and 504 may be performed simultaneously, contemporaneously, sequentially, etc.
[0060] In step 506, the vital sign information, for example, the vital sign feature(s), is analyzed in conjunction with the acceleration information, for example, the activity feature(s), to more accurately classify the user's activity. In one embodiment, a classifier such as a naive Bayes classifier is used, which identifies a user's state as sedentary or non-sedentary based on probabilities. Both the vital sign feature(s), e.g., heart rate, and the activity feature(s), e.g., a movement-related metric, individually return probabilities as to whether the subject is sedentary. These individual probabilities can then be combined into a single probability. If the probability that the subject is sedentary is higher than the probability that the subject is not sedentary, then the current epoch of data may be classified as associated with a sedentary state, and vice versa.
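The probability combination in step 506 might be sketched as follows, assuming a uniform prior over the two states and conditional independence of the two features given the state (the standard naive Bayes assumptions; neither value is specified in the text):

```python
def combine_sedentary_probability(p_hr, p_act):
    """Naive Bayes fusion of two per-feature sedentary probabilities.

    p_hr and p_act are each P(sedentary | feature). With a uniform prior
    and conditionally independent features, the joint posterior is the
    product of per-feature probabilities, renormalized over both states.
    """
    sed = p_hr * p_act
    non_sed = (1 - p_hr) * (1 - p_act)
    return sed / (sed + non_sed)

def classify_epoch(p_hr, p_act):
    """Label the epoch sedentary when the combined probability wins."""
    return "sedentary" if combine_sedentary_probability(p_hr, p_act) > 0.5 else "non-sedentary"
```

When the two features agree, the combined probability is more extreme than either alone; when they disagree, the more confident feature dominates, mirroring the "analyze in conjunction" behavior the specification describes.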
[0061] In step 508, an action may be taken if it is determined that the user is in a sedentary state. An alert may be issued if, for example, it is determined that the user has been in a sedentary state for a certain period of time. The alert may be communicated directly to the user or to anyone else such as medical personnel. Other actions include the transmission of a message, the creation of a calendar entry for performing physical activity, a command to put a computer, television or electronic device into a standby or powered down mode, etc.
[0062] After the action is taken, the user may be more inclined to perform a physical activity. As stated previously, increasing the number of interruptions of sedentary time with activities may lead to health benefits, such as those associated with metabolic risk factors.
[0063] Various modifications of the exemplary embodiments described above will be apparent to those skilled in the art. For example, other types of sensors may be used to gather information regarding the user's vital signs, movement, and acceleration. Similarly, the thresholds for determining whether the user is in a sedentary state may vary, as may the features used to make that determination.
[0064] In the claims, the use of the verb "comprise" and its conjugations does not exclude the presence of elements or steps other than those stated in a claim. The article "a" or "an" preceding an element does not exclude the presence of a plurality of such elements.
[0065] The methods, systems, and devices discussed above are examples. Various configurations may omit, substitute, or add various procedures or components as appropriate. For instance, in alternative configurations, the methods may be performed in an order different from that described, and various steps may be added, omitted, combined, or performed contemporaneously. Also, features described with respect to certain configurations may be combined in various other configurations. Different aspects and elements of the configurations may be combined in a similar manner. Also, technology evolves and, thus, many of the elements are examples and do not limit the scope of the disclosure or claims.
[0066] Embodiments of the present disclosure, for example, are described above with reference to block diagrams and/or operational illustrations of methods, systems, and computer program products according to embodiments of the present disclosure. The functions/acts noted in the blocks may occur out of the order shown in any flowchart. For example, two blocks shown in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Additionally, or alternatively, not all of the blocks shown in any flowchart need to be performed and/or executed. For example, if a given flowchart has five blocks containing functions/acts, it may be the case that only three of the five blocks are performed and/or executed. In this example, any three of the five blocks may be performed and/or executed.
[0067] A statement that a value exceeds (or is more than) a first threshold value is equivalent to a statement that the value meets or exceeds a second threshold value that is slightly greater than the first threshold value, e.g., the second threshold value being one value higher than the first threshold value in the resolution of a relevant system. A statement that a value is less than (or is within) a first threshold value is equivalent to a statement that the value is less than or equal to a second threshold value that is slightly lower than the first threshold value, e.g., the second threshold value being one value lower than the first threshold value in the resolution of the relevant system.
[0068] Specific details are given in the description to provide a thorough understanding of example configurations (including implementations). However, configurations may be practiced without these specific details. For example, well-known circuits, processes, algorithms, structures, and techniques have been shown without unnecessary detail in order to avoid obscuring the configurations. This description provides example configurations only, and does not limit the scope, applicability, or configurations of the claims. Rather, the preceding description of the configurations will provide those skilled in the art with an enabling description
for implementing described techniques. Various changes may be made in the function and arrangement of elements without departing from the spirit or scope of the disclosure.
[0069] Having described several example configurations, various modifications, alternative constructions, and equivalents may be used without departing from the spirit of the disclosure. For example, the above elements may be components of a larger system, wherein other rules may take precedence over or otherwise modify the application of various implementations or techniques of the present disclosure. Also, a number of steps may be undertaken before, during, or after the above elements are considered.
[0070] Having been provided with the description and illustration of the present application, one skilled in the art may envision variations, modifications, and alternate embodiments falling within the general inventive concept discussed in this application that do not depart from the scope of the following claims.
Claims
1. An apparatus for classifying user state, the apparatus comprising:
a source of user vital sign information;
a source of user movement information; and
a detection module that is configured to analyze the vital sign information in conjunction with the user movement information to determine whether the user is in a sedentary state.
2. The apparatus of claim 1, wherein the vital sign information is the user's heart rate.
3. The apparatus of claim 2, wherein the vital sign information includes a mean heart rate of the user.
4. The apparatus of claim 2, wherein the vital sign information includes the difference between a mean heart rate of the user and a resting heart rate of the user.
5. The apparatus of claim 2, wherein the vital sign information includes an estimate of a resting heart rate of the user based on the user's heart rate.
6. The apparatus of claim 2, wherein the source of user vital sign information includes a PPG sensor.
7. The apparatus of claim 1, wherein the source of user movement information includes an accelerometer that is operably connected to at least one portion of the user.
8. The apparatus of claim 1, wherein the user vital sign information is one or more of heart rate, respiratory rate, body temperature, and blood pressure.
9. The apparatus of claim 1, further comprising an alert device in operable communication with the sedentary state detection module for communicating an alert that the user is in a sedentary state.
10. The apparatus of claim 9, wherein the alert device communicates the alert when the user is in the sedentary state for a predetermined period of time.
11. The apparatus of claim 1, wherein the source of vital sign information includes a vital sign feature estimator for processing the vital sign information and the source of user movement information includes an activity feature estimator for processing the movement information before analysis by the detection module.
12. A method of classifying an activity of a user, the method comprising:
obtaining information that is generally related to at least one vital sign of the user;
obtaining information that is generally related to acceleration of at least one portion of the user;
analyzing the vital sign information in conjunction with the user movement information to determine whether the user is in a sedentary state; and
issuing an alert when it is determined that the user is in a sedentary state.
13. The method of claim 12, wherein the vital sign information is the user's heart rate.
14. The method of claim 13, wherein the vital sign information includes a mean heart rate of the user.
15. The method of claim 13, wherein the vital sign information includes the difference between a mean heart rate of the user and a resting heart rate of the user.
16. The method of claim 13, wherein the vital sign information includes an estimate of a resting heart rate of the user based on the user's heart rate.
17. The method of claim 13, wherein the vital sign information is obtained from a PPG sensor.
18. The method of claim 12, wherein the user movement information is obtained from an accelerometer operably connected to the at least one portion of the user.
19. The method of claim 12, wherein the user vital sign information is one or more of heart rate, respiratory rate, body temperature, and blood pressure.
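Claims 14–16 name three heart-rate features: a mean heart rate, the difference between the mean and a resting heart rate, and an estimate of the resting heart rate derived from the heart-rate signal itself. The sketch below computes all three from a window of samples; the low-percentile resting-rate estimator is an assumed heuristic, since the claims do not fix a particular estimator.

```python
from statistics import mean

def heart_rate_features(hr_samples, resting_hr=None):
    """Compute the heart-rate features of claims 14-16 from a window of
    heart-rate readings (bpm), e.g. from a PPG sensor (claim 17).
    If no resting heart rate is supplied, one is estimated as a low
    percentile of the samples (an assumption, not the claimed method)."""
    mean_hr = mean(hr_samples)                                  # claim 14
    if resting_hr is None:
        # Assumed estimator for claim 16: roughly the 10th percentile.
        resting_hr = sorted(hr_samples)[len(hr_samples) // 10]
    delta_hr = mean_hr - resting_hr                             # claim 15
    return {"mean_hr": mean_hr, "resting_hr": resting_hr, "delta_hr": delta_hr}
```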
20. An apparatus for classifying an activity of a user, the apparatus comprising:
a first device for obtaining information that is generally related to the user's heart rate;
a heart rate feature estimator in operable communication with the first device for estimating at least one heart rate feature based on the information that is generally related to the user's heart rate;
a second device for obtaining information that is generally related to movement of at least one portion of the user;
an activity feature estimator in operable communication with the second device for estimating at least one activity feature based on the information that is generally related to the movement of the at least one portion of the user; and
a sedentary state detection module that is configured to analyze the at least one heart rate feature in conjunction with the at least one activity feature to determine whether the user is in a sedentary state.
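Claim 20 combines an activity feature (from body-worn accelerometer data) with a heart-rate feature in a sedentary state detection module. A minimal sketch of that combination is below; the mean-magnitude activity feature and the two thresholds are illustrative assumptions, as the claim does not specify a particular feature or decision rule.

```python
import math

def activity_feature(accel_samples):
    """Assumed activity feature: mean deviation of acceleration magnitude
    from gravity, over (x, y, z) samples in m/s^2 from a body-worn
    accelerometer. A stand-in for the activity feature estimator of
    claim 20."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in accel_samples]
    return sum(abs(m - 9.81) for m in mags) / len(mags)

def is_sedentary(mean_hr, resting_hr, activity, hr_margin=10.0, activity_max=0.5):
    """Assumed decision rule for the sedentary state detection module:
    classify the user as sedentary when movement is low AND the mean
    heart rate sits near its resting level. Thresholds are illustrative."""
    return activity < activity_max and (mean_hr - resting_hr) < hr_margin
```

Requiring both conditions reflects the point of analyzing the features in conjunction: low movement alone could be, say, stationary cycling, where an elevated heart rate would still indicate activity.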
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201462058130P | 2014-10-01 | 2014-10-01 | |
| US62/058,130 | 2014-10-01 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2016051379A1 (en) | 2016-04-07 |
Family
ID=54337833
Family Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/IB2015/057532 (WO2016051379A1, Ceased) | 2014-10-01 | 2015-10-01 | User state classification |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2016051379A1 (en) |
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8460197B1 (en) * | 2011-06-13 | 2013-06-11 | Impact Sports Technologies, Inc. | Monitoring device with a pedometer |
| US20140121471A1 (en) * | 2012-10-26 | 2014-05-01 | Nike, Inc. | Athletic Performance Monitoring System Utilizing Heart Rate Information |
Non-Patent Citations (2)
| Title |
|---|
| CHEN SHANSHAN ET AL: "Unsupervised activity clustering to estimate energy expenditure with a single body sensor", 2013 IEEE INTERNATIONAL CONFERENCE ON BODY SENSOR NETWORKS, IEEE, 6 May 2013 (2013-05-06), pages 1 - 6, XP032474391, ISSN: 2325-1425, ISBN: 978-1-4799-0331-3, [retrieved on 20130806], DOI: 10.1109/BSN.2013.6575500 * |
| FERNANDO GARCÍA-GARCÍA ET AL: "Statistical Machine Learning for Automatic Assessment of Physical Activity Intensity Using Multi-axial Accelerometry and Heart Rate", 2 July 2011, ARTIFICIAL INTELLIGENCE IN MEDICINE, SPRINGER BERLIN HEIDELBERG, BERLIN, HEIDELBERG, PAGE(S) 70 - 79, ISBN: 978-3-642-22217-7, XP019164406 * |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2018092398A1 (en) * | 2016-11-15 | 2018-05-24 | ソニー株式会社 | Information processing device and program |
| US10863925B2 (en) | 2016-11-15 | 2020-12-15 | Sony Corporation | Information processing device |
| CN115969346A (en) * | 2023-01-31 | 2023-04-18 | 深圳市爱都科技有限公司 | Sedentary detection method, device, equipment and medium |
| CN115969346B (en) * | 2023-01-31 | 2024-05-28 | 深圳市爱都科技有限公司 | Sedentary detection method, sedentary detection device, sedentary detection equipment and sedentary detection medium |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10390730B1 (en) | Methods, systems, and devices for determining a respiration rate | |
| US9766370B2 (en) | Wrist-detection algorithm | |
| US10383322B2 (en) | Fishing and sailing activity detection | |
| US20180242907A1 (en) | Determining metabolic parameters using wearables | |
| US20180116607A1 (en) | Wearable monitoring device | |
| US11504068B2 (en) | Methods, systems, and media for predicting sensor measurement quality | |
| US10945675B2 (en) | Determining a health status for a user | |
| US20160374578A1 (en) | Contextual heart health monitoring with integrated ecg (electrocardiogram) | |
| EP2845539B1 (en) | Device and method for automatically normalizing the physiological signals of a living being | |
| BR112021005414A2 (en) | system and method for integrating emotion data on the social networking platform and sharing the emotion data on the social networking platform | |
| EP3393345B1 (en) | Method and apparatus for detecting live tissues using signal analysis | |
| WO2016051379A1 (en) | User state classification | |
| Aras et al. | GreenMonitor: Extending battery life for continuous heart rate monitoring in smartwatches | |
| US20230106138A1 (en) | System and Method for Detecting and Predicting Surgical Wound Infections | |
| US20230113324A1 (en) | Medical information processing device, medical information processing method, and storage medium | |
| US20190074076A1 (en) | Health habit management | |
| Huang et al. | Efficient fall detection method using time-of-flight sensors and decision tree model | |
| JP7180259B2 (en) | Biological information analysis device, biological information analysis method, and biological information analysis system | |
| US20220015717A1 (en) | Activity State Analysis Device, Activity State Analysis Method and Activity State Analysis System | |
| Amor et al. | Detecting and analyzing activity levels for the wrist wearable unit in the USEFIL project | |
| US20240194359A1 (en) | Methods and systems for transmitting medical information according to prioritization criteria | |
| EP3838141A1 (en) | System and method to determine if a device is worn on a subject's dominant limb | |
| JP2020010881A (en) | Detection apparatus, wearable sensing device, detection method, and program | |
| EP4637548A1 (en) | Establishing optimal aggregation of data in signals generated in free-living scenarios | |
| WO2025178632A1 (en) | Biometric measurement application for measuring activity level |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15784160; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 15784160; Country of ref document: EP; Kind code of ref document: A1 |