US20230084356A1 - Context Aware Fall Detection Using a Mobile Device - Google Patents
- Publication number
- US20230084356A1 (application US 17/942,018)
- Authority
- US
- United States
- Prior art keywords
- user
- mobile device
- sensor data
- likelihood
- fallen
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/04—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
- G08B21/0407—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis
- G08B21/043—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis detecting an emergency event, e.g. a fall
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/04—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
- G08B21/0438—Sensor means for detecting
- G08B21/0446—Sensor means for detecting worn on the body to detect changes of posture, e.g. a fall, inclination, acceleration, gait
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/01—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
- G08B25/016—Personal emergency signalling and security systems
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/001—Alarm cancelling procedures or alarm forwarding decisions, e.g. based on absence of alarm confirmation
Definitions
- the disclosure relates to systems and methods for determining whether a user has fallen using a mobile device.
- a motion sensor is a device that measures the motion experienced by an object (e.g., the velocity or acceleration of the object with respect to time, the orientation or change in orientation of the object with respect to time, etc.).
- a mobile device e.g., a cellular phone, a smart phone, a tablet computer, a wearable electronic device such as a smart watch, etc.
- a mobile device can include one or more motion sensors that determine the motion experienced by the mobile device over a period of time. If the mobile device is worn by a user, the measurements obtained by the motion sensor can be used to determine the motion experienced by the user over the period of time.
- Systems, methods, devices and non-transitory, computer-readable media are disclosed for electronically determining whether a user has fallen using a mobile device.
- a method includes: receiving, by a mobile device, sensor data obtained by one or more sensors over a time period, where the one or more sensors are worn by a user; determining, by the mobile device, a context of the user based on the sensor data; obtaining, by the mobile device based on the context, a set of rules for processing the sensor data, where the set of rules is specific to the context; determining, by the mobile device, at least one of a likelihood that the user has fallen or a likelihood that the user requires assistance based on the sensor data and the set of rules; and generating, by the mobile device, one or more notifications based on at least one of the likelihood that the user has fallen or the likelihood that the user requires assistance.
- Implementations of this aspect can include one or more of the following features.
- the sensor data can include location data obtained by one or more location sensors of the mobile device.
- the sensor data can include acceleration data obtained by one or more acceleration sensors of the mobile device.
- the sensor data can include orientation data obtained by one or more orientation sensors of the mobile device.
- the context can correspond to the user bicycling during the time period.
- determining the likelihood that the user has fallen and/or the likelihood that the user requires assistance can include: determining, based on the sensor data, that a distance traveled by the user over the period of time is greater than a first threshold value; determining, based on the sensor data, that a variation in a direction of impacts experienced by the user over the period of time is less than a second threshold value; determining, based on the sensor data, that a rotation of the user's wrist over the period of time is less than a third threshold value; and determining that the user has fallen and/or requires assistance based on the determination that the distance traveled by the user over the period of time is greater than the first threshold value, the determination that the variation in a direction of impacts experienced by the user over the period of time is less than the second threshold value, and the determination that the rotation of the user's wrist over the period of time is less than the third threshold value.
- determining the likelihood that the user has fallen and/or the likelihood that the user requires assistance can include: determining, based on the sensor data, that a magnitude of an impact experienced by the user over the period of time in a first direction is greater than a first threshold value; and determining that the user has fallen and/or requires assistance based on the determination that the magnitude of the impact experienced by the user over the period of time in the first direction is greater than the first threshold value.
- determining the likelihood that the user has fallen and/or the likelihood that the user requires assistance can include: determining, based on the sensor data, that a change in an orientation of the user's hand over the period of time is greater than a first threshold value; determining, based on the sensor data, that a magnitude of an impact experienced by the user over the period of time in a first direction is greater than a second threshold value, where the first direction is orthogonal to the second direction; determining, based on the sensor data, that a magnitude of an impact experienced by the user over the period of time in a second direction is greater than a third threshold value; and determining that the user has fallen and/or requires assistance based on the determination that the change in an orientation of the user's hand over the period of time is greater than the first threshold value, the determination that the magnitude of the impact experienced by the user over the period of time in the first direction is greater than the second threshold value, and the determination that the magnitude of the impact experienced by the user over the period of time in the second direction is greater than the third threshold value.
- the method can further include: receiving, by the mobile device, second sensor data obtained by the one or more sensors over a second time period; determining, by the mobile device, a second context of the user based on the second sensor data; obtaining, by the mobile device based on the second context, a second set of rules for processing the sensor data, where the second set of rules is specific to the second context; determining, by the mobile device, at least one of a likelihood that the user has fallen or a likelihood that the user requires assistance based on the second sensor data and the second set of rules; and generating, by the mobile device, one or more second notifications based on at least one of the likelihood that the user has fallen or the likelihood that the user requires assistance.
- the second context can correspond to the user walking during the second time period.
- the second context can correspond to the user playing at least one of basketball or volleyball during the second time period.
- generating the one or more notifications can include: transmitting a first notification to a communications device remote from the mobile device, the first notification including an indication that the user has fallen.
- the communications device can be an emergency response system.
- the mobile device can be a wearable mobile device.
- At least some of the one or more sensors can be disposed on or in the mobile device.
- At least some of the one or more sensors can be remote from the mobile device.
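The biking-context rule set recited above, three threshold comparisons that must all hold, can be sketched as follows. The `BikingWindow` type and all numeric threshold values are hypothetical, since the disclosure leaves the first, second, and third threshold values unspecified:

```python
from dataclasses import dataclass

@dataclass
class BikingWindow:
    """Hypothetical summary of sensor data over the evaluated time period."""
    distance_traveled_m: float   # distance traveled prior to the candidate impact
    impact_direction_var: float  # variation in the direction of impacts
    wrist_rotation_rad: float    # rotation of the user's wrist

# Placeholder thresholds; the disclosure does not give numeric values.
FIRST_THRESHOLD = 500.0   # distance, meters
SECOND_THRESHOLD = 0.2    # impact-direction variation
THIRD_THRESHOLD = 0.5     # wrist rotation, radians

def likely_fall_while_biking(w: BikingWindow) -> bool:
    """All three biking-specific conditions must hold simultaneously."""
    return (w.distance_traveled_m > FIRST_THRESHOLD
            and w.impact_direction_var < SECOND_THRESHOLD
            and w.wrist_rotation_rad < THIRD_THRESHOLD)
```

The conjunction of the three conditions reflects the claim language, in which the fall determination is based on all three determinations together.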
- FIG. 1 is a diagram of an example system for determining whether a user has fallen and/or may be in need of assistance.
- FIG. 2 A is a diagram showing an example position of a mobile device on a user's body.
- FIG. 2 B is a diagram showing example directional axes with respect to a mobile device.
- FIG. 3 is a diagram of an example state machine for determining whether a user has fallen and/or requires assistance.
- FIGS. 4 A and 4 B are diagrams of example sensor data obtained by a mobile device.
- FIG. 5 is a diagram of an example bicycle and a user wearing a mobile device.
- FIGS. 6 A and 6 B are diagrams of additional example sensor data obtained by a mobile device.
- FIG. 7 is a diagram of another example bicycle and a user wearing a mobile device.
- FIG. 8 is a flow chart diagram of an example process for generating and transmitting notifications.
- FIGS. 9 A- 9 C are diagrams of example alert notifications generated by a mobile device.
- FIG. 10 is a flow chart diagram of an example process for determining whether a user has fallen and/or requires assistance.
- FIG. 11 is a block diagram of an example architecture for implementing the features and processes described in reference to FIGS. 1 - 10 .
- FIG. 1 shows an example system 100 for determining whether a user has fallen and/or may be in need of assistance.
- the system 100 includes a mobile device 102 , a server computer system 104 , communications devices 106 , and a network 108 .
- the implementations described herein enable the system 100 to determine whether a user has fallen and/or whether the user may be in need of assistance more accurately, such that resources can be more effectively used. For instance, the system 100 can determine whether the user has fallen and/or whether the user may be in need of assistance with fewer false positives. Thus, the system 100 is less likely to consume computational and/or network resources to generate and transmit notifications to others when the user does not need assistance. Further, medical and logistical resources can be deployed to assist a user with a greater degree of confidence that they are needed, thereby reducing the likelihood of waste. Accordingly, resources can be consumed more efficiently, and in a manner that increases the effective response capacity of one or more systems (e.g., a computer system, a communications system, and/or an emergency response system).
- the mobile device 102 can be any portable electronic device for receiving, processing, and/or transmitting data, including but not limited to cellular phones, smart phones, tablet computers, wearable computers (e.g., smart watches), and the like.
- the mobile device 102 is communicatively connected to server computer system 104 and/or the communications devices 106 using the network 108 .
- the server computer system 104 is communicatively connected to mobile device 102 and/or the communications devices 106 using the network 108 .
- the server computer system 104 is illustrated as a respective single component. However, in practice, it can be implemented on one or more computing devices (e.g., each computing device including at least one processor such as a microprocessor or microcontroller).
- a server computer system 104 can be, for instance, a single computing device that is connected to the network 108 .
- the server computer system 104 can include multiple computing devices that are connected to the network 108 .
- the server computer system 104 need not be located locally to the rest of the system 100 , and portions of a server computer system 104 can be located in one or more remote physical locations.
- a communications device 106 can be any device that is used to transmit and/or receive information transmitted across the network 108 .
- Examples of the communications devices 106 include computers (such as desktop computers, notebook computers, server systems, etc.), mobile devices (such as cellular phones, smartphones, tablets, personal data assistants, notebook computers with networking capability), telephones, faxes, and other devices capable of transmitting and receiving data from the network 108 .
- the communications devices 106 can include devices that operate using one or more operating systems (e.g., Apple iOS, Apple watchOS, Apple macOS, Microsoft Windows, Linux, Unix, Android, etc.) and/or architectures (e.g., x86, PowerPC, ARM, etc.). In some implementations, one or more of the communications devices 106 need not be located locally with respect to the rest of the system 100 , and one or more of the communications devices 106 can be located in one or more remote physical locations.
- the network 108 can be any communications network through which data can be transferred and shared.
- the network 108 can be a local area network (LAN) or a wide-area network (WAN), such as the Internet.
- the network 108 can be a telephone or cellular communications network.
- the network 108 can be implemented using various networking interfaces, for instance wireless networking interfaces (such as Wi-Fi, Bluetooth, or infrared) or wired networking interfaces (such as Ethernet or serial connection).
- the network 108 also can include combinations of more than one network, and can be implemented using one or more networking interfaces.
- a user 110 can position the mobile device 102 on her body, and go about her daily life.
- the mobile device 102 can be a wearable electronic device or wearable computer (e.g., a smart watch), that is secured to a wrist 202 of the user 110 .
- the mobile device 102 can be secured to the user 110 , for example, through a band or strap 204 that encircles the wrist 202 .
- the orientation of the mobile device 102 can differ, depending on the location at which it is placed on the user's body and the user's positioning of her body.
- the orientation 206 of the mobile device 102 is shown in FIG. 2 A .
- the orientation 206 can refer, for example, to a vector projecting from a front edge of the mobile device 102 (e.g., the y-axis shown in FIG. 2 B ).
- the mobile device 102 can be any portable electronic device for receiving, processing, and/or transmitting data, including but not limited to cellular phones, smart phones, tablet computers, wearable computers (e.g., smart watches), and the like.
- the mobile device 102 can be implemented according to the architecture shown and described with respect to FIG. 11 .
- the mobile device 102 can be positioned on other locations of a user's body (e.g., arm, shoulder, leg, hip, head, abdomen, hand, foot, or any other location).
- a user 110 positions the mobile device 102 on her body, and goes about her daily life. This can include, for example, walking, running, bicycling, sitting, laying down, participating in a sport or athletic activity (e.g., basketball, volleyball, etc.), or any other physical activity.
- the mobile device 102 collects sensor data regarding movement of the mobile device 102 , an orientation of the mobile device 102 , and/or other dynamic properties of the mobile device 102 and/or the user 110 .
- using the motion sensors 310 (e.g., one or more accelerometers), the mobile device 102 can measure an acceleration experienced by the motion sensors 310 , and correspondingly, the acceleration experienced by the mobile device 102 .
- using the motion sensors 310 (e.g., one or more compasses, gyroscopes, inertial measurement units, etc.), the mobile device 102 can measure an orientation of the motion sensors 310 , and correspondingly, an orientation of the mobile device 102 .
- the motion sensors 310 can collect data continuously or periodically over a period of time or in response to a trigger event.
- the motion sensors 310 can collect motion data with respect to one or more specific directions relative to the orientation of the mobile device 102 .
- the motion sensors 310 can collect sensor data regarding an acceleration of the mobile device 102 with respect to the x-axis (e.g., a vector projecting from a side edge of the mobile device 102 , as shown in FIG. 2 B ), the y-axis (e.g., a vector projecting from a front edge of the mobile device 102 , as shown in FIG. 2 B ) and/or the z-axis (e.g., a vector projecting from a top surface or screen of the mobile device 102 , as shown in FIG. 2 B ), where the x-axis, y-axis, and z-axis refer to a Cartesian coordinate system in a frame of reference fixed to the mobile device 102 (e.g., a “body” frame).
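As a rough illustration of working with these body-frame axes, the following sketch computes the peak absolute acceleration along each axis and the peak overall magnitude from triaxial samples. The function name and sample format are illustrative, not from the disclosure:

```python
import math

def peak_impacts(samples):
    """Given accelerometer samples as (ax, ay, az) tuples in the device body
    frame (x: side edge, y: front edge, z: top surface/screen), return the
    peak absolute acceleration along each axis and the peak overall magnitude."""
    peak_x = max(abs(ax) for ax, _, _ in samples)
    peak_y = max(abs(ay) for _, ay, _ in samples)
    peak_z = max(abs(az) for _, _, az in samples)
    peak_mag = max(math.sqrt(ax * ax + ay * ay + az * az)
                   for ax, ay, az in samples)
    return peak_x, peak_y, peak_z, peak_mag
```

Per-axis peaks of this sort are the kind of quantity the direction-specific impact thresholds described elsewhere in the disclosure could be compared against.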
- the system 100 determines whether the user 110 has fallen, and if so, whether the user 110 may be in need of assistance.
- the user 110 may stumble and fall to the ground. Further, after falling, the user 110 may be unable to stand again on her own and/or may have suffered from an injury as a result of the fall. Thus, she may be in need of assistance, such as physical assistance in standing and/or recovering from the fall, medical attention to treat injuries sustained in the fall, or other help.
- the system 100 can automatically notify others of the situation.
- the mobile device 102 can generate and transmit a notification to one or more of the communications devices 106 to notify one or more users 112 (e.g., caretakers, physicians, medical responders, emergency contact persons, etc.) of the situation, such that they can take action.
- the mobile device 102 can generate and transmit a notification to one or more bystanders in proximity to the user (e.g., by broadcasting a visual and/or auditory alert), such that they can take action.
- the mobile device 102 can generate and transmit a notification to the server computer system 104 (e.g., to relay the notification to others and/or to store the information for future analysis).
- assistance can be rendered to the user 110 more quickly and effectively.
- the system 100 can determine that the user 110 has experienced an external force, but has not fallen and is not in need of assistance.
- the user 110 may experience vibrations and/or jostling while riding a bicycle (e.g., due to roughness of a road or trail surface), but has not fallen and can continue biking without assistance from others.
- the user 110 may have experienced impacts during an athletic activity (e.g., bumped by another user while playing basketball, struck a ball or the ground while playing volleyball, etc.), but has not fallen due to the impact and is able to recover without assistance from others. Accordingly, the system 100 can refrain from generating and transmitting a notification to others.
- the system 100 can determine that the user 110 has fallen, but that the user is not in need of assistance.
- the user 110 may have fallen as a part of an athletic activity (e.g., fallen while biking), but is able to recover without assistance from others. Accordingly, the system 100 can refrain from generating a notification and/or transmitting a notification to others.
- the system 100 can make these determinations based on sensor data obtained before, during, and/or after an impact experienced by the user 110 .
- the mobile device 102 can collect sensor data (e.g., acceleration data, orientation data, location data, etc.), and the system 100 can use the sensor data to identify a point in time at which the user experienced an impact.
- the system 100 can analyze the sensor data obtained during the impact, prior to the impact, and/or after the impact to determine whether the user has fallen, and if so, whether the user may be in need of assistance.
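One simple way to obtain the before/during/after windows described above is to locate the sample with the peak acceleration magnitude (the presumed impact) and slice around it. This is a sketch under that assumption; the window sizes and function name are hypothetical:

```python
def split_around_impact(accel_mags, pre=50, post=50):
    """Find the sample with the largest acceleration magnitude (the presumed
    impact) and return the samples before it, the peak itself, and the
    samples after it, for separate pre-, during-, and post-impact analysis."""
    i = max(range(len(accel_mags)), key=lambda k: accel_mags[k])
    before = accel_mags[max(0, i - pre):i]
    after = accel_mags[i + 1:i + 1 + post]
    return before, accel_mags[i], after
```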
- the system 100 can make these determinations based on contextual information, such as the activity that the user was performing at or around the time the user experienced an impact or other force. This can be beneficial, for example, in improving the accuracy and/or sensitivity by which the system 100 can detect falls.
- the system 100 can determine whether a user has fallen (and whether the user is in need of assistance) using different sets of rules or criteria, depending on the activity that the user was performing at or around the time that she experienced an impact or other force.
- the system 100 can determine that the user was performing a first activity (e.g., walking) and determine whether a user has fallen based on a first set of rules or criteria specific to that first activity.
- the system 100 can determine that the user was performing a second activity (e.g., biking) and determine whether a user has fallen based on a second set of rules or criteria specific to that second activity.
- the system 100 can determine that the user was performing a third activity (e.g., playing basketball) and determine whether a user has fallen based on a third set of rules or criteria specific to that third activity.
- Each set of rules or criteria can be specifically tailored to its corresponding activity, such that false positives and/or false negatives are reduced.
- the system 100 can utilize a first set of rules or criteria by default (e.g., a default set of rules or criteria for determining whether a user has fallen). Upon determining that the user is performing a particular activity, the system 100 can utilize a set of rules or criteria that is specific to that activity. Further, upon determining that the user has ceased performing that activity, the system 100 can revert to the first set of rules or criteria.
- the system 100 can utilize a default set of rules or criteria for detecting whether the user has fallen during frequent day-to-day activities, such as walking, climbing stairs, etc.
- the system 100 can utilize a specialized set of rules or criteria that are specific to detecting whether the user has fallen while biking.
- the system 100 can utilize another specialized set of rules or criteria that are specific to detecting whether the user has fallen while participating in that activity.
- the system 100 can revert to using the default set of rules or criteria for determining whether the user has fallen.
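The default-with-specialized-overrides behavior described above can be sketched as a lookup with a fallback. The rule-set identifiers and activity labels are placeholders, since the disclosure describes the rule sets only abstractly:

```python
# Hypothetical rule-set identifiers.
DEFAULT_RULES = "default"
RULES_BY_ACTIVITY = {
    "biking": "biking-specific",
    "basketball": "high-impact",
    "volleyball": "high-impact",
}

def select_rules(activity: str) -> str:
    """Use an activity-specific rule set when one exists; otherwise fall
    back to the default rules (e.g., for walking or climbing stairs)."""
    return RULES_BY_ACTIVITY.get(activity, DEFAULT_RULES)
```

Reverting to the default rules when the specialized activity ends is implicit here: once the detected activity is no longer a key in the map, the fallback applies.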
- the system 100 can determine whether a user has fallen (and whether the user is in need of assistance) using a state machine having several states, where each state corresponds to a different type of activity and a different corresponding set of criteria.
- An example state machine 300 is shown in FIG. 3 .
- the state machine includes three states 302 a - 302 c , each corresponding to a different type of activity, and each being associated with a different set of rules or criteria for determining whether the user has fallen and/or whether the user is in need of assistance.
- first state 302 a can correspond to a default activity. Further, first state 302 a can be associated with a default set of rules or criteria for determining whether a user has fallen and/or whether the user is in need of assistance. In some implementations, the default activity can correspond to one or more of walking, jogging, running, standing, and/or sitting.
- the second state 302 b can correspond to a biking activity. Further, the second state 302 b can be associated with a set of rules or criteria for determining whether a user has fallen and/or whether the user is in need of assistance, specifically in the context of biking.
- the third state 302 c can correspond to an activity in which the user commonly experiences large impacts (e.g., volleyball, basketball, etc.). Further, the third state 302 c can be associated with a set of rules or criteria for determining whether a user has fallen and/or whether the user is in need of assistance, specifically in the context of high impact activities.
- the system 100 is initially set to a default state (e.g., the first state 302 a ) and determines whether a user has fallen and/or whether the user is in need of assistance based on the default set of rules or criteria that is associated with that state.
- the system 100 Upon determining that the user is performing a different activity, the system 100 transitions to the state corresponding to that activity, and determines whether a user has fallen and/or whether the user is in need of assistance based on the set of rules or criteria that is associated with that new state.
- the system 100 can transition from the first state 302 a to the second state 302 b , and can determine whether a user has fallen and/or whether the user is in need of assistance based on the set of rules or criteria that is associated with the second state 302 b.
- the system 100 can transition from the second state 302 b to the third state 302 c , and can determine whether a user has fallen and/or whether the user is in need of assistance based on the set of rules or criteria that is associated with the third state 302 c.
- the system 100 Upon determining that the user is no longer performing an specialized activity (e.g., an activity that is not associated with a state other than the default first state 302 a ), the system 100 transitions back to the default first state 302 a , and determines whether a user has fallen and/or whether the user is in need of assistance based on the default set of rules or criteria that is associated with that state.
- although the state machine 300 shown in FIG. 3 includes three states, this is merely an illustrative example. In practice, a state machine can include any number of states corresponding to any number of activities (and in turn, any number of different sets of rules or criteria).
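A minimal sketch of such a state machine, with three states corresponding to those described above; the state names and activity labels are illustrative, not from the disclosure:

```python
class FallDetectionStateMachine:
    """Tracks which rule set is active as the detected activity changes."""
    DEFAULT = "default"          # cf. the first state 302a
    BIKING = "biking"            # cf. the second state 302b
    HIGH_IMPACT = "high-impact"  # cf. the third state 302c

    # Activities with a specialized state; everything else uses the default.
    _ACTIVITY_TO_STATE = {
        "biking": BIKING,
        "basketball": HIGH_IMPACT,
        "volleyball": HIGH_IMPACT,
    }

    def __init__(self):
        self.state = self.DEFAULT

    def observe_activity(self, activity: str) -> str:
        # Any activity without a specialized state reverts the machine
        # to the default state and its default rule set.
        self.state = self._ACTIVITY_TO_STATE.get(activity, self.DEFAULT)
        return self.state
```

For example, detecting biking moves the machine from the default state to the biking state, and detecting walking afterwards reverts it to the default state, mirroring the transitions described above.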
- the system 100 can determine the type of activity being performed by a user based on sensor data obtained by the mobile device 102 , such as location data, acceleration data, and/or orientation data.
- each type of activity may be identified by detecting certain characteristics or combinations of characteristics of the sensor data that are indicative of that type of activity.
- a first type of activity may correspond to sensor data having a first set of characteristics
- a second type of activity may correspond to sensor data having a second set of characteristics
- a third type of activity may correspond to sensor data having a third set of characteristics, and so forth.
- the system 100 can identify the type of activity being performed by a user by obtaining sensor data from the mobile device 102 , and determining that the sensor data exhibits a particular set of characteristics.
- the system 100 can determine whether the user is biking based on the distance that a user traveled and/or the speed at which the user traveled prior to the impact (e.g., based on output from a location sensor, such as a GPS sensor). For example, a greater distance and/or a higher speed (e.g., greater than certain threshold values) may indicate that the user is biking, whereas a shorter distance and/or a lower speed (e.g., less than certain threshold values) may indicate that the user is walking.
- the system 100 can determine whether the user is biking based on sensor measurements from an accelerometer and/or orientation sensor (e.g., gyroscope) of the mobile device 102 .
- a user might experience certain types of impacts and/or change the orientation of her body (e.g., her wrist) in certain ways while biking, and experience different types of impacts and/or change the orientation of her body in different ways while walking.
- the system 100 can determine whether the user is performing an activity in which a user commonly experiences large impacts (e.g., volleyball, basketball, etc.) based on sensor measurements from an accelerometer and/or orientation sensor (e.g., gyroscope) of the mobile device 102 . For example, when a user plays volleyball, she may commonly move her arm or wrist (to which the mobile device 102 is attached) according to a distinctive pattern. The system 100 can determine, based on the sensor data, whether the user is moving her arm or wrist according to that pattern, and if so, determine that the user is playing volleyball.
- the system 100 can determine whether the user is performing a particular activity based on manual user input. For example, prior to or during the performance of an activity, a user can manually identify that activity to the mobile device 102 and/or system 100 . For example, prior to biking, a user can input data (e.g., to the mobile device 102 ) indicating that she is about to go biking. Based on the user input, the system 100 can determine that the user will be biking. In some implementations, a user can provide input to a mobile device 102 by selecting a particular activity (e.g., from a list or menu of candidate activities). In some implementations, a user can provide input to a mobile device 102 by selecting a particular application or feature of the mobile device 102 that is specific to or otherwise associated with that activity (e.g., an exercise application or feature).
- the system 100 can utilize a context-specific set of rules or criteria for determining whether a user has fallen (and whether the user is in need of assistance) while the user performs certain activities, such as biking.
- the context-specific sets of rules or criteria can pertain to sensor data obtained by the mobile device 102 worn by the user.
- the sets of rules or criteria can pertain to location data obtained by one or more location sensors (e.g., one or more GPS sensors), acceleration data (e.g., impact data) obtained by one or more accelerometers, and/or orientation data obtained by one or more orientation sensors (e.g., gyroscopes, inertial measurement units, etc.).
- a mobile device 102 can be worn by a user on her wrist while biking. Further, the mobile device 102 can obtain sensor data representing the orientation of the mobile device 102 (and correspondingly, the orientation of the user's wrist or arm) and the acceleration experienced by the mobile device (e.g., representing movements of the user's wrist or arm) prior to, during, and after an impact.
- sensor measurements indicating that the user has (i) changed the orientation of her wrist by a large degree (e.g., greater than a threshold amount) and (ii) moved her wrist or arm by a large degree (e.g., greater than a threshold amount) may be indicative that the user has fallen.
- sensor measurements indicating that the user has (i) changed the orientation of her wrist by a small degree (e.g., not greater than a threshold amount) and (ii) moved her wrist or arm by a large degree (e.g., greater than a threshold amount) may be indicative that the user is biking on rough terrain but has not fallen.
- sensor measurements indicating that the user has (i) changed the orientation of her wrist by a large degree (e.g., greater than a threshold amount) and (ii) moved her wrist or arm by a small degree (e.g., not greater than a threshold amount) may be indicative that the user is signaling or performing a gesture, and has not fallen.
- sensor measurements indicating that the user has (i) changed the orientation of her wrist by a small degree (e.g., not greater than a threshold amount) and (ii) moved her wrist or arm by a small degree (e.g., not greater than a threshold amount) may be indicative that the user is static and has not fallen.
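The four wrist-orientation/movement cases above form a simple two-by-two decision table, which can be sketched as follows. The threshold values and return labels are hypothetical, chosen only to illustrate the structure of the rules.

```python
# Illustrative decision table for the four cases described above.
# Thresholds are hypothetical tunable values.

def classify_wrist_event(orientation_change_deg, movement_deg,
                         orientation_threshold_deg=60.0,
                         movement_threshold_deg=45.0):
    """Map the (orientation change, movement) pair onto the four cases."""
    large_rotation = orientation_change_deg > orientation_threshold_deg
    large_movement = movement_deg > movement_threshold_deg
    if large_rotation and large_movement:
        return "possible fall"
    if large_movement:            # small rotation, large movement
        return "rough terrain, no fall"
    if large_rotation:            # large rotation, small movement
        return "signaling or gesture, no fall"
    return "static, no fall"     # small rotation, small movement
```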
- sensor measurements indicating that the user (i) has traveled a large distance (e.g., greater than a threshold distance) prior to an impact, (ii) experienced highly directional impacts over time (e.g., a variation, spread, or range of impact directions that is less than a threshold level), and (iii) rotated her wrist a small amount (e.g., less than a threshold amount) may indicate that the user is biking normally, and has not fallen.
- sensor measurements indicating that the user (i) has traveled a short distance (e.g., less than a threshold distance) after an impact, (ii) experienced impacts with respect to a wide range of directions over time (e.g., a variation, spread, or range of impact directions that is greater than a threshold level), and (iii) rotated her wrist a large amount (e.g., greater than a threshold amount) may indicate that the user has fallen while biking.
- FIG. 4 A shows sensor data 400 representing the orientation of a mobile device that is worn on a user's wrist while bicycling, measured over a 4 second time window (e.g., extending from two seconds prior to the user experiencing an impact at time 0, until two seconds after the user experiencing the impact).
- the orientation of the mobile device (and in turn, the orientation of the user's hand and/or wrist) is relatively stable during the time prior to the impact.
- the orientation of the mobile device exhibits a large angular change over a short time interval (e.g., approximately 0.1 second). Further, the orientation of the mobile device exhibits a large angular change over the entire time window.
- a system 100 can determine that the user has fallen from her bicycle if (i) the angular change in the orientation of the mobile device over the time window (e.g., a 4 second window) is greater than a first threshold amount θ 1 , and (ii) the angular change in the orientation of the mobile device over a subset of that time window (e.g., a 0.1 second subset of the 4 second time window) is greater than a second threshold amount θ 2 . Otherwise, the system 100 can determine that the user has not fallen from her bicycle.
- FIG. 4 B shows additional sensor data 450 representing the orientation of a mobile device that is worn on a user's wrist while bicycling, measured over a 4 second time window (e.g., extending from two seconds prior to the user experiencing an impact at time 0, until two seconds after the user experiencing the impact).
- the orientation of the mobile device is relatively stable during the entirety of the time window.
- a system 100 can determine that the user has not fallen from her bicycle if (i) the angular change in the orientation of the mobile device over the time window (e.g., a 4 second window) is not greater than a first threshold amount θ 1 , and/or (ii) the angular change in the orientation of the mobile device over a subset of that time window (e.g., a 0.1 second subset of the 4 second time window) is not greater than a second threshold amount θ 2 .
- the time window, the subset of the time window, and the threshold amounts can differ, depending on the implementation.
- the time window, the subset of the time window, and the threshold amounts can be tunable values that are selected based on experimental studies of the characteristics of users' movements while riding bicycles.
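The two-threshold angular test illustrated by FIGS. 4 A and 4 B can be sketched as below: a fall is indicated only when the angular change over the whole window exceeds a first threshold AND the change within some short sub-window exceeds a second threshold. The window sizes and thresholds are hypothetical tunable values, not values from the patent.

```python
# Illustrative two-threshold angular-change test (FIGS. 4A/4B discussion).
# theta1/theta2 and the window sizes are hypothetical tunable values.

def fell_from_bicycle(orientation_deg, sample_rate_hz=100,
                      theta1_deg=90.0, theta2_deg=30.0, subwindow_s=0.1):
    """Return True if (i) the angular change over the whole window exceeds
    theta1 and (ii) the change within some short sub-window exceeds theta2."""
    total_change = abs(orientation_deg[-1] - orientation_deg[0])
    if total_change <= theta1_deg:
        return False
    n = max(1, int(subwindow_s * sample_rate_hz))  # samples per sub-window
    return any(abs(orientation_deg[i + n] - orientation_deg[i]) > theta2_deg
               for i in range(len(orientation_deg) - n))
```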
- a system 100 can determine that a user has fallen while biking upon receiving sensor measurements indicating that the user (i) experienced vibrations that are characteristic of bicycling prior to the impact, and (ii) has not experienced vibrations that are characteristic of bicycling within a particular time interval after the impact (e.g., within a threshold time interval T).
- the system 100 can determine that a user has not fallen upon receiving sensor measurements indicating that the user (i) experienced vibrations that are characteristic of bicycling prior to the impact, and (ii) has again experienced vibrations that are characteristic of bicycling within the particular time interval after the impact (e.g., within the threshold time interval T).
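The vibration-resumption rule above can be sketched as a small check: bicycling-characteristic vibration present before the impact, and absent for the threshold interval T afterward, suggests a fall. The value of T and the input representation are hypothetical.

```python
# Illustrative sketch of the vibration-resumption rule. The threshold T
# (t_threshold_s) and input representation are hypothetical.

def biking_fall_suspected(vibration_before_impact, vibration_times_after_s,
                          t_threshold_s=10.0):
    """Suspect a fall when bicycling-characteristic vibration was present
    before the impact but does not reappear within t_threshold_s afterward.
    Times in vibration_times_after_s are measured from the impact."""
    if not vibration_before_impact:
        return False  # user was not exhibiting bicycling vibration at all
    resumed = any(0.0 < t <= t_threshold_s for t in vibration_times_after_s)
    return not resumed
```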
- a user may orient her wrist differently, depending on the configuration of her bicycle's handlebars.
- the system 100 can infer the configuration of the handlebars, and apply different sets of rules or criteria for each configuration.
- FIG. 5 shows an example bicycle 502 having horizontal (or approximately horizontal) handlebars 504 .
- the user 110 is wearing the mobile device 102 on one of her wrists, and is grasping the handlebars 504 with her hands.
- the x-axis and y-axis of the mobile device 102 are shown extending from the mobile device 102 (e.g., the y-direction extends along (or approximately along) the handlebars 504 , the x-direction extends along (or approximately along) the user's arm, and the z-direction (not shown) extends perpendicular to a face of the mobile device 102 ).
- Sensor measurements indicating that the user experienced a high intensity impact (e.g., greater than a threshold level) in the y-direction may indicate that the user has fallen while biking.
- sensor measurements indicating that the user experienced a low intensity impact (e.g., less than the threshold level) in the y-direction may indicate that the user is biking normally, and has not fallen.
- FIG. 6 A shows sensor data 600 representing the acceleration of a mobile device that is worn on a user's wrist while bicycling, measured in the x-direction and the y-direction over a 1.2 second time window (e.g., extending from 0.6 seconds prior to the user experiencing an impact at time 0, until 0.6 seconds after the user experiencing the impact).
- the mobile device (and in turn, the user) experienced a high intensity impact in both the x-direction and y-direction (e.g., above a threshold intensity level), which may be characteristic of the user falling.
- FIG. 6 B shows sensor data 620 representing the acceleration of a mobile device that is worn on a user's wrist while bicycling, measured in the x-direction and y-direction over a 1.2 second time window (e.g., extending from 0.6 seconds prior to the user experiencing an impact at time 0, until 0.6 seconds after the user experiencing the impact).
- the mobile device experienced a high intensity impact in the x-direction (e.g., in the direction along the user's arm).
- the mobile device (and in turn, the user) did not experience a high intensity impact in the y-direction (e.g., in a direction along the handlebars). This may be indicative of the user not falling.
- a system 100 can determine that the user has fallen from her bicycle if (i) the intensity of the impact experienced in the x-direction is greater than a first threshold amount I 1 , and (ii) the intensity of the impact experienced in the y-direction is greater than a second threshold amount I 2 . Otherwise, the system 100 can determine that the user has not fallen.
- the threshold amounts can differ, depending on the implementation. For example, the threshold amounts can be tunable values that are selected based on experimental studies of the characteristics of users' movements while riding bicycles.
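For the horizontal-handlebar configuration (FIGS. 5-6), the two-axis intensity rule above reduces to a conjunction of threshold tests. The sketch below is illustrative; the threshold values I1 and I2 are hypothetical.

```python
# Illustrative sketch of the horizontal-handlebar impact rule (FIGS. 5-6).
# i1_g and i2_g stand in for the thresholds I1 and I2 and are hypothetical.

def fell_with_horizontal_bars(impact_x_g, impact_y_g, i1_g=4.0, i2_g=3.0):
    """A fall is indicated only when the impact intensity along BOTH the
    x-direction (along the arm) and the y-direction (along the handlebars)
    exceeds its threshold."""
    return impact_x_g > i1_g and impact_y_g > i2_g
```

An impact concentrated along the arm alone (as in FIG. 6 B) therefore does not indicate a fall.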
- FIG. 7 shows another example bicycle 702 having vertical (or approximately vertical) handlebars 704 .
- the user 110 is wearing the mobile device 102 on one of her wrists, and is grasping the handlebars 704 with her hands.
- the x-axis and y-axis of the mobile device 102 are shown extending from the mobile device 102 (e.g., the y-direction extends along (or approximately along) the handlebars 704 , the x-direction extends along (or approximately along) the user's arm, and the z-direction (not shown) extends perpendicular to a face of the mobile device 102 ).
- Sensor measurements indicating that the user (i) has moved her hand chaotically, (ii) experienced a high intensity impact (e.g., greater than a first threshold level I 1 ) in the y-direction, and (iii) experienced a high intensity impact (e.g., greater than a second threshold level I 2 ) in the z-direction may indicate that the user has fallen while biking.
- sensor measurements indicating that the user (i) has maintained her hand in a stable vertical orientation, (ii) experienced a high intensity impact (e.g., greater than the first threshold level I 1 ) in the y-direction, and (iii) experienced a low intensity impact (e.g., lower than the second threshold level I 2 ) in the z-direction may indicate that the user is biking normally, and has not fallen.
- a system 100 can determine that a user has fallen while biking upon receiving sensor measurements indicating that (i) the variation, spread, or range of the directions of orientation of the mobile device 102 is greater than a threshold level (e.g., indicative of chaotic movement by the user), (ii) the mobile device experienced a high intensity impact (e.g., greater than the first threshold level I 1 ) in the y-direction, and (iii) the mobile device experienced a high intensity impact (e.g., greater than the second threshold level I 2 ) in the z-direction.
- a system 100 can determine that the user has maintained her hand in a stable vertical orientation by determining that (i) the variation, spread, or range of the orientation of the mobile device 102 is not greater than a threshold level, and (ii) the angle between the y-direction of the mobile device 102 and the vertical direction is less than a threshold angle θ T . Further, upon additionally determining that (i) the mobile device experienced a high intensity impact (e.g., greater than the first threshold level I 1 ) in the y-direction, and (ii) the mobile device experienced a low intensity impact (e.g., not greater than the second threshold level I 2 ) in the z-direction, the system 100 can determine that the user has not fallen while biking.
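For the vertical-handlebar configuration (FIG. 7), the rule combines an orientation-spread ("chaotic movement") test with two directional intensity tests. The sketch below is illustrative; the spread measure and all thresholds are hypothetical.

```python
# Illustrative sketch of the vertical-handlebar rule (FIG. 7 discussion).
# The spread measure (max - min) and all thresholds are hypothetical.

def fell_with_vertical_bars(orientation_samples_deg, impact_y_g, impact_z_g,
                            spread_threshold_deg=40.0, i1_g=4.0, i2_g=3.0):
    """A fall is indicated when the wrist orientation varies chaotically
    (spread above a threshold) AND high-intensity impacts occur in both
    the y- and z-directions."""
    spread = max(orientation_samples_deg) - min(orientation_samples_deg)
    chaotic = spread > spread_threshold_deg
    return chaotic and impact_y_g > i1_g and impact_z_g > i2_g
```

A stable, near-vertical wrist with a low z-direction impact then corresponds to normal biking, not a fall.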
- the mobile device 102 can generate and transmit a notification to one or more communications devices 106 to notify one or more users 112 (e.g., caretakers, physicians, medical responders, emergency contact persons, etc.) of the situation, such that they can take action.
- a notification can be generated and transmitted upon the satisfaction of certain criteria in order to reduce the occurrence of false positives.
- FIG. 8 shows an example process 800 for generating and transmitting a notification in response to a user falling.
- a system determines whether a user was biking before experiencing an impact (block 802 ).
- the system can make this determination based on sensor data obtained by a mobile device worn by the user (e.g., as described above).
- the system can detect whether a user has fallen using a default technique (block 850 ). For example, referring to FIG. 3 , the system can detect whether a user has fallen according to a default set of rules or criteria that are not specific to biking.
- the system determines whether the impact has characteristics of a biking fall (block 804 ).
- the system can make this determination based on sensor data obtained by a mobile device worn by the user (e.g., as described above).
- If the system determines that the impact does not have characteristics of a biking fall, the system refrains from generating and transmitting a notification (block 812 ).
- the system determines whether the user has stopped biking after the impact (block 806 ). The system can make this determination based on sensor data obtained by a mobile device worn by the user (e.g., as described above).
- If the system determines that the user has not stopped biking after the impact, the system refrains from generating and transmitting a notification (block 812 ).
- the system determines whether the user has remained sufficiently still for a period of time (e.g., a one minute time interval) after the impact (block 808 ).
- the system can make this determination based on sensor data obtained by a mobile device worn by the user (e.g., by determining whether the mobile device has moved more than a threshold distance, changed its orientation by more than a threshold angle, moved for a length of time greater than a threshold amount of time, etc.).
- If the system determines that the user has not remained sufficiently still for the period of time, the system refrains from generating and transmitting a notification (block 812 ).
- If the system determines that the user has remained sufficiently still for the period of time, the system generates and transmits a notification (block 810 ).
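The decision cascade of process 800 can be sketched as a sequence of guard conditions. The sketch below collapses each block's determination into a boolean input; the function name and return labels are illustrative only.

```python
# Illustrative sketch of the process 800 decision cascade (blocks 802-850).
# Each parameter stands in for one determination made from sensor data.

def process_800(was_biking, impact_like_biking_fall, stopped_biking,
                remained_still):
    """Return the action the system takes; string labels are illustrative."""
    if not was_biking:
        return "use default fall detection"      # block 850
    if not impact_like_biking_fall:
        return "refrain from notification"       # block 812
    if not stopped_biking:
        return "refrain from notification"       # block 812
    if not remained_still:
        return "refrain from notification"       # block 812
    return "generate and transmit notification"  # block 810
```

Only when every biking-specific criterion is satisfied does a notification go out, which is how the cascade reduces false positives.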
- the mobile device 102 can determine whether a user remains immobile after the fall for a particular time interval (e.g., 30 seconds). Upon determining that the user has remained immobile, the mobile device 102 can present an alert notification to the user, including an option to generate and transmit a notification (e.g., to an emergency responder) and an option to refrain from generating and transmitting a notification. An example of this alert notification is shown in FIG. 9 A .
- the mobile device 102 can present an alert notification to the user showing a count down, and indicating that a notification will be generated and transmitted upon expiration of the count down, absent input otherwise by the user.
- An example of this alert notification is shown in FIG. 9 B .
- Upon expiration of the count down without input from the user, the mobile device 102 generates and transmits a notification (e.g., as shown in FIG. 9 C ).
- This technique can be beneficial, for example, in further reducing the occurrence of false positives and reducing the likelihood that notifications are transmitted to others (e.g., emergency services) in error when the user does not actually require assistance.
- An example process 1000 for determining whether a user has fallen and/or may be in need of assistance using a mobile device is shown in FIG. 10 .
- the process 1000 can be performed, for example, using the mobile device 102 and/or the system 100 shown in FIGS. 1 and 2 .
- some or all of the process 1000 can be performed by a co-processor of the mobile device.
- the co-processor can be configured to receive motion data obtained from one or more sensors, process the motion data, and provide the processed motion data to one or more processors of the mobile device.
- a mobile device receives sensor data obtained by one or more sensors over a time period (block 1002 ).
- the one or more sensors are worn by a user.
- the mobile device can be a wearable mobile device, such as a smart watch.
- At least some of the one or more sensors can be disposed on or in the mobile device. In some implementations, at least some of the one or more sensors are remote from the mobile device.
- the mobile device can be a smart phone, and the sensors can be disposed on a smart watch that is communicatively coupled to the smart phone.
- the sensor data can include one or more types of data.
- the sensor data can include location data obtained by one or more location sensors of the mobile device.
- the sensor data can include acceleration data obtained by one or more acceleration sensors of the mobile device.
- the sensor data can include orientation data obtained by one or more orientation sensors of the mobile device.
- the mobile device determines a context of the user based on the sensor data (block 1004 ).
- the context can correspond to a type of activity performed by the user during the time period.
- Example contexts include bicycling, walking, running, jogging, playing a sport (e.g., basketball, volleyball, etc.), or any other activity that may be performed by a user.
- the mobile device obtains a set of rules for processing the sensor data based on the context (block 1006 ).
- the set of rules is specific to the context.
- the mobile device determines a likelihood that the user has fallen and/or a likelihood that the user requires assistance based on the sensor data and the set of rules (block 1008 ).
- the mobile device determines a likelihood that the user has fallen and/or a likelihood that the user requires assistance using sets of rules that are specific to the context.
- sets of rules for a bicycling context are described above.
- determining the likelihood that the user has fallen and/or the likelihood that the user requires assistance can include (i) determining, based on the sensor data, that a distance traveled by the user over the period of time is greater than a first threshold value, (ii) determining, based on the sensor data, that a variation in a direction of impacts experienced by the user over the period of time is less than a second threshold value, (iii) determining, based on the sensor data, that a rotation of the user's wrist over the period of time is less than a third threshold value, and (iv) determining that the user has fallen and/or requires assistance based on the determination that the distance traveled by the user over the period of time is greater than the first threshold value, the determination that the variation in a direction of impacts experienced by the user over the period of time is less than the second threshold value, and the determination that the rotation of the user's wrist over the period of time is less than the third threshold value.
- determining the likelihood that the user has fallen and/or the likelihood that the user requires assistance can include (i) determining, based on the sensor data, that a magnitude of an impact experienced by the user over the period of time in a first direction is greater than a first threshold value, and (ii) determining that the user has fallen and/or requires assistance based on the determination that the magnitude of the impact experienced by the user over the period of time in the first direction is greater than the first threshold value.
- determining the likelihood that the user has fallen and/or the likelihood that the user requires assistance can include (i) determining, based on the sensor data, that a change in an orientation of the user's hand over the period of time is greater than a first threshold value, (ii) determining, based on the sensor data, that a magnitude of an impact experienced by the user over the period of time in a first direction is greater than a second threshold value, wherein the first direction is orthogonal to a second direction, (iii) determining, based on the sensor data, that a magnitude of an impact experienced by the user over the period of time in the second direction is greater than a third threshold value, and (iv) determining that the user has fallen and/or requires assistance based on the determination that the change in an orientation of the user's hand over the period of time is greater than the first threshold value, the determination that the magnitude of the impact experienced by the user over the period of time in the first direction is greater than the second threshold value, and the determination that the magnitude of the impact experienced by the user over the period of time in the second direction is greater than the third threshold value.
- the mobile device generates one or more notifications based on the likelihood that the user has fallen and/or the likelihood that the user requires assistance (block 1010 ).
- generating the one or more notifications can include transmitting a first notification to a communications device remote from the mobile device.
- the first notification can include an indication that the user has fallen and/or an indication that the user requires assistance.
- the communications device can be an emergency response system.
- the mobile device can perform at least a portion of the process 1000 according to a different context of the user. For example, the mobile device can receive second sensor data obtained by the one or more sensors over a second time period. Further, the mobile device can determine a second context of the user based on the second sensor data, and obtain a second set of rules for processing the second sensor data based on the second context, where the second set of rules is specific to the second context. Further, the mobile device can determine at least one of a likelihood that the user has fallen and/or a likelihood that the user requires assistance based on the second sensor data and the second set of rules. Further, the mobile device can generate one or more second notifications based on at least one of the likelihood that the user has fallen or the likelihood that the user requires assistance.
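The overall flow of process 1000 (blocks 1004-1010) can be sketched as a pipeline: classify the context, look up the context-specific rules, estimate the likelihoods, then generate notifications. The callables below are caller-supplied stand-ins; their names are hypothetical.

```python
# Illustrative sketch of the process 1000 pipeline. The four callables are
# hypothetical stand-ins for blocks 1004-1010; none are from the patent.

def run_process_1000(sensor_data, classify_context, rules_for_context,
                     estimate_likelihoods, make_notifications):
    """Context -> rules -> likelihoods -> notifications."""
    context = classify_context(sensor_data)                      # block 1004
    rules = rules_for_context(context)                           # block 1006
    fall_p, assist_p = estimate_likelihoods(sensor_data, rules)  # block 1008
    return make_notifications(fall_p, assist_p)                  # block 1010
```

Because each stage is a separate callable, the same pipeline can be re-run with second sensor data and a second context, as described above.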
- FIG. 11 is a block diagram of an example device architecture 1100 for implementing the features and processes described in reference to FIGS. 1 - 10 .
- the architecture 1100 can be used to implement the mobile device 102 , the server computer system 104 , and/or one or more of the communications devices 106 .
- Architecture 1100 may be implemented in any device for generating the features described in reference to FIGS. 1 - 10 , including but not limited to desktop computers, server computers, portable computers, smart phones, tablet computers, game consoles, wearable computers, set top boxes, media players, smart TVs, and the like.
- the architecture 1100 can include a memory interface 1102 , one or more data processors 1104 , one or more data co-processors 1174 , and a peripherals interface 1106 .
- the memory interface 1102 , the processor(s) 1104 , the co-processor(s) 1174 , and/or the peripherals interface 1106 can be separate components or can be integrated in one or more integrated circuits.
- One or more communication buses or signal lines may couple the various components.
- the processor(s) 1104 and/or the co-processor(s) 1174 can operate in conjunction to perform the operations described herein.
- the processor(s) 1104 can include one or more central processing units (CPUs) that are configured to function as the primary computer processors for the architecture 1100 .
- the processor(s) 1104 can be configured to perform generalized data processing tasks of the architecture 1100 .
- at least some of the data processing tasks can be offloaded to the co-processor(s) 1174 .
- specialized data processing tasks such as processing motion data, processing image data, encrypting data, and/or performing certain types of arithmetic operations, can be offloaded to one or more specialized co-processor(s) 1174 for handling those tasks.
- the processor(s) 1104 can be relatively more powerful than the co-processor(s) 1174 and/or can consume more power than the co-processor(s) 1174 . This can be useful, for example, as it enables the processor(s) 1104 to handle generalized tasks quickly, while also offloading certain other tasks to co-processor(s) 1174 that may perform those tasks more efficiently and/or more effectively.
- a co-processor 1174 can include one or more sensors or other components (e.g., as described herein), and can be configured to process data obtained using those sensors or components, and provide the processed data to the processor(s) 1104 for further analysis.
- Sensors, devices, and subsystems can be coupled to peripherals interface 1106 to facilitate multiple functionalities.
- a motion sensor 1110 , a light sensor 1112 , and a proximity sensor 1114 can be coupled to the peripherals interface 1106 to facilitate orientation, lighting, and proximity functions of the architecture 1100 .
- a light sensor 1112 can be utilized to facilitate adjusting the brightness of a touch surface 1146 .
- a motion sensor 1110 can be utilized to detect movement and orientation of the device.
- the motion sensor 1110 can include one or more accelerometers (e.g., to measure the acceleration experienced by the motion sensor 1110 and/or the architecture 1100 over a period of time), and/or one or more compasses or gyros (e.g., to measure the orientation of the motion sensor 1110 and/or the mobile device).
- the measurement information obtained by the motion sensor 1110 can be in the form of one or more time-varying signals (e.g., a time-varying plot of an acceleration and/or an orientation over a period of time).
- display objects or media may be presented according to a detected orientation (e.g., according to a “portrait” orientation or a “landscape” orientation).
- a motion sensor 1110 can be directly integrated into a co-processor 1174 configured to process measurements obtained by the motion sensor 1110 .
- a co-processor 1174 can include one or more accelerometers, compasses, and/or gyroscopes, and can be configured to obtain sensor data from each of these sensors, process the sensor data, and transmit the processed data to the processor(s) 1104 for further analysis.
- the architecture 1100 can include a heart rate sensor 11112 that measures the beats of a user's heart.
- these other sensors also can be directly integrated into one or more co-processor(s) 1174 configured to process measurements obtained from those sensors.
- a location processor 1115 (e.g., a GNSS receiver chip)
- an electronic magnetometer 1116 (e.g., an integrated circuit chip)
- the electronic magnetometer 1116 can be used as an electronic compass.
- a camera subsystem 1120 and an optical sensor 1122 can be utilized to facilitate camera functions, such as recording photographs and video clips.
- an optical sensor 1122 (e.g., a charge-coupled device [CCD] or a complementary metal-oxide semiconductor [CMOS] optical sensor)
- the communication subsystem(s) 1124 can include one or more wireless and/or wired communication subsystems.
- wireless communication subsystems can include radio frequency receivers and transmitters and/or optical (e.g., infrared) receivers and transmitters.
- a wired communication subsystem can include a port device, e.g., a Universal Serial Bus (USB) port or some other wired port connection that can be used to establish a wired connection to other computing devices, such as other communication devices, network access devices, a personal computer, a printer, a display screen, or other processing devices capable of receiving or transmitting data.
- the architecture 1100 can include wireless communication subsystems designed to operate over a global system for mobile communications (GSM) network, a GPRS network, an enhanced data GSM environment (EDGE) network, 802.x communication networks (e.g., Wi-Fi, Wi-Max), code division multiple access (CDMA) networks, near field communication (NFC), and a Bluetooth™ network.
- the wireless communication subsystems can also include hosting protocols such that the architecture 1100 can be configured as a base station for other wireless devices.
- the communication subsystems may allow the architecture 1100 to synchronize with a host device using one or more protocols, such as, for example, the TCP/IP protocol, HTTP protocol, UDP protocol, and any other known protocol.
- An audio subsystem 1126 can be coupled to a speaker 1128 and one or more microphones 1130 to facilitate voice-enabled functions, such as voice recognition, voice replication, digital recording, and telephony functions.
- An I/O subsystem 1140 can include a touch controller 1142 and/or other input controller(s) 1144 .
- the touch controller 1142 can be coupled to a touch surface 1146 .
- the touch surface 1146 and the touch controller 1142 can, for example, detect contact and movement or break thereof using any of a number of touch sensitivity technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touch surface 1146 .
- the touch surface 1146 can display virtual or soft buttons and a virtual keyboard, which can be used as an input/output device by the user.
- Other input controller(s) 1144 can be coupled to other input/control devices 1148 , such as one or more buttons, rocker switches, thumb-wheel, infrared port, USB port, and/or a pointer device such as a stylus.
- the one or more buttons can include an up/down button for volume control of the speaker 1128 and/or the microphone 1130.
- the architecture 1100 can present recorded audio and/or video files, such as MP3, AAC, and MPEG video files.
- the architecture 1100 can include the functionality of an MP3 player and may include a pin connector for tethering to other devices. Other input/output and control devices may be used.
- a memory interface 1102 can be coupled to a memory 1150 .
- the memory 1150 can include high-speed random access memory or non-volatile memory, such as one or more magnetic disk storage devices, one or more optical storage devices, or flash memory (e.g., NAND, NOR).
- the memory 1150 can store an operating system 1152 , such as Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks.
- the operating system 1152 can include instructions for handling basic system services and for performing hardware dependent tasks.
- the operating system 1152 can include a kernel (e.g., UNIX kernel).
- the memory 1150 can also store communication instructions 1154 to facilitate communicating with one or more additional devices, one or more computers or servers, including peer-to-peer communications.
- the communication instructions 1154 can also be used to select an operational mode or communication medium for use by the device, based on a geographic location (obtained by the GPS/Navigation instructions 1168 ) of the device.
- the memory 1150 can include graphical user interface instructions 1156 to facilitate graphic user interface processing, including a touch model for interpreting touch inputs and gestures; sensor processing instructions 1158 to facilitate sensor-related processing and functions; phone instructions 1160 to facilitate phone-related processes and functions; electronic messaging instructions 1162 to facilitate electronic-messaging related processes and functions; web browsing instructions 1164 to facilitate web browsing-related processes and functions; media processing instructions 1166 to facilitate media processing-related processes and functions; GPS/Navigation instructions 1168 to facilitate GPS and navigation-related processes; camera instructions 1170 to facilitate camera-related processes and functions; and other instructions 1172 for performing some or all of the processes described herein.
- Each of the above identified instructions and applications can correspond to a set of instructions for performing one or more functions described herein. These instructions need not be implemented as separate software programs, procedures, or modules.
- the memory 1150 can include additional instructions or fewer instructions.
- various functions of the device may be implemented in hardware and/or in software, including in one or more signal processing and/or application specific integrated circuits (ASICs).
- the features described may be implemented in digital electronic circuitry or in computer hardware, firmware, software, or in combinations of them.
- the features may be implemented in a computer program product tangibly embodied in an information carrier, e.g., in a machine-readable storage device, for execution by a programmable processor; and method steps may be performed by a programmable processor executing a program of instructions to perform functions of the described implementations by operating on input data and generating output.
- the described features may be implemented advantageously in one or more computer programs that are executable on a programmable system including at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a data storage system, at least one input device, and at least one output device.
- a computer program is a set of instructions that may be used, directly or indirectly, in a computer to perform a certain activity or bring about a certain result.
- a computer program may be written in any form of programming language (e.g., Objective-C, Java), including compiled or interpreted languages, and it may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
- Suitable processors for the execution of a program of instructions include, by way of example, both general and special purpose microprocessors, and the sole processor or one of multiple processors or cores of any kind of computer.
- a processor will receive instructions and data from a read-only memory or a random access memory or both.
- the essential elements of a computer are a processor for executing instructions and one or more memories for storing instructions and data.
- a computer may communicate with mass storage devices for storing data files. These mass storage devices may include magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and optical disks.
- Storage devices suitable for tangibly embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
- the processor and the memory may be supplemented by, or incorporated in, ASICs (application-specific integrated circuits).
- the features may be implemented on a computer having a display device such as a CRT (cathode ray tube) or LCD (liquid crystal display) monitor for displaying information to the author and a keyboard and a pointing device such as a mouse or a trackball by which the author may provide input to the computer.
- the features may be implemented in a computer system that includes a back-end component, such as a data server, or that includes a middleware component, such as an application server or an Internet server, or that includes a front-end component, such as a client computer having a graphical user interface or an Internet browser, or any combination of them.
- the components of the system may be connected by any form or medium of digital data communication such as a communication network. Examples of communication networks include a LAN, a WAN and the computers and networks forming the Internet.
- the computer system may include clients and servers.
- a client and server are generally remote from each other and typically interact through a network.
- the relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
- An API may define one or more parameters that are passed between a calling application and other software code (e.g., an operating system, library routine, function) that provides a service, that provides data, or that performs an operation or a computation.
- the API may be implemented as one or more calls in program code that send or receive one or more parameters through a parameter list or other structure based on a call convention defined in an API specification document.
- a parameter may be a constant, a key, a data structure, an object, an object class, a variable, a data type, a pointer, an array, a list, or another call.
- API calls and parameters may be implemented in any programming language.
- the programming language may define the vocabulary and calling convention that a programmer will employ to access functions supporting the API.
- an API call may report to an application the capabilities of a device running the application, such as input capability, output capability, processing capability, power capability, communications capability, etc.
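A minimal sketch of such a capability-reporting call, using hypothetical function names and capability keys (none of these appear in the disclosure), shows how an application might branch on what the device reports:

```python
from typing import Dict

# Illustrative only: a stand-in for an API that reports the capabilities
# of the device running an application, as described above.
def get_device_capabilities() -> Dict[str, object]:
    return {
        "input": ["touch", "buttons", "microphone"],
        "output": ["display", "speaker", "haptics"],
        "processing": {"cores": 2, "coprocessor": True},
        "power": {"battery_pct": 80, "low_power_mode": False},
        "communications": ["wifi", "bluetooth", "cellular"],
    }

def supports(capabilities: Dict[str, object], channel: str) -> bool:
    """An application can branch on reported capabilities, e.g., to pick
    a notification channel the device actually supports."""
    return channel in capabilities.get("communications", [])

caps = get_device_capabilities()
```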
- this gathered data may identify a particular location or an address based on device usage.
- personal information data can include location-based data, addresses, subscriber account identifiers, or other identifying information.
- the present disclosure further contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices.
- such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining personal information data private and secure.
- personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection should occur only after receiving the informed consent of the users.
- such entities would take any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices.
- the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data.
- the present technology can be configured to allow users to select to “opt in” or “opt out” of participation in the collection of personal information data during registration for services.
- Although the present disclosure broadly covers use of personal information data to implement one or more of the various disclosed embodiments, the present disclosure also contemplates that the various embodiments can also be implemented without the need for accessing such personal information data. That is, the various embodiments of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data.
- content can be selected and delivered to users by inferring preferences based on non-personal information data or a bare minimum amount of personal information, such as the content being requested by the device associated with a user, other non-personal information available to the content delivery services, or publicly available information.
Landscapes
- Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- Emergency Management (AREA)
- Business, Economics & Management (AREA)
- General Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Gerontology & Geriatric Medicine (AREA)
- Life Sciences & Earth Sciences (AREA)
- Psychiatry (AREA)
- Psychology (AREA)
- Social Psychology (AREA)
- Engineering & Computer Science (AREA)
- Computer Security & Cryptography (AREA)
- Biophysics (AREA)
- Surgery (AREA)
- Veterinary Medicine (AREA)
- Public Health (AREA)
- Pathology (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Physiology (AREA)
- Telephone Function (AREA)
- Dentistry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Computer Networks & Wireless Communication (AREA)
- Critical Care (AREA)
- Emergency Medicine (AREA)
- Nursing (AREA)
Abstract
Description
- This application claims priority to U.S. Provisional Patent Application No. 63/242,998, filed Sep. 10, 2021, the entire contents of which are incorporated herein by reference.
- The disclosure relates to systems and methods for determining whether a user has fallen using a mobile device.
- A motion sensor is a device that measures the motion experienced by an object (e.g., the velocity or acceleration of the object with respect to time, the orientation or change in orientation of the object with respect to time, etc.). In some cases, a mobile device (e.g., a cellular phone, a smart phone, a tablet computer, a wearable electronic device such as a smart watch, etc.) can include one or more motion sensors that determine the motion experienced by the mobile device over a period of time. If the mobile device is worn by a user, the measurements obtained by the motion sensor can be used to determine the motion experienced by the user over the period of time.
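The role of such a motion sensor can be illustrated with a short sketch (the sample values and names are invented for the example): a sequence of timestamped accelerometer readings forms a time-varying signal, from which orientation-independent quantities such as the acceleration magnitude can be derived:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class MotionSample:
    """One accelerometer reading from the motion sensor, in units of g."""
    t: float   # timestamp, seconds
    ax: float  # acceleration along the x-axis
    ay: float  # acceleration along the y-axis
    az: float  # acceleration along the z-axis

def magnitude(s: MotionSample) -> float:
    """Overall acceleration magnitude, independent of device orientation."""
    return (s.ax ** 2 + s.ay ** 2 + s.az ** 2) ** 0.5

# A time-varying signal is simply an ordered sequence of samples.
signal: List[MotionSample] = [
    MotionSample(0.00, 0.0, 0.0, 1.0),   # at rest: 1 g of gravity
    MotionSample(0.02, 0.1, -0.2, 1.1),  # slight movement
    MotionSample(0.04, 2.5, -1.8, 0.4),  # a sharp jolt
]
peak = max(magnitude(s) for s in signal)
```

If the device is worn on the body, peaks and patterns in such a signal stand in for the motion experienced by the user over the same period.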
- Systems, methods, devices and non-transitory, computer-readable media are disclosed for electronically determining whether a user has fallen using a mobile device.
- In an aspect, a method includes: receiving, by a mobile device, sensor data obtained by one or more sensors over a time period, where the one or more sensors are worn by a user; determining, by the mobile device, a context of the user based on the sensor data; obtaining, by the mobile device based on the context, a set of rules for processing the sensor data, where the set of rules is specific to the context; determining, by the mobile device, at least one of a likelihood that the user has fallen or a likelihood that the user requires assistance based on the sensor data and the set of rules; and generating, by the mobile device, one or more notifications based on at least one of the likelihood that the user has fallen or the likelihood that the user requires assistance.
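The steps of this method can be summarized as a small processing pipeline. The sketch below is a paraphrase under assumed names and invented thresholds; the disclosure does not define these functions or values:

```python
from typing import Callable, Dict, List, Tuple

# Context-specific rule sets: each rule maps sensor readings to a pair of
# likelihoods (fallen, requires assistance) in [0, 1]. All names and
# threshold values are illustrative.
RuleSet = Callable[[List[float]], Tuple[float, float]]

RULES: Dict[str, RuleSet] = {
    "bicycling": lambda data: (0.9, 0.8) if max(data) > 3.0 else (0.1, 0.0),
    "walking":   lambda data: (0.7, 0.6) if max(data) > 2.0 else (0.0, 0.0),
}

def detect_context(data: List[float]) -> str:
    """Stand-in context classifier; a real implementation would use
    location, acceleration, and orientation data."""
    return "bicycling" if len(data) > 4 else "walking"

def process(data: List[float]) -> List[str]:
    """Receive sensor data, determine the context, obtain the
    context-specific rules, compute the likelihoods, and generate
    notifications accordingly."""
    context = detect_context(data)
    p_fallen, p_assist = RULES[context](data)
    notifications = []
    if p_fallen > 0.5:
        notifications.append(f"possible fall while {context}")
    if p_assist > 0.5:
        notifications.append("user may require assistance")
    return notifications
```

The key design point is that the same sensor data is interpreted differently depending on the detected context, which is what reduces false positives.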
- Implementations of this aspect can include one or more of the following features.
- In some implementations, the sensor data can include location data obtained by one or more location sensors of the mobile device.
- In some implementations, the sensor data can include acceleration data obtained by one or more acceleration sensors of the mobile device.
- In some implementations, the sensor data can include orientation data obtained by one or more orientation sensors of the mobile device.
- In some implementations, the context can correspond to the user bicycling during the time period.
- In some implementations, determining the likelihood that the user has fallen and/or the likelihood that the user requires assistance can include: determining, based on the sensor data, that a distance traveled by the user over the period of time is greater than a first threshold value; determining, based on the sensor data, that a variation in a direction of impacts experienced by the user over the period of time is less than a second threshold value; determining, based on the sensor data, that a rotation of the user's wrist over the period of time is less than a third threshold value; and determining that the user has fallen and/or requires assistance based on the determination that the distance traveled by the user over the period of time is greater than the first threshold value, the determination that the variation in a direction of impacts experienced by the user over the period of time is less than the second threshold value, and the determination that the rotation of the user's wrist over the period of time is less than the third threshold value.
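An illustrative reading of this rule set is a conjunction of three threshold checks. The threshold values and function signature below are invented for the example; the disclosure does not specify numeric thresholds:

```python
def likely_bicycle_fall(distance_m: float,
                        impact_direction_variation: float,
                        wrist_rotation_rad: float,
                        min_distance_m: float = 100.0,
                        max_direction_variation: float = 0.2,
                        max_wrist_rotation_rad: float = 0.3) -> bool:
    """All three conditions must hold: the user traveled far enough to
    suggest riding, impacts arrived from a consistent direction (as on a
    bicycle, unlike the varied impacts of walking), and the wrist rotated
    little (hands steady on handlebars). Threshold defaults are invented
    for illustration only."""
    return (distance_m > min_distance_m
            and impact_direction_variation < max_direction_variation
            and wrist_rotation_rad < max_wrist_rotation_rad)
```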
- In some implementations, determining the likelihood that the user has fallen and/or the likelihood that the user requires assistance can include: determining, based on the sensor data, that a magnitude of an impact experienced by the user over the period of time in a first direction is greater than a first threshold value; and determining that the user has fallen and/or requires assistance based on the determination that the magnitude of the impact experienced by the user over the period of time in the first direction is greater than the first threshold value.
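One way to apply such a single-direction impact threshold (threshold values and window sizes here are invented for illustration; the disclosure does not specify an algorithm) is to scan the per-direction magnitudes for the first sample that exceeds the threshold, which also locates the samples before, during, and after the impact:

```python
from typing import List, Optional, Tuple

def find_impact(magnitudes: List[float], threshold: float = 3.0) -> Optional[int]:
    """Return the index of the first sample whose impact magnitude (in g)
    exceeds the threshold, treated as the moment of impact; None if the
    threshold is never exceeded."""
    for i, m in enumerate(magnitudes):
        if m > threshold:
            return i
    return None

def split_around_impact(magnitudes: List[float], impact: int,
                        window: int = 2
                        ) -> Tuple[List[float], List[float], List[float]]:
    """Partition the signal into the samples before, during, and after
    the impact, so each window can be analyzed separately (e.g., motion
    before the impact, stillness after it)."""
    before = magnitudes[max(0, impact - window):impact]
    during = magnitudes[impact:impact + 1]
    after = magnitudes[impact + 1:impact + 1 + window]
    return before, during, after

# Acceleration magnitudes along one direction, sampled over time.
signal = [1.0, 1.1, 0.9, 4.2, 0.2, 0.1, 0.1]
impact_index = find_impact(signal)
```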
- In some implementations, determining the likelihood that the user has fallen and/or the likelihood that the user requires assistance can include: determining, based on the sensor data, that a change in an orientation of the user's hand over the period of time is greater than a first threshold value; determining, based on the sensor data, that a magnitude of an impact experienced by the user over the period of time in a first direction is greater than a second threshold value, where the first direction is orthogonal to a second direction; determining, based on the sensor data, that a magnitude of an impact experienced by the user over the period of time in the second direction is greater than a third threshold value; and determining that the user has fallen and/or requires assistance based on the determination that the change in an orientation of the user's hand over the period of time is greater than the first threshold value, the determination that the magnitude of the impact experienced by the user over the period of time in the first direction is greater than the second threshold value, and the determination that the magnitude of an impact experienced by the user over the period of time in the second direction is greater than the third threshold value.
- In some implementations, the method can further include: receiving, by the mobile device, second sensor data obtained by the one or more sensors over a second time period; determining, by the mobile device, a second context of the user based on the second sensor data; obtaining, by the mobile device based on the second context, a second set of rules for processing the sensor data, where the second set of rules is specific to the second context; determining, by the mobile device, at least one of a likelihood that the user has fallen and/or a likelihood that the user requires assistance based on the sensor data and the second set of rules; and generating, by the mobile device, one or more second notifications based on at least one of the likelihood that the user has fallen or the likelihood that the user requires assistance.
- In some implementations, the second context can correspond to the user walking during the second time period.
- In some implementations, the second context can correspond to the user playing at least one of basketball or volleyball during the second time period.
- In some implementations, generating the one or more notifications can include: transmitting a first notification to a communications device remote from the mobile device, the first notification including an indication that the user has fallen.
- In some implementations, the communications device can be an emergency response system.
- In some implementations, the mobile device can be a wearable mobile device.
- In some implementations, at least some of the one or more sensors can be disposed on or in the mobile device.
- In some implementations, at least some of the one or more sensors can be remote from the mobile device.
- Other implementations are directed to systems, devices and non-transitory, computer-readable mediums including computer-executable instructions for performing the techniques described herein.
- The details of one or more embodiments are set forth in the accompanying drawings and the description below. Other features and advantages will be apparent from the description and drawings, and from the claims.
- FIG. 1 is a diagram of an example system for determining whether a user has fallen and/or may be in need of assistance.
- FIG. 2A is a diagram showing an example position of a mobile device on a user's body.
- FIG. 2B is a diagram showing example directional axes with respect to a mobile device.
- FIG. 3 is a diagram of an example state machine for determining whether a user has fallen and/or requires assistance.
- FIGS. 4A and 4B are diagrams of example sensor data obtained by a mobile device.
- FIG. 5 is a diagram of an example bicycle and a user wearing a mobile device.
- FIGS. 6A and 6B are diagrams of additional example sensor data obtained by a mobile device.
- FIG. 7 is a diagram of another example bicycle and a user wearing a mobile device.
- FIG. 8 is a flow chart diagram of an example process for generating and transmitting notifications.
- FIGS. 9A-9C are diagrams of example alert notifications generated by a mobile device.
- FIG. 10 is a flow chart diagram of an example process for determining whether a user has fallen and/or requires assistance.
- FIG. 11 is a block diagram of an example architecture for implementing the features and processes described in reference to FIGS. 1-10.
FIG. 1 shows an example system 100 for determining whether a user has fallen and/or may be in need of assistance. The system 100 includes a mobile device 102, a server computer system 104, communications devices 106, and a network 108. - The implementations described herein enable the
system 100 to determine whether a user has fallen and/or whether the user may be in need of assistance more accurately, such that resources can be more effectively used. For instance, the system 100 can determine whether the user has fallen and/or whether the user may be in need of assistance with fewer false positives. Thus, the system 100 is less likely to consume computational and/or network resources to generate and transmit notifications to others when the user does not need assistance. Further, medical and logistical resources can be deployed to assist a user with a greater degree of confidence that they are needed, thereby reducing the likelihood of waste. Accordingly, resources can be consumed more efficiently, and in a manner that increases the effective response capacity of one or more systems (e.g., a computer system, a communications system, and/or an emergency response system). - The
mobile device 102 can be any portable electronic device for receiving, processing, and/or transmitting data, including but not limited to cellular phones, smart phones, tablet computers, wearable computers (e.g., smart watches), and the like. The mobile device 102 is communicatively connected to the server computer system 104 and/or the communications devices 106 using the network 108. - The
server computer system 104 is communicatively connected to the mobile device 102 and/or the communications devices 106 using the network 108. The server computer system 104 is illustrated as a respective single component. However, in practice, it can be implemented on one or more computing devices (e.g., each computing device including at least one processor such as a microprocessor or microcontroller). A server computer system 104 can be, for instance, a single computing device that is connected to the network 108. In some implementations, the server computer system 104 can include multiple computing devices that are connected to the network 108. In some implementations, the server computer system 104 need not be located locally to the rest of the system 100, and portions of a server computer system 104 can be located in one or more remote physical locations. - A
communications device 106 can be any device that is used to transmit and/or receive information transmitted across the network 108. Examples of the communications devices 106 include computers (such as desktop computers, notebook computers, server systems, etc.), mobile devices (such as cellular phones, smartphones, tablets, personal data assistants, notebook computers with networking capability), telephones, faxes, and other devices capable of transmitting and receiving data from the network 108. The communications devices 106 can include devices that operate using one or more operating systems (e.g., Apple iOS, Apple watchOS, Apple macOS, Microsoft Windows, Linux, Unix, Android, etc.) and/or architectures (e.g., x86, PowerPC, ARM, etc.). In some implementations, one or more of the communications devices 106 need not be located locally with respect to the rest of the system 100, and one or more of the communications devices 106 can be located in one or more remote physical locations. - The
network 108 can be any communications network through which data can be transferred and shared. For example, the network 108 can be a local area network (LAN) or a wide-area network (WAN), such as the Internet. As another example, the network 108 can be a telephone or cellular communications network. The network 108 can be implemented using various networking interfaces, for instance wireless networking interfaces (such as Wi-Fi, Bluetooth, or infrared) or wired networking interfaces (such as Ethernet or serial connection). The network 108 also can include combinations of more than one network, and can be implemented using one or more networking interfaces. - As described above, a
user 110 can position the mobile device 102 on her body, and go about her daily life. As an example, as shown in FIG. 2A, the mobile device 102 can be a wearable electronic device or wearable computer (e.g., a smart watch) that is secured to a wrist 202 of the user 110. The mobile device 102 can be secured to the user 110, for example, through a band or strap 204 that encircles the wrist 202. Further, the orientation of the mobile device 102 can differ, depending on the location at which it is placed on the user's body and the user's positioning of her body. As an example, the orientation 206 of the mobile device 102 is shown in FIG. 2A. The orientation 206 can refer, for example, to a vector projecting from a front edge of the mobile device 102 (e.g., the y-axis shown in FIG. 2B). - Although an example
mobile device 102 and an example position of the mobile device 102 are shown, it is understood that these are merely illustrative examples. In practice, the mobile device 102 can be any portable electronic device for receiving, processing, and/or transmitting data, including but not limited to cellular phones, smart phones, tablet computers, wearable computers (e.g., smart watches), and the like. As an example, the mobile device 102 can be implemented according to the architecture 1100 shown and described with respect to FIG. 11. Further, in practice, the mobile device 102 can be positioned on other locations of a user's body (e.g., arm, shoulder, leg, hip, head, abdomen, hand, foot, or any other location). - In an example usage of the
system 100, a user 110 positions the mobile device 102 on her body, and goes about her daily life. This can include, for example, walking, running, bicycling, sitting, laying down, participating in a sport or athletic activity (e.g., basketball, volleyball, etc.), or any other physical activity. During this time, the mobile device 102 collects sensor data regarding movement of the mobile device 102, an orientation of the mobile device 102, and/or other dynamic properties of the mobile device 102 and/or the user 110. - For instance, using the motion sensors 1110 shown in FIG. 11 (e.g., one or more accelerometers), the
mobile device 102 can measure an acceleration experienced by the motion sensors 1110, and correspondingly, the acceleration experienced by the mobile device 102. Further, using the motion sensors 1110 (e.g., one or more compasses, gyroscopes, inertia measurement units, etc.), the mobile device 102 can measure an orientation of the motion sensors 1110, and correspondingly, an orientation of the mobile device 102. In some cases, the motion sensors 1110 can collect data continuously or periodically over a period of time or in response to a trigger event. In some cases, the motion sensors 1110 can collect motion data with respect to one or more specific directions relative to the orientation of the mobile device 102. For example, the motion sensors 1110 can collect sensor data regarding an acceleration of the mobile device 102 with respect to the x-axis (e.g., a vector projecting from a side edge of the mobile device 102, as shown in FIG. 2B), the y-axis (e.g., a vector projecting from a front edge of the mobile device 102, as shown in FIG. 2B), and/or the z-axis (e.g., a vector projecting from a top surface or screen of the mobile device 102, as shown in FIG. 2B), where the x-axis, y-axis, and z-axis refer to a Cartesian coordinate system in a frame of reference fixed to the mobile device 102 (e.g., a “body” frame). - Based on this information, the
system 100 determines whether the user 110 has fallen, and if so, whether the user 110 may be in need of assistance. - As an example, the
user 110 may stumble and fall to the ground. Further, after falling, the user 110 may be unable to stand again on her own and/or may have suffered an injury as a result of the fall. Thus, she may be in need of assistance, such as physical assistance in standing and/or recovering from the fall, medical attention to treat injuries sustained in the fall, or other help. In response, the system 100 can automatically notify others of the situation. For example, the mobile device 102 can generate and transmit a notification to one or more of the communications devices 106 to notify one or more users 112 (e.g., caretakers, physicians, medical responders, emergency contact persons, etc.) of the situation, such that they can take action. As another example, the mobile device 102 can generate and transmit a notification to one or more bystanders in proximity to the user (e.g., by broadcasting a visual and/or auditory alert), such that they can take action. As another example, the mobile device 102 can generate and transmit a notification to the server computer system 104 (e.g., to relay the notification to others and/or to store the information for future analysis). Thus, assistance can be rendered to the user 110 more quickly and effectively. - In some cases, the
system 100 can determine that the user 110 has experienced an external force, but has not fallen and is not in need of assistance. As an example, the user 110 may experience vibrations and/or jostling while riding a bicycle (e.g., due to roughness of a road or trail surface), but has not fallen and can continue biking without assistance from others. As another example, the user 110 may have experienced impacts during an athletic activity (e.g., bumped by another user while playing basketball, struck a ball or the ground while playing volleyball, etc.), but has not fallen due to the impact and is able to recover without assistance from others. Accordingly, the system 100 can refrain from generating and transmitting a notification to others. - In some cases, the
system 100 can determine that the user 110 has fallen, but that the user is not in need of assistance. As an example, the user 110 may have fallen as a part of an athletic activity (e.g., fallen while biking), but is able to recover without assistance from others. Accordingly, the system 100 can refrain from generating a notification and/or transmitting a notification to others. - In some cases, the
system 100 can make these determinations based on sensor data obtained before, during, and/or after an impact experienced by the user 110. For example, the mobile device 102 can collect sensor data (e.g., acceleration data, orientation data, location data, etc.), and the system 100 can use the sensor data to identify a point in time at which the user experienced an impact. Further, the system 100 can analyze the sensor data obtained during the impact, prior to the impact, and/or after the impact to determine whether the user has fallen, and if so, whether the user may be in need of assistance. - In some implementations, the
system 100 can make these determinations based on contextual information, such as the activity that the user was performing at or around the time the user experienced an impact or other force. This can be beneficial, for example, in improving the accuracy and/or sensitivity with which the system 100 can detect falls. - For instance, the
system 100 can determine whether a user has fallen (and whether the user is in need of assistance) using different sets of rules or criteria, depending on the activity that the user was performing at or around the time that she experienced an impact or other force. As an example, the system 100 can determine that the user was performing a first activity (e.g., walking) and determine whether the user has fallen based on a first set of rules or criteria specific to that first activity. As another example, the system 100 can determine that the user was performing a second activity (e.g., biking) and determine whether the user has fallen based on a second set of rules or criteria specific to that second activity. As another example, the system 100 can determine that the user was performing a third activity (e.g., playing basketball) and determine whether the user has fallen based on a third set of rules or criteria specific to that third activity. Each set of rules or criteria can be specifically tailored to its corresponding activity, such that false positives and/or false negatives are reduced. - In some implementations, the
system 100 can utilize a first set of rules or criteria by default (e.g., a default set of rules or criteria for determining whether a user has fallen). Upon determining that the user is performing a particular activity, the system 100 can utilize a set of rules or criteria that is specific to that activity. Further, upon determining that the user has ceased performing that activity, the system 100 can revert to the first set of rules or criteria. - As an example, in some implementations, the
system 100 can utilize a default set of rules or criteria for detecting whether the user has fallen during frequent day-to-day activities, such as walking, climbing stairs, etc. Upon determining that the user is biking, the system 100 can utilize a specialized set of rules or criteria that is specific to detecting whether the user has fallen while biking. Further, upon determining that the user is participating in an activity in which the user commonly experiences large impacts (e.g., volleyball, basketball, etc.), the system 100 can utilize another specialized set of rules or criteria that is specific to detecting whether the user has fallen while participating in that activity. Further, upon determining that the user is no longer participating in an activity for which the system 100 has a specialized set of rules or criteria, the system 100 can revert to using the default set of rules or criteria for determining whether the user has fallen. - In some implementations, the
system 100 can determine whether a user has fallen (and whether the user is in need of assistance) using a state machine having several states, where each state corresponds to a different type of activity and a different corresponding set of criteria. - An
example state machine 300 is shown in FIG. 3. In this example, the state machine includes three states 302a-302c, each corresponding to a different type of activity, and each being associated with a different set of rules or criteria for determining whether the user has fallen and/or whether the user is in need of assistance. - As an example, the
first state 302a can correspond to a default activity. Further, the first state 302a can be associated with a default set of rules or criteria for determining whether a user has fallen and/or whether the user is in need of assistance. In some implementations, the default activity can correspond to one or more of walking, jogging, running, standing, and/or sitting. - As another example, the
second state 302b can correspond to a biking activity. Further, the second state 302b can be associated with a set of rules or criteria for determining whether a user has fallen and/or whether the user is in need of assistance, specifically in the context of biking. - As another example, the
third state 302c can correspond to an activity in which the user commonly experiences large impacts (e.g., volleyball, basketball, etc.). Further, the third state 302c can be associated with a set of rules or criteria for determining whether a user has fallen and/or whether the user is in need of assistance, specifically in the context of high impact activities. - In an example operation, the
system 100 is initially set to a default state (e.g., the first state 302a) and determines whether a user has fallen and/or whether the user is in need of assistance based on the default set of rules or criteria that is associated with that state. - Upon determining that the user is performing a different activity, the
system 100 transitions to the state corresponding to that activity, and determines whether a user has fallen and/or whether the user is in need of assistance based on the set of rules or criteria that is associated with that new state. - For example, upon determining that the user is biking, the
system 100 can transition from the first state 302a to the second state 302b, and can determine whether a user has fallen and/or whether the user is in need of assistance based on the set of rules or criteria that is associated with the second state 302b. - For example, upon determining that the user has ceased biking and is instead playing basketball, the
system 100 can transition from the second state 302b to the third state 302c, and can determine whether a user has fallen and/or whether the user is in need of assistance based on the set of rules or criteria that is associated with the third state 302c. - Upon determining that the user is no longer performing a specialized activity (e.g., an activity that is not associated with a state other than the default
first state 302a), the system 100 transitions back to the default first state 302a, and determines whether a user has fallen and/or whether the user is in need of assistance based on the default set of rules or criteria that is associated with that state. - Although the state machine 300 shown in
FIG. 3 includes three states, this is merely an illustrative example. In practice, a state machine can include any number of states corresponding to any number of activities (and in turn, any number of different sets of rules or criteria). - In some implementations, the
system 100 can determine the type of activity being performed by a user based on sensor data obtained by the mobile device 102, such as location data, acceleration data, and/or orientation data. For example, each type of activity may be identified by detecting certain characteristics or combinations of characteristics of the sensor data that are indicative of that type of activity. For example, a first type of activity may correspond to sensor data having a first set of characteristics, a second type of activity may correspond to sensor data having a second set of characteristics, a third type of activity may correspond to sensor data having a third set of characteristics, and so forth. The system 100 can identify the type of activity being performed by a user by obtaining sensor data from the mobile device 102, and determining that the sensor data exhibits a particular set of characteristics. - As an example, the
system 100 can determine whether the user is biking based on the distance that a user traveled and/or the speed at which the user traveled prior to the impact (e.g., based on output from a location sensor, such as a GPS sensor). For example, a greater distance and/or a higher speed (e.g., greater than certain threshold values) may indicate that the user is biking, whereas a lower distance and/or a lower speed (e.g., less than certain threshold values) may indicate that the user is walking. - As another example, the
system 100 can determine whether the user is biking based on sensor measurements from an accelerometer and/or orientation sensor (e.g., gyroscope) of the mobile device 102. For example, a user might experience certain types of impacts and/or change the orientation of her body (e.g., her wrist) in certain ways while biking, and experience different types of impacts and/or change the orientation of her body in different ways while walking. - As another example, the
system 100 can determine whether the user is performing an activity in which the user commonly experiences large impacts (e.g., volleyball, basketball, etc.) based on sensor measurements from an accelerometer and/or orientation sensor (e.g., gyroscope) of the mobile device 102. For example, when a user plays volleyball, the user may commonly move her arm or wrist (to which the mobile device 102 is attached) according to a distinctive pattern. The system 100 can determine, based on the sensor data, whether the user is moving her arm or wrist according to that pattern, and if so, determine that the user is playing volleyball. - In some implementations, the
system 100 can determine whether the user is performing a particular activity based on manual user input. For example, prior to or during the performance of an activity, a user can manually identify that activity to the mobile device 102 and/or the system 100. For example, prior to biking, a user can input data (e.g., to the mobile device 102) indicating that she is about to go biking. Based on the user input, the system 100 can determine that the user will be biking. In some implementations, a user can provide input to a mobile device 102 by selecting a particular activity (e.g., from a list or menu of candidate activities). In some implementations, a user can provide input to a mobile device 102 by selecting a particular application or feature of the mobile device 102 that is specific to or otherwise associated with that activity (e.g., an exercise application or feature). - Although example techniques for identifying a user's activity are described herein, these are merely illustrative examples. In practice, other techniques also can be performed to identify a user's activity, either instead of or in addition to those described herein.
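The activity-dependent rule selection described above can be sketched as a small state machine. This is a hypothetical illustration, not the claimed implementation: the state names, activity labels, and the toy speed/impact-based classifier (and its thresholds) are all assumptions.

```python
# Hypothetical sketch of activity-specific rule-set selection.
# State names, activity labels, and thresholds are illustrative assumptions.
DEFAULT, BIKING, HIGH_IMPACT = "default", "biking", "high_impact"

# Map recognized activities to specialized states; anything else is default.
ACTIVITY_TO_STATE = {
    "biking": BIKING,
    "basketball": HIGH_IMPACT,
    "volleyball": HIGH_IMPACT,
}

def classify_activity(avg_speed_mps, impacts_per_second):
    """Toy classifier: sustained high speed suggests biking; frequent
    large impacts suggest a high-impact sport; otherwise default."""
    if avg_speed_mps > 4.0:
        return "biking"
    if impacts_per_second > 0.5:
        return "basketball"  # stand-in for any high-impact sport
    return "walking"

class FallRuleSelector:
    def __init__(self):
        self.state = DEFAULT  # the system starts in the default state

    def update(self, avg_speed_mps, impacts_per_second):
        # Transition to the state for the detected activity; revert to
        # the default state when no specialized activity is detected.
        activity = classify_activity(avg_speed_mps, impacts_per_second)
        self.state = ACTIVITY_TO_STATE.get(activity, DEFAULT)
        return self.state
```

For example, an update with a sustained speed of 6 m/s would select the biking rule set, and a later update at walking speed with no large impacts would revert to the default rules.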
- As described above, the
system 100 can utilize a context-specific set of rules or criteria for determining whether a user has fallen (and whether the user is in need of assistance) while the user performs certain activities, such as biking. - In general, the context-specific sets of rules or criteria can pertain to sensor data obtained by the
mobile device 102 worn by the user. As an example, the sets of rules or criteria can pertain to location data obtained by one or more location sensors (e.g., one or more GPS sensors), acceleration data (e.g., impact data) obtained by one or more accelerometers, and/or orientation data obtained by one or more orientation sensors (e.g., gyroscopes, inertial measurement units, etc.). Certain combinations of measurements may indicate that, in certain contexts, a user has fallen and may be in need of assistance. - As an example, a
mobile device 102 can be worn by a user on her wrist while biking. Further, the mobile device 102 can obtain sensor data representing the orientation of the mobile device 102 (and correspondingly, the orientation of the user's wrist or arm) and the acceleration experienced by the mobile device (e.g., representing movements of the user's wrist or arm) prior to, during, and after an impact. In a biking context, sensor measurements indicating that the user has (i) changed the orientation of her wrist by a large degree (e.g., greater than a threshold amount) and (ii) moved her wrist or arm by a large degree (e.g., greater than a threshold amount) may be indicative that the user has fallen.
- Further, sensor measurements indicating that user has (i) changed the orientation of her wrist by a large degree (e.g., not greater than a threshold amount) and (ii) moved her wrist or arm by a small degree (e.g., not greater than a threshold amount) may be indicative that the user is signaling or performing a gesture, and has not fallen.
- Further, sensor measurements indicating that user has (i) changed the orientation of her wrist by a small degree (e.g., not greater than a threshold amount) and (ii) moved her wrist or arm by a small degree (e.g., not greater than a threshold amount) may be indicative that the user is static and has not fallen.
- As another example, in a biking context, sensors measurements indicating that the user (i) has traveled a large distance (e.g., greater than a threshold distance) prior to an impact, (ii) experienced highly directional impacts over time (e.g., a variation, spread, or range of impact directions that is less than a threshold level), and (iii) rotated her wrist a small amount (e.g., less than a threshold amount) may indicate that the user is biking normally, and has not fallen. However, sensor measurement indicating that the user (i) has traveled a short distance (e.g., less than a threshold distance) after an impact, (ii) experienced impacts with respect to a wide range of directions over time (e.g., a variation, spread, or range of impact directions that is greater than a threshold level), and (iii) rotated her wrist a large amount (e.g., greater than a threshold amount) may indicate that the user has fallen while biking.
- For instance,
FIG. 4A shows sensor data 400 representing the orientation of a mobile device that is worn on a user's wrist while bicycling, measured over a 4 second time window (e.g., extending from two seconds prior to the user experiencing an impact at time 0, until two seconds after the user experiences the impact). In this example, the orientation of the mobile device (and in turn, the orientation of the user's hand and/or wrist) is relatively stable during the time prior to the impact. However, upon the user experiencing the impact, the orientation of the mobile device exhibits a large angular change over a short time interval (e.g., approximately 0.1 second). Further, the orientation of the mobile device exhibits a large angular change over the entire time window. - These characteristics may be indicative of a fall. For example, a
system 100 can determine that the user has fallen from her bicycle if (i) the angular change in the orientation of the mobile device over the time window (e.g., a 4 second window) is greater than a first threshold amount θ1, and (ii) the angular change in the orientation of the mobile device over a subset of that time window (e.g., a 0.1 second subset of the 4 second time window) is greater than a second threshold amount θ2. Otherwise, the system 100 can determine that the user has not fallen from her bicycle. -
FIG. 4B shows additional sensor data 450 representing the orientation of a mobile device that is worn on a user's wrist while bicycling, measured over a 4 second time window (e.g., extending from two seconds prior to the user experiencing an impact at time 0, until two seconds after the user experiences the impact). In this example, the orientation of the mobile device (and in turn, the orientation of the user's hand and/or wrist) is relatively stable during the entire time window. - These characteristics may indicate that the user has not fallen. For example, a
system 100 can determine that the user has not fallen from her bicycle if (i) the angular change in the orientation of the mobile device over the time window (e.g., a 4 second window) is not greater than a first threshold amount θ1, and/or (ii) the angular change in the orientation of the mobile device over a subset of that time window (e.g., a 0.1 second subset of the 4 second time window) is not greater than a second threshold amount θ2. - In practice, the time window, the subset of the time window, and the threshold amounts can differ, depending on the implementation. For example, the time window, the subset of the time window, and the threshold amounts can be tunable values that are selected based on experimental studies of the characteristics of users' movements while riding bicycles.
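The two-part test illustrated by FIGS. 4A and 4B can be sketched as follows. The sample rate, window lengths, and the values of θ1 and θ2 used here are illustrative assumptions only (the specification treats them as tunable).

```python
def detect_bike_fall(orientation_deg, sample_rate_hz=100,
                     theta1=120.0, theta2=45.0, sub_window_s=0.1):
    """Return True if (i) the angular change over the whole window
    exceeds theta1 AND (ii) the change over some 0.1 s sub-window
    exceeds theta2. All numeric values are illustrative assumptions."""
    if not orientation_deg:
        return False
    total_change = abs(orientation_deg[-1] - orientation_deg[0])
    if total_change <= theta1:
        return False  # criterion (i) not met: no fall indicated
    n = max(1, int(sub_window_s * sample_rate_hz))
    # Criterion (ii): look for a rapid change inside any sub-window.
    for i in range(len(orientation_deg) - n):
        if abs(orientation_deg[i + n] - orientation_deg[i]) > theta2:
            return True
    return False

# A stable orientation trace (FIG. 4B-like) triggers no detection, while
# a trace with an abrupt, large rotation (FIG. 4A-like) does.
stable = [10.0] * 400                                    # 4 s at 100 Hz
fall = [0.0] * 200 + [15.0 * k for k in range(1, 11)] + [150.0] * 190
```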
- As another example, a
system 100 can determine that a user has fallen while biking upon receiving sensor measurements indicating that the user (i) experienced vibrations that are characteristic of bicycling prior to the impact, and (ii) has not experienced vibrations that are characteristic of bicycling within a particular time interval after the impact (e.g., within a threshold time interval T). In contrast, the system 100 can determine that a user has not fallen upon receiving sensor measurements indicating that the user (i) experienced vibrations that are characteristic of bicycling prior to the impact, and (ii) has again experienced vibrations that are characteristic of bicycling within the particular time interval after the impact (e.g., within the threshold time interval T). - As another example, while biking, a user may orient her wrist differently, depending on the configuration of her bicycle's handlebars. The
system 100 can infer the configuration of the handlebars, and apply different sets of rules or criteria for each configuration. - For instance,
FIG. 5 shows an example bicycle 502 having horizontal (or approximately horizontal) handlebars 504. In this example, the user 110 is wearing the mobile device 102 on one of her wrists, and is grasping the handlebars 504 with her hands. The x-axis and y-axis of the mobile device 102 are shown extending from the mobile device 102. The y-direction extends along (or approximately along) the handlebars 504, the x-direction extends along (or approximately along) the user's arm, and the z-direction (not shown) extends perpendicular to a face of the mobile device 102. Sensor measurements indicating that the user experienced a high intensity impact (e.g., greater than a threshold level) in the y-direction may indicate that the user has fallen while biking. However, sensor measurements indicating that the user experienced a low intensity impact (e.g., less than the threshold level) in the y-direction may indicate that the user is biking normally, and has not fallen. - As an example,
FIG. 6A shows sensor data 600 representing the acceleration of a mobile device that is worn on a user's wrist while bicycling, measured in the x-direction and the y-direction over a 1.2 second time window (e.g., extending from 0.6 seconds prior to the user experiencing an impact at time 0, until 0.6 seconds after the user experiences the impact). In this example, the mobile device (and in turn, the user) experienced a high intensity impact in both the x-direction and y-direction (e.g., above a threshold intensity level), which may be characteristic of the user falling. - As another example,
FIG. 6B shows sensor data 620 representing the acceleration of a mobile device that is worn on a user's wrist while bicycling, measured in the x-direction and y-direction over a 1.2 second time window (e.g., extending from 0.6 seconds prior to the user experiencing an impact at time 0, until 0.6 seconds after the user experiences the impact). In this example, the mobile device (and in turn, the user) experienced a high intensity impact in the x-direction (e.g., in the direction along the user's arm). However, the mobile device (and in turn, the user) did not experience a high intensity impact in the y-direction (e.g., in a direction along the handlebars). This may be indicative of the user not falling. - For instance, a
system 100 can determine that the user has fallen from her bicycle if (i) the intensity of the impact experienced in the x-direction is greater than a first threshold amount I1, and (ii) the intensity of the impact experienced in the y-direction is greater than a second threshold amount I2. Otherwise, the system 100 can determine that the user has not fallen. In practice, the threshold amounts can differ, depending on the implementation. For example, the threshold amounts can be tunable values that are selected based on experimental studies of the characteristics of users' movements while riding bicycles. - Further,
FIG. 7 shows another example bicycle 702 having vertical (or approximately vertical) handlebars 704. In this example, the user 110 is wearing the mobile device 102 on one of her wrists, and is grasping the handlebars 704 with her hands. The x-axis and y-axis of the mobile device 102 are shown extending from the mobile device 102. The y-direction extends along (or approximately along) the handlebars 704, the x-direction extends along (or approximately along) the user's arm, and the z-direction (not shown) extends perpendicular to a face of the mobile device 102. Sensor measurements indicating that the user (i) has moved her hand chaotically, (ii) experienced a high intensity impact (e.g., greater than a first threshold level I1) in the y-direction, and (iii) experienced a high intensity impact (e.g., greater than a second threshold level I2) in the z-direction may indicate that the user has fallen while biking. However, sensor measurements indicating that the user (i) has maintained her hand in a stable vertical direction, (ii) experienced a high intensity impact (e.g., greater than the first threshold level I1) in the y-direction, and (iii) experienced a low intensity impact (e.g., lower than the second threshold level I2) in the z-direction may indicate that the user is biking normally, and has not fallen. - For example, a
system 100 can determine that a user has fallen while biking upon receiving sensor measurements indicating that (i) the variation, spread, or range of the orientation of the mobile device 102 is greater than a threshold level (e.g., indicative of chaotic movement by the user), (ii) the mobile device experienced a high intensity impact (e.g., greater than the first threshold level I1) in the y-direction, and (iii) the mobile device experienced a high intensity impact (e.g., greater than the second threshold level I2) in the z-direction. - As another example, a
system 100 can determine that the user has maintained her hand in a stable vertical direction by determining that (i) the variation, spread, or range of the orientation of the mobile device 102 is not greater than a threshold level, and (ii) the angle between the y-direction of the mobile device 102 and the vertical direction is less than a threshold angle θT. Further, upon additionally determining that (i) the mobile device experienced a high intensity impact (e.g., greater than the first threshold level I1) in the y-direction, and (ii) the mobile device experienced a low intensity impact (e.g., not greater than the second threshold level I2) in the z-direction, the system 100 can determine that the user has not fallen while biking. - As described above, upon determining that a user has fallen and requires assistance, the
mobile device 102 can generate and transmit a notification to one or more communications devices 106 to notify one or more users 112 (e.g., caretakers, physicians, medical responders, emergency contact persons, etc.) of the situation, such that they can take action. In some implementations, a notification can be generated and transmitted upon the satisfaction of certain criteria in order to reduce the occurrence of false positives. - For instance,
FIG. 8 shows an example process 800 for generating and transmitting a notification in response to a user falling. - In the
process 800, a system (e.g., the system 100 and/or the mobile device 102) determines whether a user was biking before experiencing an impact (block 802). The system can make this determination based on sensor data obtained by a mobile device worn by the user (e.g., as described above). - If the system determines that the user was not biking, the system can detect whether a user has fallen using a default technique (block 850). For example, referring to
FIG. 3, the system can detect whether a user has fallen according to a default set of rules or criteria that are not specific to biking.
- If the system determines that the impact does not have characteristics off a biking fall, the system refrains from generating and transmitting a notification (block 812).
- If the system determines that the impact has the characteristics of a biking fall, the system determines whether the user has stopped biking after the impact (block 806). The system can make this determination based on sensor data obtained by a mobile device worn by the user (e.g., as described above).
- If the system determines that the user has not stopped biking after the impact, the system refrains from generating and transmitting a notification (block 812).
- If the system determines that the user has stopped biking after the impact, the system determines whether the user has remained sufficiently still for a period of time (e.g., a one minute time interval) after the impact (block 808). The system can make this determination based on sensor data obtained by a mobile device worn by the user (e.g., by determining whether the mobile device has moved more than a threshold distance, changed its orientation by more than a threshold angle, moved for a length of time greater than a threshold amount of time, etc.).
- If the system determines that the user has not remained sufficiently still for the period of time, the system refrains from generating and transmitting a notification (block 812).
- If the system determines that the user has remained sufficiently still for the period of time, the system generates and transmit a notification (block 810).
- In some implementations, upon detecting that a user has fallen, the
mobile device 102 can determine whether a user remains immobile after the fall for a particular time interval (e.g., 30 seconds). Upon determining that the user has remained immobile, the mobile device 102 can present an alert notification to the user, including an option to generate and transmit a notification (e.g., to an emergency responder) and an option to refrain from generating and transmitting a notification. An example of this alert notification is shown in FIG. 9A. - If the user does not provide any input within a particular time interval (e.g., within 60 seconds after the fall), the
mobile device 102 can present an alert notification to the user showing a countdown, and indicating that a notification will be generated and transmitted upon expiration of the countdown, absent input otherwise by the user. An example of this alert notification is shown in FIG. 9B. - Upon expiration of the countdown without input from the user, the
mobile device 102 generates and transmits a notification (e.g., as shown in FIG. 9C).
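The escalation just described can be sketched as a function of elapsed time and user input. The 30- and 60-second intervals come from the text; the countdown duration, return labels, and input encoding are assumptions.

```python
def alert_action(seconds_since_fall, user_still_immobile, user_input=None,
                 immobility_check_s=30, countdown_start_s=60,
                 countdown_duration_s=30):
    """Sketch of the post-fall alert escalation (FIGS. 9A-9C).
    user_input is 'notify', 'dismiss', or None (no response yet)."""
    if user_input == "notify":
        return "send notification"   # user explicitly requested help
    if user_input == "dismiss":
        return "no notification"     # user declined; avoid a false alarm
    if not user_still_immobile:
        return "no alert"            # user moved: likely recovered
    if seconds_since_fall < immobility_check_s:
        return "wait"                # still inside the immobility check
    if seconds_since_fall < countdown_start_s:
        return "show alert"          # FIG. 9A: offer notify/dismiss options
    if seconds_since_fall < countdown_start_s + countdown_duration_s:
        return "show countdown"      # FIG. 9B: warn that notification is coming
    return "send notification"       # FIG. 9C: countdown expired, no input
```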
- An
example process 1000 for determining whether a user has fallen and/or may be in need of assistance using a mobile device is shown in FIG. 10. The process 1000 can be performed, for example, using the mobile device 102 and/or the system 100 shown in FIGS. 1 and 2. In some cases, some or all of the process 1000 can be performed by a co-processor of the mobile device. The co-processor can be configured to receive motion data obtained from one or more sensors, process the motion data, and provide the processed motion data to one or more processors of the mobile device. - In the
process 1000, a mobile device receives sensor data obtained by one or more sensors over a time period (block 1002). The one or more sensors are worn by a user.
- In some implementations, at least some of the one or more sensors can be disposed on or in the mobile device. In some implementations, at least some of the one or more sensors are remote from the mobile device. For example, the mobile device can be a smart phone, and the sensors can be disposed on a smart watch that is communicatively coupled to the smart phone.
- In general, the sensor data can include one or more types of data. For example, the sensor data can include location data obtained by one or more location sensors of the mobile device. As another example, the sensor data can include acceleration data obtained by one or more acceleration sensors of the mobile device. As another example, the sensor data can include orientation data obtained by one or more orientation sensors of the mobile device.
- Further, the mobile device determines a context of the user based on the sensor data (block 1004). In some implementations, the context can correspond to a type of activity performed by the user during the time period. Example contexts include bicycling, walking, running, jigging, playing a sport (e.g., basketball, volley, etc.), or any other activity that may be performed by a user.
- Further, the mobile device obtains a set of rules for processing the sensor data based on the context (block 1006). The set of rules is specific to the context.
- Further, the mobile device determines a likelihood that the user has fallen and/or a likelihood that the user requires assistance based on the sensor data and the set of rules (block 1008).
- As described above, the mobile device determines a likelihood that the user has fallen and/or a likelihood that the user requires assistance using sets of rules that are specific to the context. As illustrative examples, sets of rules for a bicycling context are described above.
- As an example, determining the likelihood that the user has fallen and/or the likelihood that the user requires assistance can include (i) determining, based on the sensor data, that a distance traveled by the user over the period of time is greater than a first threshold value, (ii) determining, based on the sensor data, that a variation in a direction of impacts experienced by the user over the period of time is less than a second threshold value, (iii) determining, based on the sensor data, that a rotation of the user's wrist over the period of time is less than a third threshold value, and (iv) determining that the user has fallen and/or requires assistance based on the determination that the distance traveled by the user over the period of time is greater than the first threshold value, the determination that the variation in a direction of impacts experienced by the user over the period of time is less than the second threshold value, and the determination that the rotation of the user's wrist over the period of time is less than the third threshold value.
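The four determinations above combine into a single predicate. The sketch below assumes hypothetical units and threshold values; only the structure of conditions (i)-(iv) comes from the text.

```python
def bicycling_fall_rule(distance_m, impact_dir_variance, wrist_rotation_rad,
                        first_threshold=50.0,   # minimum distance traveled (m); illustrative
                        second_threshold=0.2,   # maximum variance in impact direction; illustrative
                        third_threshold=0.3):   # maximum wrist rotation (rad); illustrative
    """Return True when conditions (i)-(iii) all hold, per step (iv):
    the user had been traveling, impacts arrived from a consistent
    direction, and the wrist barely rotated afterward."""
    return (distance_m > first_threshold
            and impact_dir_variance < second_threshold
            and wrist_rotation_rad < third_threshold)
```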
- As another example, determining the likelihood that the user has fallen and/or the likelihood that the user requires assistance can include (i) determining, based on the sensor data, that a magnitude of an impact experienced by the user over the period of time in a first direction is greater than a first threshold value, and (ii) determining that the user has fallen and/or requires assistance based on the determination that the magnitude of the impact experienced by the user over the period of time in the first direction is greater than the first threshold value.
- As another example, determining the likelihood that the user has fallen and/or the likelihood that the user requires assistance can include (i) determining, based on the sensor data, that a change in an orientation of the user's hand over the period of time is greater than a first threshold value, (ii) determining, based on the sensor data, that a magnitude of an impact experienced by the user over the period of time in a first direction is greater than a second threshold value, (iii) determining, based on the sensor data, that a magnitude of an impact experienced by the user over the period of time in a second direction is greater than a third threshold value, wherein the first direction is orthogonal to the second direction, and (iv) determining that the user has fallen and/or requires assistance based on the determination that the change in an orientation of the user's hand over the period of time is greater than the first threshold value, the determination that the magnitude of the impact experienced by the user over the period of time in the first direction is greater than the second threshold value, and the determination that the magnitude of the impact experienced by the user over the period of time in the second direction is greater than the third threshold value.
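This second example rule can be sketched the same way; the impacts are measured along two orthogonal directions. The threshold values and units below are hypothetical placeholders.

```python
def bicycling_fall_rule_orthogonal(hand_orientation_change_rad,
                                   impact_first_dir, impact_second_dir,
                                   first_threshold=1.0,   # minimum hand-orientation change (rad); illustrative
                                   second_threshold=2.5,  # minimum impact in the first direction (g); illustrative
                                   third_threshold=2.5):  # minimum impact in the orthogonal direction (g); illustrative
    """Return True when the hand-orientation change and the impacts along
    two orthogonal directions each exceed their thresholds, per (i)-(iv)."""
    return (hand_orientation_change_rad > first_threshold
            and impact_first_dir > second_threshold
            and impact_second_dir > third_threshold)
```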
- Although example sets of rules for a bicycling context are described above, in practice, other sets of rules also can be used for a bicycling context, either instead of or in addition to those described above. Further, other sets of rules can be used for other contexts, such as walking, running, jogging, playing a sport, etc.
- Further, the mobile device generates one or more notifications based on the likelihood that the user has fallen and/or the likelihood that the user requires assistance (block 1010).
- In some implementations, generating the one or more notifications can include transmitting a first notification to a communications device remote from the mobile device. The first notification can include an indication that the user has fallen and/or an indication that the user requires assistance. In some implementations, the communications device can be an emergency response system.
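Block 1010 can be sketched as composing a payload for a remote communications device when either likelihood is high enough. The payload fields, the likelihood scale, and the decision threshold below are assumptions for illustration, not taken from the disclosure.

```python
def build_notification(fall_likelihood, assist_likelihood, threshold=0.7):
    """Compose a hypothetical notification payload (block 1010) when either
    likelihood (on an assumed 0-1 scale) meets an illustrative threshold.

    Returns None when no notification is warranted.
    """
    if fall_likelihood < threshold and assist_likelihood < threshold:
        return None
    return {
        "fall_detected": fall_likelihood >= threshold,
        "assistance_required": assist_likelihood >= threshold,
    }
```

A real implementation would route this payload to an emergency response system or other remote communications device over one of the communication subsystems described below.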
- In some implementations, the mobile device can perform at least a portion of the
process 1000 according to a different context of the user. For example, the mobile device can receive second sensor data obtained by the one or more sensors over a second time period. Further, the mobile device can determine a second context of the user based on the second sensor data, and obtain a second set of rules for processing the second sensor data based on the second context, where the second set of rules is specific to the second context. Further, the mobile device can determine at least one of a likelihood that the user has fallen or a likelihood that the user requires assistance based on the second sensor data and the second set of rules. Further, the mobile device can generate one or more second notifications based on at least one of the likelihood that the user has fallen or the likelihood that the user requires assistance. -
FIG. 11 is a block diagram of an example device architecture 1100 for implementing the features and processes described in reference to FIGS. 1-10. For example, the architecture 1100 can be used to implement the mobile device 102, the server computer system 104, and/or one or more of the communications devices 106. The architecture 1100 may be implemented in any device for generating the features described in reference to FIGS. 1-10, including but not limited to desktop computers, server computers, portable computers, smart phones, tablet computers, game consoles, wearable computers, set top boxes, media players, smart TVs, and the like. - The
architecture 1100 can include a memory interface 1102, one or more data processors 1104, one or more data co-processors 1174, and a peripherals interface 1106. The memory interface 1102, the processor(s) 1104, the co-processor(s) 1174, and/or the peripherals interface 1106 can be separate components or can be integrated in one or more integrated circuits. One or more communication buses or signal lines may couple the various components. - The processor(s) 1104 and/or the co-processor(s) 1174 can operate in conjunction to perform the operations described herein. For instance, the processor(s) 1104 can include one or more central processing units (CPUs) that are configured to function as the primary computer processors for the
architecture 1100. As an example, the processor(s) 1104 can be configured to perform generalized data processing tasks of the architecture 1100. Further, at least some of the data processing tasks can be offloaded to the co-processor(s) 1174. For example, specialized data processing tasks, such as processing motion data, processing image data, encrypting data, and/or performing certain types of arithmetic operations, can be offloaded to one or more specialized co-processor(s) 1174 for handling those tasks. In some cases, the processor(s) 1104 can be relatively more powerful than the co-processor(s) 1174 and/or can consume more power than the co-processor(s) 1174. This can be useful, for example, as it enables the processor(s) 1104 to handle generalized tasks quickly, while also offloading certain other tasks to co-processor(s) 1174 that may perform those tasks more efficiently and/or more effectively. In some cases, a co-processor 1174 can include one or more sensors or other components (e.g., as described herein), and can be configured to process data obtained using those sensors or components, and provide the processed data to the processor(s) 1104 for further analysis. - Sensors, devices, and subsystems can be coupled to
peripherals interface 1106 to facilitate multiple functionalities. For example, a motion sensor 1110, a light sensor 1112, and a proximity sensor 1114 can be coupled to the peripherals interface 1106 to facilitate orientation, lighting, and proximity functions of the architecture 1100. For example, in some implementations, a light sensor 1112 can be utilized to facilitate adjusting the brightness of a touch surface 1146. In some implementations, a motion sensor 1110 can be utilized to detect movement and orientation of the device. For example, the motion sensor 1110 can include one or more accelerometers (e.g., to measure the acceleration experienced by the motion sensor 1110 and/or the architecture 1100 over a period of time), and/or one or more compasses or gyros (e.g., to measure the orientation of the motion sensor 1110 and/or the mobile device). In some cases, the measurement information obtained by the motion sensor 1110 can be in the form of one or more time-varying signals (e.g., a time-varying plot of an acceleration and/or an orientation over a period of time). Further, display objects or media may be presented according to a detected orientation (e.g., according to a "portrait" orientation or a "landscape" orientation). In some cases, a motion sensor 1110 can be directly integrated into a co-processor 1174 configured to process measurements obtained by the motion sensor 1110. For example, a co-processor 1174 can include one or more accelerometers, compasses, and/or gyroscopes, and can be configured to obtain sensor data from each of these sensors, process the sensor data, and transmit the processed data to the processor(s) 1104 for further analysis. - Other sensors may also be connected to the
peripherals interface 1106, such as a temperature sensor, a biometric sensor, or other sensing device, to facilitate related functionalities. As an example, as shown in FIG. 11, the architecture 1100 can include a heart rate sensor 11112 that measures the beats of a user's heart. Similarly, these other sensors also can be directly integrated into one or more co-processor(s) 1174 configured to process measurements obtained from those sensors. - A location processor 1115 (e.g., a GNSS receiver chip) can be connected to the peripherals interface 1106 to provide geo-referencing. An electronic magnetometer 1116 (e.g., an integrated circuit chip) can also be connected to the peripherals interface 1106 to provide data that may be used to determine the direction of magnetic North. Thus, the
electronic magnetometer 1116 can be used as an electronic compass. - A
camera subsystem 1120 and an optical sensor 1122 (e.g., a charged coupled device [CCD] or a complementary metal-oxide semiconductor [CMOS] optical sensor) can be utilized to facilitate camera functions, such as recording photographs and video clips. - Communication functions may be facilitated through one or
more communication subsystems 1124. The communication subsystem(s) 1124 can include one or more wireless and/or wired communication subsystems. For example, wireless communication subsystems can include radio frequency receivers and transmitters and/or optical (e.g., infrared) receivers and transmitters. As another example, wired communication subsystems can include a port device, e.g., a Universal Serial Bus (USB) port or some other wired port connection that can be used to establish a wired connection to other computing devices, such as other communication devices, network access devices, a personal computer, a printer, a display screen, or other processing devices capable of receiving or transmitting data. - The specific design and implementation of the
communication subsystem 1124 can depend on the communication network(s) or medium(s) over which the architecture 1100 is intended to operate. For example, the architecture 1100 can include wireless communication subsystems designed to operate over a global system for mobile communications (GSM) network, a GPRS network, an enhanced data GSM environment (EDGE) network, 802.x communication networks (e.g., Wi-Fi, Wi-Max), code division multiple access (CDMA) networks, NFC, and a Bluetooth™ network. The wireless communication subsystems can also include hosting protocols such that the architecture 1100 can be configured as a base station for other wireless devices. As another example, the communication subsystems may allow the architecture 1100 to synchronize with a host device using one or more protocols, such as, for example, the TCP/IP protocol, HTTP protocol, UDP protocol, and any other known protocol. - An
audio subsystem 1126 can be coupled to a speaker 1128 and one or more microphones 1130 to facilitate voice-enabled functions, such as voice recognition, voice replication, digital recording, and telephony functions. - An I/
O subsystem 1140 can include a touch controller 1142 and/or other input controller(s) 1144. The touch controller 1142 can be coupled to a touch surface 1146. The touch surface 1146 and the touch controller 1142 can, for example, detect contact and movement or break thereof using any of a number of touch sensitivity technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touch surface 1146. In one implementation, the touch surface 1146 can display virtual or soft buttons and a virtual keyboard, which can be used as an input/output device by the user. - Other input controller(s) 1144 can be coupled to other input/
control devices 1148, such as one or more buttons, rocker switches, thumb-wheel, infrared port, USB port, and/or a pointer device such as a stylus. The one or more buttons (not shown) can include an up/down button for volume control of the speaker 1128 and/or the microphone 11110. - In some implementations, the
architecture 1100 can present recorded audio and/or video files, such as MP3, AAC, and MPEG video files. In some implementations, the architecture 1100 can include the functionality of an MP3 player and may include a pin connector for tethering to other devices. Other input/output and control devices may be used. - A
memory interface 1102 can be coupled to a memory 1150. The memory 1150 can include high-speed random access memory or non-volatile memory, such as one or more magnetic disk storage devices, one or more optical storage devices, or flash memory (e.g., NAND, NOR). The memory 1150 can store an operating system 1152, such as Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks. The operating system 1152 can include instructions for handling basic system services and for performing hardware dependent tasks. In some implementations, the operating system 1152 can include a kernel (e.g., UNIX kernel). - The
memory 1150 can also store communication instructions 1154 to facilitate communicating with one or more additional devices, one or more computers or servers, including peer-to-peer communications. The communication instructions 1154 can also be used to select an operational mode or communication medium for use by the device, based on a geographic location (obtained by the GPS/Navigation instructions 1168) of the device. The memory 1150 can include graphical user interface instructions 1156 to facilitate graphic user interface processing, including a touch model for interpreting touch inputs and gestures; sensor processing instructions 1158 to facilitate sensor-related processing and functions; phone instructions 1160 to facilitate phone-related processes and functions; electronic messaging instructions 1162 to facilitate electronic-messaging related processes and functions; web browsing instructions 1164 to facilitate web browsing-related processes and functions; media processing instructions 1166 to facilitate media processing-related processes and functions; GPS/Navigation instructions 1169 to facilitate GPS and navigation-related processes; camera instructions 1170 to facilitate camera-related processes and functions; and other instructions 1172 for performing some or all of the processes described herein. - Each of the above identified instructions and applications can correspond to a set of instructions for performing one or more functions described herein. These instructions need not be implemented as separate software programs, procedures, or modules. The
memory 1150 can include additional instructions or fewer instructions. Furthermore, various functions of the device may be implemented in hardware and/or in software, including in one or more signal processing and/or application specific integrated circuits (ASICs). - The features described may be implemented in digital electronic circuitry or in computer hardware, firmware, software, or in combinations of them. The features may be implemented in a computer program product tangibly embodied in an information carrier, e.g., in a machine-readable storage device, for execution by a programmable processor; and method steps may be performed by a programmable processor executing a program of instructions to perform functions of the described implementations by operating on input data and generating output.
- The described features may be implemented advantageously in one or more computer programs that are executable on a programmable system including at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a data storage system, at least one input device, and at least one output device. A computer program is a set of instructions that may be used, directly or indirectly, in a computer to perform a certain activity or bring about a certain result. A computer program may be written in any form of programming language (e.g., Objective-C, Java), including compiled or interpreted languages, and it may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
- Suitable processors for the execution of a program of instructions include, by way of example, both general and special purpose microprocessors, and the sole processor or one of multiple processors or cores, of any kind of computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for executing instructions and one or more memories for storing instructions and data. Generally, a computer may communicate with mass storage devices for storing data files. These mass storage devices may include magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and optical disks. Storage devices suitable for tangibly embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory may be supplemented by, or incorporated in, ASICs (application-specific integrated circuits).
- To provide for interaction with a user, the features may be implemented on a computer having a display device, such as a CRT (cathode ray tube) or LCD (liquid crystal display) monitor for displaying information to the author, and a keyboard and a pointing device, such as a mouse or a trackball, by which the author may provide input to the computer.
- The features may be implemented in a computer system that includes a back-end component, such as a data server, or that includes a middleware component, such as an application server or an Internet server, or that includes a front-end component, such as a client computer having a graphical user interface or an Internet browser, or any combination of them. The components of the system may be connected by any form or medium of digital data communication, such as a communication network. Examples of communication networks include a LAN, a WAN, and the computers and networks forming the Internet.
- The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
- One or more features or steps of the disclosed embodiments may be implemented using an Application Programming Interface (API). An API may define one or more parameters that are passed between a calling application and other software code (e.g., an operating system, library routine, function) that provides a service, that provides data, or that performs an operation or a computation.
- The API may be implemented as one or more calls in program code that send or receive one or more parameters through a parameter list or other structure based on a call convention defined in an API specification document. A parameter may be a constant, a key, a data structure, an object, an object class, a variable, a data type, a pointer, an array, a list, or another call. API calls and parameters may be implemented in any programming language. The programming language may define the vocabulary and calling convention that a programmer will employ to access functions supporting the API.
- In some implementations, an API call may report to an application the capabilities of a device running the application, such as input capability, output capability, processing capability, power capability, communications capability, etc.
- As described above, some aspects of the subject matter of this specification include gathering and use of data available from various sources to improve services a mobile device can provide to a user. The present disclosure contemplates that in some instances, this gathered data may identify a particular location or an address based on device usage. Such personal information data can include location-based data, addresses, subscriber account identifiers, or other identifying information.
- The present disclosure further contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining personal information data private and secure. For example, personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection should occur only after receiving the informed consent of the users. Additionally, such entities would take any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices.
- In the case of advertisement delivery services, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, in the case of advertisement delivery services, the present technology can be configured to allow users to select to “opt in” or “opt out” of participation in the collection of personal information data during registration for services.
- Therefore, although the present disclosure broadly covers use of personal information data to implement one or more various disclosed embodiments, the present disclosure also contemplates that the various embodiments can also be implemented without the need for accessing such personal information data. That is, the various embodiments of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data. For example, content can be selected and delivered to users by inferring preferences based on non-personal information data or a bare minimum amount of personal information, such as the content being requested by the device associated with a user, other non-personal information available to the content delivery services, or publicly available information.
- A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made. Elements of one or more implementations may be combined, deleted, modified, or supplemented to form further implementations. As yet another example, the logic flows depicted in the figures do not require the particular order shown, or sequential order, to achieve desirable results. In addition, other steps may be provided, or steps may be eliminated, from the described flows, and other components may be added to, or removed from, the described systems.
- Accordingly, other implementations are within the scope of the following claims.
Claims (18)
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/942,018 US20230084356A1 (en) | 2021-09-10 | 2022-09-09 | Context Aware Fall Detection Using a Mobile Device |
| US18/617,381 US12361810B2 (en) | 2021-09-10 | 2024-03-26 | Context aware fall detection using a mobile device |
| US19/242,847 US20250316155A1 (en) | 2021-09-10 | 2025-06-18 | Context Aware Fall Detection Using a Mobile Device |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202163242998P | 2021-09-10 | 2021-09-10 | |
| US17/942,018 US20230084356A1 (en) | 2021-09-10 | 2022-09-09 | Context Aware Fall Detection Using a Mobile Device |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/617,381 Continuation US12361810B2 (en) | 2021-09-10 | 2024-03-26 | Context aware fall detection using a mobile device |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20230084356A1 true US20230084356A1 (en) | 2023-03-16 |
Family
ID=85284594
Family Applications (3)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/942,018 Abandoned US20230084356A1 (en) | 2021-09-10 | 2022-09-09 | Context Aware Fall Detection Using a Mobile Device |
| US18/617,381 Active US12361810B2 (en) | 2021-09-10 | 2024-03-26 | Context aware fall detection using a mobile device |
| US19/242,847 Pending US20250316155A1 (en) | 2021-09-10 | 2025-06-18 | Context Aware Fall Detection Using a Mobile Device |
Family Applications After (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/617,381 Active US12361810B2 (en) | 2021-09-10 | 2024-03-26 | Context aware fall detection using a mobile device |
| US19/242,847 Pending US20250316155A1 (en) | 2021-09-10 | 2025-06-18 | Context Aware Fall Detection Using a Mobile Device |
Country Status (4)
| Country | Link |
|---|---|
| US (3) | US20230084356A1 (en) |
| KR (1) | KR20230038121A (en) |
| CN (1) | CN115798143A (en) |
| DE (1) | DE102022209370A1 (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12361810B2 (en) | 2021-09-10 | 2025-07-15 | Apple Inc. | Context aware fall detection using a mobile device |
Citations (22)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20130090083A1 (en) * | 2011-10-07 | 2013-04-11 | Jason Paul DeMont | Personal Assistance Monitoring System |
| US20140378786A1 (en) * | 2013-03-15 | 2014-12-25 | Fitbit, Inc. | Multimode sensor devices |
| US20150061863A1 (en) * | 2013-09-03 | 2015-03-05 | Hti Ip, L.L.C. | Adaptive classification of fall detection for personal emergency response systems |
| US20150145662A1 (en) * | 2013-11-26 | 2015-05-28 | Hti Ip, L.L.C. | Using audio signals in personal emergency response systems |
| US20150199895A1 (en) * | 2012-07-13 | 2015-07-16 | iRezQ AB | Emergency notification within an alarm community |
| US20150213702A1 (en) * | 2014-01-27 | 2015-07-30 | Atlas5D, Inc. | Method and system for behavior detection |
| US20150221202A1 (en) * | 2014-02-04 | 2015-08-06 | Covidien Lp | Preventing falls using posture and movement detection |
| US20150269824A1 (en) * | 2014-03-18 | 2015-09-24 | Jack Ke Zhang | Techniques for emergency detection and emergency alert messaging |
| US20160210838A1 (en) * | 2015-01-16 | 2016-07-21 | City University Of Hong Kong | Monitoring user activity using wearable motion sensing device |
| US20180000385A1 (en) * | 2016-06-17 | 2018-01-04 | Blue Willow Systems Inc. | Method for detecting and responding to falls by residents within a facility |
| US20180279915A1 (en) * | 2015-09-28 | 2018-10-04 | Case Western Reserve University | Wearable and connected gait analytics system |
| US10147296B2 (en) * | 2016-01-12 | 2018-12-04 | Fallcall Solutions, Llc | System for detecting falls and discriminating the severity of falls |
| US10446017B1 (en) * | 2018-12-27 | 2019-10-15 | Daniel Gershoni | Smart personal emergency response systems (SPERS) |
| US20200342735A1 (en) * | 2017-09-29 | 2020-10-29 | Apple Inc. | Detecting Falls Using A Mobile Device |
| US20200409467A1 (en) * | 2019-06-25 | 2020-12-31 | Koninklijke Philips N.V. | Evaluating movement of a subject |
| US20210052198A1 (en) * | 2019-08-20 | 2021-02-25 | Koninklijke Philips N.V. | System and method of detecting falls of a subject using a wearable sensor |
| US20210056828A1 (en) * | 2018-03-09 | 2021-02-25 | Koninklijke Philips N.V. | Method and apparatus for detecting a fall by a user |
| US10978195B2 (en) * | 2014-09-02 | 2021-04-13 | Apple Inc. | Physical activity and workout monitor |
| US11020064B2 (en) * | 2017-05-09 | 2021-06-01 | LifePod Solutions, Inc. | Voice controlled assistance for monitoring adverse events of a user and/or coordinating emergency actions such as caregiver communication |
| US20210166545A1 (en) * | 2019-11-29 | 2021-06-03 | Koninklijke Philips N.V. | Fall detection method and system |
| US11170295B1 (en) * | 2016-09-19 | 2021-11-09 | Tidyware, LLC | Systems and methods for training a personalized machine learning model for fall detection |
| US20220198902A1 (en) * | 2020-12-22 | 2022-06-23 | Micron Technology, Inc. | Emergency assistance response |
Family Cites Families (39)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2006119186A2 (en) | 2005-05-02 | 2006-11-09 | University Of Virginia Patent Foundation | Systems, devices, and methods for interpreting movement |
| US20090040052A1 (en) * | 2007-08-06 | 2009-02-12 | Jeffry Michael Cameron | Assistance alert method and device |
| GB2467514A (en) | 2008-12-23 | 2010-08-04 | Univ Oxford Brookes | Gait monitor for sensing vertical displacement |
| US8805641B2 (en) | 2010-05-18 | 2014-08-12 | Intel-Ge Care Innovations Llc | Wireless sensor based quantitative falls risk assessment |
| CH703381B1 (en) | 2010-06-16 | 2018-12-14 | Myotest Sa | Integrated portable device and method for calculating biomechanical parameters of the stride. |
| US20130218053A1 (en) | 2010-07-09 | 2013-08-22 | The Regents Of The University Of California | System comprised of sensors, communications, processing and inference on servers and other devices |
| US8860570B2 (en) * | 2011-02-03 | 2014-10-14 | SenseTech, LLC | Portable wireless personal head impact reporting system |
| US8784274B1 (en) | 2011-03-18 | 2014-07-22 | Thomas C. Chuang | Athletic performance monitoring with body synchronization analysis |
| US20130023798A1 (en) | 2011-07-20 | 2013-01-24 | Intel-Ge Care Innovations Llc | Method for body-worn sensor based prospective evaluation of falls risk in community-dwelling elderly adults |
| US9165113B2 (en) | 2011-10-27 | 2015-10-20 | Intel-Ge Care Innovations Llc | System and method for quantitative assessment of frailty |
| US11016111B1 (en) | 2012-01-31 | 2021-05-25 | Thomas Chu-Shan Chuang | Stride monitoring |
| US9700241B2 (en) | 2012-12-04 | 2017-07-11 | Under Armour, Inc. | Gait analysis system and method |
| WO2014117252A1 (en) | 2013-02-01 | 2014-08-07 | Trusted Positioning Inc. | Method and system for varying step length estimation using nonlinear system identification |
| US20140343460A1 (en) | 2013-05-15 | 2014-11-20 | Ut-Battelle, Llc | Mobile gait force and motion analysis system |
| EP2997898B1 (en) | 2013-05-17 | 2018-08-22 | Kyocera Corporation | Electronic device, control program, control method, and system |
| CA2924835A1 (en) | 2013-09-19 | 2015-03-26 | Dorsavi Pty Ltd | Method and apparatus for monitoring quality of a dynamic activity of a body |
| US10307086B2 (en) | 2014-02-17 | 2019-06-04 | Hong Kong Baptist University | Gait measurement with 3-axes accelerometer/gyro in mobile devices |
| AU2015237956B2 (en) | 2014-03-25 | 2020-03-26 | Imeasureu Limited | Lower limb loading assessment systems and methods |
| US20160067547A1 (en) | 2014-09-04 | 2016-03-10 | Tagit Labs, Inc. | Methods and systems for automatic adverse event detection and alerting |
| US20160095539A1 (en) | 2014-10-02 | 2016-04-07 | Zikto | Smart band, body balance measuring method of the smart band and computer-readable recording medium comprising program for performing the same |
| US9974478B1 (en) | 2014-12-19 | 2018-05-22 | Great Lakes Neurotechnologies Inc. | Discreet movement measurement and cueing system for improvement of safety and efficacy of movement |
| US10216905B2 (en) | 2015-01-28 | 2019-02-26 | Google Llc | Health state trends for a consistent patient situation |
| US10716495B1 (en) | 2016-03-11 | 2020-07-21 | Fortify Technologies, LLC | Accelerometer-based gait analysis |
| EP3257437A1 (en) | 2016-06-13 | 2017-12-20 | Friedrich-Alexander-Universität Erlangen-Nürnberg | Method and system for analyzing human gait |
| DE102016210505A1 (en) | 2016-06-14 | 2017-03-02 | Robert Bosch Gmbh | System for monitoring a natural athlete and method of operating the system |
| CN106530611A (en) * | 2016-09-28 | 2017-03-22 | 北京奇虎科技有限公司 | Terminal, and method and apparatus of detecting fall of human body |
| US20180235516A1 (en) | 2017-02-17 | 2018-08-23 | Veristride Inc. | Method and System for Determining Step Length |
| CN106875630B (en) * | 2017-03-13 | 2018-12-04 | 中国科学院计算技术研究所 | Wearable fall detection method and system based on hierarchical classification |
| WO2019018371A1 (en) | 2017-07-17 | 2019-01-24 | The University Of North Carolina At Chapel Hill Office Of Technology Commercialization | Methods, systems, and non-transitory computer readable media for assessing lower extremity movement quality |
| US11282361B2 (en) * | 2017-09-29 | 2022-03-22 | Apple Inc. | Detecting falls using a mobile device |
| US10629048B2 (en) * | 2017-09-29 | 2020-04-21 | Apple Inc. | Detecting falls using a mobile device |
| KR20200102805A (en) * | 2019-02-22 | 2020-09-01 | 한국전자통신연구원 | System and method for preventing fall by switching mode |
| EP3796282A3 (en) * | 2019-07-29 | 2021-05-26 | Qolware GmbH | Device, system and method for fall detection |
| CA3152999A1 (en) | 2019-09-06 | 2021-03-11 | University Of Miami | Quantification of symmetry and repeatability in limb motion for treatment of abnormal motion patterns |
| US10842415B1 (en) | 2019-10-25 | 2020-11-24 | Plethy, Inc. | Devices, systems, and methods for monitoring and assessing gait, stability, and/or balance of a user |
| IL298405A (en) | 2020-05-26 | 2023-01-01 | Regeneron Pharma | Gait analysis system |
| US20210393166A1 (en) | 2020-06-23 | 2021-12-23 | Apple Inc. | Monitoring user health using gait analysis |
| CN111887859A (en) * | 2020-08-05 | 2020-11-06 | 安徽华米智能科技有限公司 | Fall behavior recognition method and device, electronic device and medium |
| DE102022209370A1 (en) | 2021-09-10 | 2023-03-16 | Apple Inc. | CONTEXTUAL FALL DETECTION WITH A MOBILE DEVICE |
- 2022
  - 2022-09-08 DE DE102022209370.4A patent/DE102022209370A1/en active Pending
  - 2022-09-08 KR KR1020220114256A patent/KR20230038121A/en active Pending
  - 2022-09-09 US US17/942,018 patent/US20230084356A1/en not_active Abandoned
  - 2022-09-09 CN CN202211104264.2A patent/CN115798143A/en active Pending
- 2024
  - 2024-03-26 US US18/617,381 patent/US12361810B2/en active Active
- 2025
  - 2025-06-18 US US19/242,847 patent/US20250316155A1/en active Pending
Patent Citations (22)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20130090083A1 (en) * | 2011-10-07 | 2013-04-11 | Jason Paul DeMont | Personal Assistance Monitoring System |
| US20150199895A1 (en) * | 2012-07-13 | 2015-07-16 | iRezQ AB | Emergency notification within an alarm community |
| US20140378786A1 (en) * | 2013-03-15 | 2014-12-25 | Fitbit, Inc. | Multimode sensor devices |
| US20150061863A1 (en) * | 2013-09-03 | 2015-03-05 | Hti Ip, L.L.C. | Adaptive classification of fall detection for personal emergency response systems |
| US20150145662A1 (en) * | 2013-11-26 | 2015-05-28 | Hti Ip, L.L.C. | Using audio signals in personal emergency response systems |
| US20150213702A1 (en) * | 2014-01-27 | 2015-07-30 | Atlas5D, Inc. | Method and system for behavior detection |
| US20150221202A1 (en) * | 2014-02-04 | 2015-08-06 | Covidien Lp | Preventing falls using posture and movement detection |
| US20150269824A1 (en) * | 2014-03-18 | 2015-09-24 | Jack Ke Zhang | Techniques for emergency detection and emergency alert messaging |
| US10978195B2 (en) * | 2014-09-02 | 2021-04-13 | Apple Inc. | Physical activity and workout monitor |
| US20160210838A1 (en) * | 2015-01-16 | 2016-07-21 | City University Of Hong Kong | Monitoring user activity using wearable motion sensing device |
| US20180279915A1 (en) * | 2015-09-28 | 2018-10-04 | Case Western Reserve University | Wearable and connected gait analytics system |
| US10147296B2 (en) * | 2016-01-12 | 2018-12-04 | Fallcall Solutions, Llc | System for detecting falls and discriminating the severity of falls |
| US20180000385A1 (en) * | 2016-06-17 | 2018-01-04 | Blue Willow Systems Inc. | Method for detecting and responding to falls by residents within a facility |
| US11170295B1 (en) * | 2016-09-19 | 2021-11-09 | Tidyware, LLC | Systems and methods for training a personalized machine learning model for fall detection |
| US11020064B2 (en) * | 2017-05-09 | 2021-06-01 | LifePod Solutions, Inc. | Voice controlled assistance for monitoring adverse events of a user and/or coordinating emergency actions such as caregiver communication |
| US20200342735A1 (en) * | 2017-09-29 | 2020-10-29 | Apple Inc. | Detecting Falls Using A Mobile Device |
| US20210056828A1 (en) * | 2018-03-09 | 2021-02-25 | Koninklijke Philips N.V. | Method and apparatus for detecting a fall by a user |
| US10446017B1 (en) * | 2018-12-27 | 2019-10-15 | Daniel Gershoni | Smart personal emergency response systems (SPERS) |
| US20200409467A1 (en) * | 2019-06-25 | 2020-12-31 | Koninklijke Philips N.V. | Evaluating movement of a subject |
| US20210052198A1 (en) * | 2019-08-20 | 2021-02-25 | Koninklijke Philips N.V. | System and method of detecting falls of a subject using a wearable sensor |
| US20210166545A1 (en) * | 2019-11-29 | 2021-06-03 | Koninklijke Philips N.V. | Fall detection method and system |
| US20220198902A1 (en) * | 2020-12-22 | 2022-06-23 | Micron Technology, Inc. | Emergency assistance response |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12361810B2 (en) | 2021-09-10 | 2025-07-15 | Apple Inc. | Context aware fall detection using a mobile device |
Also Published As
| Publication number | Publication date |
|---|---|
| US12361810B2 (en) | 2025-07-15 |
| US20240233507A1 (en) | 2024-07-11 |
| US20250316155A1 (en) | 2025-10-09 |
| DE102022209370A1 (en) | 2023-03-16 |
| KR20230038121A (en) | 2023-03-17 |
| CN115798143A (en) | 2023-03-14 |
Similar Documents
| Publication | Title |
|---|---|
| JP7261284B2 (en) | Fall detection using mobile devices |
| US12380789B2 (en) | Detecting falls using a mobile device |
| US11282361B2 (en) | Detecting falls using a mobile device |
| US20240315601A1 (en) | Monitoring user health using gait analysis |
| US11282362B2 (en) | Detecting falls using a mobile device |
| US11282363B2 (en) | Detecting falls using a mobile device |
| US20250316155A1 (en) | Context Aware Fall Detection Using a Mobile Device |
| US10024876B2 (en) | Pedestrian velocity estimation |
| CN113936420B (en) | Detecting falls using a mobile device |
| US20220095954A1 (en) | A foot mounted wearable device and a method to operate the same |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: APPLE INC., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:VENKATESWARAN, SRIRAM;DEHLEH HOSSEIN ZADEH, PARISA;MAJJIGI, VINAY R.;AND OTHERS;SIGNING DATES FROM 20220907 TO 20220919;REEL/FRAME:061137/0349 |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE |