CN113126752A - Method for controlling a display - Google Patents
Method for controlling a display
- Publication number
- CN113126752A (application CN202011616222.8A)
- Authority
- CN
- China
- Prior art keywords
- sensor data
- display
- user
- activity type
- activity
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/147—Digital output to display device ; Cooperation and interconnection of the display device with other functional units using display panels
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/318—Heart-related electrical modalities, e.g. electrocardiography [ECG]
- A61B5/339—Displays specially adapted therefor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1118—Determining activity level
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1123—Discriminating type of movement, e.g. walking or running
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/681—Wristwatch-type devices
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B24/00—Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3206—Monitoring of events, devices or parameters that trigger a change in power modality
- G06F1/3231—Monitoring the presence, absence or movement of users
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3234—Power saving characterised by the action undertaken
- G06F1/325—Power saving in peripheral device
- G06F1/3265—Power saving in display device
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1407—General aspects irrespective of display type, e.g. determination of decimal point position, display with fixed or driving decimal point, suppression of non-significant zeros
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
- G06V40/25—Recognition of walking or running movements, e.g. gait recognition
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72448—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
- H04M1/72454—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2503/00—Evaluating a particular growth phase or type of persons or animals
- A61B2503/10—Athletes
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/02—Networking aspects
- G09G2370/022—Centralised management of display operation, e.g. in a server instead of locally
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2380/00—Specific applications
- G09G2380/08—Biomedical applications
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Animal Behavior & Ethology (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Pathology (AREA)
- Biophysics (AREA)
- Veterinary Medicine (AREA)
- Computer Hardware Design (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Dentistry (AREA)
- Physiology (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Environmental & Geological Engineering (AREA)
- Cardiology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- Multimedia (AREA)
- Physical Education & Sports Medicine (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
- Control Of Indicators Other Than Cathode Ray Tubes (AREA)
Abstract
A method and an apparatus for controlling a display are presented. The apparatus comprises: at least one processor; and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus at least to perform: receiving an activity type of a user; receiving sensor data; determining at least one measurement value based on the sensor data; detecting at least one first activity type specific change in the at least one measurement value; and activating the display in response to detecting the at least one first activity type specific change in the at least one measurement value.
Description
Technical Field
The present invention generally relates to controlling (e.g., activating and deactivating) a display of a device.
Background
Today, many people carry wearable devices, such as activity trackers or smart watches. The display of such a device may needlessly consume battery power if it is active when it does not need to be.
Disclosure of Invention
According to some aspects, the subject matter of the independent claims is presented. Some embodiments are defined in the dependent claims. The scope of protection of various embodiments is set forth by the independent claims. Embodiments, examples and features (if any) described herein that do not fall within the scope of the independent claims should be construed as examples useful for understanding the various embodiments.
According to a first aspect of the invention, there is provided an apparatus comprising: at least one processor; and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus at least to perform: receiving an activity type of a user; receiving sensor data; determining at least one measurement value based on the sensor data; detecting at least one first activity type specific change in at least one measurement value; and activating the display in response to detecting at least one first activity type specific change in the at least one measurement.
According to one embodiment, detecting at least one first activity type specific change in the at least one measurement value comprises: detecting a change in the measurement; and comparing the detected change with an activity type specific reference value.
According to one embodiment, the activity type specific changes are pre-learned.
According to one embodiment, the apparatus is capable of performing: receiving one or more user-specific display activation rules; and updating the activity-type specific reference value in accordance with the one or more user-specific display activation rules.
According to one embodiment, the apparatus is capable of performing: receiving one or more user-specific display activation rules; and activating the display further in accordance with the one or more user-specific display activation rules.
According to one embodiment, the apparatus is capable of performing: detecting a second activity-type specific change in the measurement values; and deactivating the display in response to detecting the second activity type-specific change in the measurement.
According to one embodiment, the activity type is athletic activity, daily use, or sleep.
According to an embodiment, the sensor data comprises one or more of acceleration sensor data, gyroscope data, altimeter data, heart rate sensor data, ambient light sensor data and position sensor data.
According to an embodiment, the at least one measurement is one or more of acceleration, velocity, distance, inclination, altitude, heart rate, ambient light and position.
According to a second aspect of the invention, there is provided a method comprising: receiving an activity type of a user; receiving sensor data; determining at least one measurement value based on the sensor data; detecting at least one first activity type specific change in at least one measurement value; and activating the display in response to detecting at least one first activity type specific change in the at least one measurement.
Many embodiments of the second aspect may include at least one feature from the following list:
-detecting at least one first activity type specific change in the at least one measurement value comprises: detecting a change in the measurement; and comparing the detected change with an activity-type specific reference value
-activity type specific changes are pre-learned
-the method further comprises: receiving one or more user-specific display activation rules; and updating the activity-type specific reference value in accordance with the one or more user-specific display activation rules
-the method further comprises: receiving one or more user-specific display activation rules; and activating the display further according to the one or more user-specific display activation rules
-the method further comprises: detecting a second activity-type specific change in the measurement values; and deactivating the display in response to detecting a second activity type specific change in the measurement value
-the type of activity is athletic activity, daily use or sleep
-the sensor data comprises one or more of acceleration sensor data, gyroscope data, altimeter data, heart rate sensor data, ambient light sensor data and position sensor data
-the at least one measurement value is one or more of acceleration, velocity, distance, inclination, altitude, heart rate, ambient light and position.
According to a third aspect of the invention, there is provided a non-transitory computer readable medium comprising program instructions which, when executed by at least one processor, enable an apparatus to at least perform: receiving an activity type of a user; receiving sensor data; determining at least one measurement value based on the sensor data; detecting at least one first activity type specific change in at least one measurement value; and activating the display in response to detecting at least one first activity type specific change in the at least one measurement.
According to a fourth aspect of the present invention, there is provided a computer program configured to cause the method according to the second aspect to be performed.
According to a fifth aspect of the present invention there is provided an apparatus comprising means for performing the method according to the second aspect. The means may comprise: at least one processor; and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to perform the method.
Drawings
FIG. 1 shows an exemplary system;
FIG. 2A illustrates one embodiment of a plurality of sequences of sensor data elements;
FIG. 2B shows a second embodiment of a plurality of sequences of sensor data elements;
FIG. 3 shows a block diagram of an exemplary device;
FIG. 4 shows a flow diagram of an exemplary method for activating a display;
FIG. 5A shows, as an example, a flow diagram of display activation and deactivation in relation to activity types;
FIG. 5B shows a flowchart of an exemplary activity type-dependent display activation and deactivation;
FIG. 6 shows a block diagram of an exemplary device; and
Fig. 7 shows a block diagram of an exemplary device.
Detailed Description
Fig. 1 shows a system 100 as an example. The system includes a device 110, which may include, for example, a smart watch, electronic watch, smart phone, tablet device, or other type of suitable device. Device 110 may include a display, which may include a touch screen, for example. The size of the display may be limited. The device 110 may be powered by a rechargeable battery, for example. One example of a display of limited size is a display worn on the wrist.
The device 110 may be configured to receive satellite positioning information from a constellation of satellites 150 via a satellite link 151. The satellite constellation may include, for example, the Global Positioning System (GPS) or the Galileo constellation. Although only one satellite is shown in fig. 1 for clarity, satellite constellation 150 may include more than one satellite. Similarly, receiving positioning information over satellite link 151 may include receiving data from more than one satellite.
Instead of, or in addition to, receiving data from a constellation of satellites, device 110 may obtain positioning information by interacting with a network that includes base stations 120. For example, a cellular network may locate a device in various ways, such as by trilateration, multilateration, or based on the identity of the base station to which the device is or may be connected. Similarly, a non-cellular base station or access point may know its own location and provide it to device 110, enabling device 110 to position itself to within the communication range of that access point.
For example, device 110 may be configured to obtain the current time from satellite constellation 150, base station 120, or by a user requesting the current time. When device 110 has an estimate of the current time and its location, device 110 may, for example, consult a look-up table to determine the time remaining until, for example, a sunset or sunrise. Similarly, the device 110 may acquire knowledge of the time of year.
The acceleration sensor or motion sensor may comprise, for example, an Inertial Measurement Unit (IMU) of 6 degrees of freedom (DoF) or 9 DoF. The acceleration sensor may include, for example, a 3D digital accelerometer and a 3D digital gyroscope. Full-scale acceleration ranges of ±2/±4/±8/±16 g and angular rates of ±125/±250/±500/±1000/±2000/±4000 degrees per second (dps) can be supported. The acceleration sensor may comprise a 3D magnetometer.
The device 110 may be configured to provide an activity session. The activity session may be associated with an activity type. The activity type may be a sporting activity. Examples of activity types include rowing, riding, jogging, walking, hunting, swimming, and paragliding. In a simple form, an activity session may include device 110 storing sensor data generated by a sensor included in device 110, or by a sensor included in another device associated with or paired with device 110. The start and end of the activity session may be determined at some point in time, such that the determination takes place after, or simultaneously with, the start and/or the end. In other words, device 110 may store sensor data to enable subsequent identification of an activity session based at least in part on the stored sensor data.
The activity type may be determined based at least in part on the sensor data. This determination may be made at or after the occurrence of the activity, at the time of analyzing the sensor data. For example, the activity type may be determined by device 110 or a personal computer that may access the sensor data, or a server that may access the sensor data. In the case where the server has access to the sensor data, the sensor data may be anonymized. The determination of the type of activity may include comparing the sensor data to reference data. The reference data may include reference data sets, each reference data set associated with an activity type. The determination may comprise determining a reference data set that is most similar to the sensor data, e.g. in a least squares sense. Instead of the sensor data itself, a processed form of the sensor data may be compared with the reference data. The processed form may include, for example, a spectrum obtained from the sensor data. Alternatively, the processed form may include a set of local minima and/or maxima from the sensor data time series. The determined activity type may be selected as the activity type associated with the reference data set that is most similar to the processed or raw sensor data.
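The comparison of processed sensor data with activity-specific reference data sets described above can be sketched, for example, as follows. This is a minimal illustration in Python, assuming the processed form is a normalised magnitude spectrum and that reference_spectra (a hypothetical name) already holds one reference spectrum per activity type; it is not an implementation prescribed by this description.
```python
import numpy as np

def estimate_activity_type(accel_magnitude, reference_spectra):
    """Return the activity type whose reference spectrum is closest (least squares).

    accel_magnitude: 1-D array of acceleration magnitude samples for the session.
    reference_spectra: dict mapping activity type -> reference magnitude spectrum,
                       each the same length as the spectrum computed below.
    """
    # Processed form of the sensor data: a normalised magnitude spectrum.
    spectrum = np.abs(np.fft.rfft(accel_magnitude - np.mean(accel_magnitude)))
    if spectrum.max() > 0:
        spectrum = spectrum / spectrum.max()

    best_type, best_error = None, float("inf")
    for activity, reference in reference_spectra.items():
        # Least-squares distance between the measured and the reference spectrum.
        error = float(np.sum((spectrum - np.asarray(reference)) ** 2))
        if error < best_error:
            best_type, best_error = activity, error
    return best_type
```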
Different activity types may be associated with different characteristic frequencies. For example, when the user is already running, rather than walking, the acceleration sensor data may reflect a higher characteristic frequency. Thus, in some embodiments, the determination of the type of activity may be based at least in part on determining which reference data set has a characteristic frequency that most closely matches a characteristic frequency of a portion of the time series of sensor-derived information under study. Alternatively or additionally, acceleration sensor data may be employed to determine a characteristic motion amplitude.
Where device 110 is configured to store time series of more than one type of sensor data, multiple sensor data types may be employed in determining the activity type. The reference data may comprise reference data sets that are multi-sensory in nature, in that each reference data set includes data that can be compared to each available sensor data type. For example, where the device 110 is configured to compile time series of acceleration and sound sensor data types, the reference data may include a plurality of reference data sets, each corresponding to an activity type, where each reference data set includes data that may be compared to the acceleration data and data that may be compared to the sound data. The activity type may be determined as the activity type associated with the multi-sensory reference data set that most closely matches the sensor data stored by the device 110. As above, either raw or processed sensor data may be compared to the reference data sets. Where the device 110 comprises a smartphone, for example, the device may include a plurality of sensors used for smartphone functions. Examples of such sensors include a microphone used for voice calls and a camera used for video calls. Further, in some cases, a radio receiver may be configured to measure electric or magnetic field characteristics. In general, the device 110 may include a radio receiver, where the device 110 has wireless communication capabilities.
A first example of multi-sensory activity type determination is hunting, where the device 110 stores a first type of sensor data comprising acceleration sensor data and a second type of sensor data comprising sound data. The reference data would include a hunting reference data set that comprises acceleration reference data and sound reference data, to enable comparison with the sensor data stored by the device 110. Hunting may involve phases of low sound and low acceleration, as well as intermittent combinations of loud, short sounds and low-amplitude, high-frequency acceleration corresponding to a gunshot and its kick.
A second example of multi-sensory activity type determination is swimming, where device 110 stores a first type of sensor data comprising humidity sensor data and a second type of sensor data comprising magnetic field data from a compass sensor. The reference data would comprise a swimming reference data set comprising humidity reference data and magnetic field reference data, to enable comparison with the sensor data stored by the device 110. Swimming may involve high humidity caused by immersion in water, and an elliptical motion of the arm to which the device 110 is attached, which motion may be detected as periodically varying magnetic field data. In other words, from the point of view of the magnetic field compass sensor, the direction of the earth's magnetic field changes periodically in the time series.
In general, a determined or derived activity type may be considered an estimated activity type until the user confirms that the determination is correct. In some embodiments, some (e.g., two or three) of the most likely activity types may be provided to the user as estimated activity types for the user to select from. Using two or more types of sensor data may increase the likelihood that the estimated type of activity is correct.
Contextual processing may be employed in deriving the estimated activity type based on the sensor data. The context processing may include first determining a context in which the sensor data has been generated. For example, context processing may include using sensor data to determine a context, such as a user context, and then deriving an activity type within the context. For example, the context may include outdoor activities, and deriving the estimated activity type may include: first determining that a user is in an outdoor context based on sensor data; selecting outdoor context machine-readable instructions; and using the machine-readable instructions to distinguish between different types of outdoor contextual activities (e.g., jogging and orienteering). As another example, the context may include indoor activity, and deriving the estimated activity type may include: first determining that a user is in an indoor context based on sensor data; selecting indoor context machine-readable instructions; and using the machine-readable instructions to distinguish between different types of indoor activity (e.g., a 100-meter run and wrestling).
The machine-readable instructions may include, for example, scripts (e.g., executable or compilable scripts), executable computer programs, software plug-ins, or non-executable computer-readable descriptors that enable device 110 to distinguish at least two activity types in a determined context. The machine readable instructions may include an indication of which type or types of sensor data and which format to use when using the machine readable instructions to derive the activity type.
Determining an outdoor context may include determining that the sensor data indicates a wide range of geographic motion, indicating that the user has roamed outdoors. Determining an indoor context may include determining that the sensor data indicates a narrow range of geographic motion, indicating that the user has remained within a narrow range during the activity session. Where temperature-type sensor data is available, lower temperatures may be associated with outdoor activities and higher temperatures with indoor activities. Temperature may be indicative in particular in geographical areas where winter, autumn or spring conditions result in the outdoor temperature being lower than the indoor temperature. The geographical area may be derived from the positioning data.
Thus, in some embodiments, deriving the estimated activity type is a two-stage process, including: a context is first determined based on the sensor data, and then an estimated activity type is derived in the context using machine-readable instructions specific to the context. Selecting a context and/or a type of activity within a context may include comparing sensor data or processed sensor data to reference data. The two-phase process may employ two types of reference data, namely context-type reference data and activity-type reference data, respectively.
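The two-stage, context-then-activity derivation may be sketched, for example, as follows. The dictionary structures, the names context_references and activity_references, and the least-squares matching are illustrative assumptions; any suitable matching against the two types of reference data could be substituted.
```python
import numpy as np

def closest_match(reference_sets, processed_data):
    """Return the key of the reference set most similar to the data (least squares)."""
    return min(reference_sets,
               key=lambda k: float(np.sum((np.asarray(processed_data)
                                           - np.asarray(reference_sets[k])) ** 2)))

def derive_activity_type(processed_data, context_references, activity_references):
    """Two-stage derivation: context first, then the activity within that context.

    context_references: dict context -> context-type reference data.
    activity_references: dict context -> dict activity -> activity-type reference data.
    """
    context = closest_match(context_references, processed_data)   # e.g. "outdoor"
    within_context = activity_references[context]                 # context-specific references
    activity = closest_match(within_context, processed_data)      # e.g. "jogging"
    return context, activity
```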
The sensor data may be processed into a sequence of tags to determine the type of activity. The tag sequence may characterize the content of the sensor data. For example, where the sensor data elements are a plurality of values obtained during a run, the sequence of tags derived from those sensor data elements may comprise a series of tags: {jogging step, jogging step, jogging step, …}. Similarly, where the sensor data elements are a plurality of values obtained during a long jump, the sequence of tags derived from the sensor data elements may comprise a series of tags: {sprint step, sprint step, …, jump, stop}. Similarly, where the sensor data elements are a plurality of values obtained during a triple jump, the sequence of tags derived from the sensor data elements may comprise a series of tags: {sprint step, sprint step, …, jump, jump, jump, stop}. Thus, the tag sequence can be used to identify the type of activity, for example to distinguish a long jump from a triple jump based on the number of jumps.
The tags may be expressed in natural language or as indices into a predefined table, which may be dynamically updated as new exercise primitive categories become known. For example, in a predefined table, a jogging step may be denoted 01, a sprint step (i.e., a running step that is much faster than a jogging step) may be denoted 02, a jump may be denoted 03, and stopping may be denoted 04. Thus, a triple jump can be represented as the sequence of tags {02, 02, 02, 02, 03, 03, 03, 04}. The activity (e.g., a triple jump) can be detected from the tags, while the sequence of tags takes up much less space than the original sequence of sensor data elements.
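As a small illustration of how such a tag sequence can be used, the following sketch assumes the hypothetical primitive table above (01 jogging step, 02 sprint step, 03 jump, 04 stop) and distinguishes a long jump from a triple jump simply by counting jump tags.
```python
# Hypothetical primitive table from the description.
PRIMITIVES = {1: "jogging step", 2: "sprint step", 3: "jump", 4: "stop"}

def classify_jump_sequence(tags):
    """Distinguish a long jump from a triple jump by the number of jump tags."""
    jumps = tags.count(3)
    if jumps == 1:
        return "long jump"
    if jumps == 3:
        return "triple jump"
    return "unknown"

print(classify_jump_sequence([2, 2, 2, 2, 3, 3, 3, 4]))   # -> "triple jump"
```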
To process the sequence of sensor data elements into a sequence of tags, a sensor data segment may be derived from the sequence of sensor data elements. Each piece of sensor data may then be associated with a motion primitive and assigned a tag to obtain a sequence of tags. Each sensor data segment may include a time-aligned sub-sequence of sensor data elements from at least two of the sequences of sensor data elements. In other words, the sensor data segments are derived such that each such segment comprises a time slice of the original sequence of sensor data elements. This can be conceptualized as cutting the multi-sensor data stream acquired during jogging into individual steps that make up the jogging session in time. Similarly, other activity sessions may also be time-sliced into the exercise primitives that make up the activity.
To derive the segments, the device 110 or another device may be configured to analyze the sequence of sensor data elements to identify units therein. Each segment may comprise slices of the sequence of sensor data elements that are time aligned, i.e. obtained simultaneously from the respective sensors.
For example, running steps are repetitive in nature, so a pattern that repeats at a certain frequency in a sequence of sensor data elements is a clue from which the sequence can be segmented. The frequency may be identified, for example, by performing a Fast Fourier Transform (FFT) on each sequence of sensor data elements and then averaging the resulting frequency spectra to obtain an overall frequency characteristic of the sequences of sensor data elements.
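A minimal sketch of this frequency-based segmentation cue, assuming NumPy, equally long sequences and a known sampling rate, could look as follows; the function name and parameters are illustrative only.
```python
import numpy as np

def repetition_frequency(sequences, sample_rate_hz):
    """Estimate the dominant repetition frequency of a set of sensor sequences.

    sequences: list of equally long 1-D arrays (e.g. x, y and z acceleration).
    Returns the frequency (Hz) of the strongest non-DC peak of the averaged
    magnitude spectrum, which can then be used to slice the data into steps.
    """
    spectra = [np.abs(np.fft.rfft(np.asarray(s) - np.mean(s))) for s in sequences]
    mean_spectrum = np.mean(spectra, axis=0)                 # average over sequences
    freqs = np.fft.rfftfreq(len(sequences[0]), d=1.0 / sample_rate_hz)
    peak = np.argmax(mean_spectrum[1:]) + 1                  # skip the DC bin
    return freqs[peak]
```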
In the case of motion, one method of segmenting the sensor data is to attempt to construct the relative trajectory of the sensor device. One way to estimate the trajectory is to integrate the x, y and z components of the acceleration sensor output twice. In this process, the bias caused by gravity can be eliminated. Mathematically, this can be done by calculating a baseline for each output. One method is to filter the data according to the following equation:
acc_i_baseline = acc_i_baseline + coeff_a * (acc_i - acc_i_baseline)
Above, acc refers to the acceleration measurement and i refers to its components x, y and z. These filtered values may be subtracted from the actual measured values: acc_i_without_G = acc_i - acc_i_baseline. This is a rough estimate of the true linear acceleration, but a fast and reliable estimation method. Integration of these linear acceleration values yields an estimate of the velocity of the sensor device in three-dimensional (3D) space. Since the estimation of the linear acceleration is imperfect, the velocity components are biased. These biases can be eliminated as in the previous equation:
v_i_baseline = v_i_baseline + coeff_v * (v_i - v_i_baseline)
Above, v refers to the velocity estimate and i refers to its components x, y and z. These velocity components are not the true velocity of the sensor device, but estimates that can be calculated easily and reliably. Prior to integration, the baseline component may be subtracted from the velocity estimate: v_i_wo_bias = v_i - v_i_baseline. Since this method is imperfect, the integration of the velocity components results in biased position estimates p_x, p_y and p_z. These biases can again be eliminated as in the previous equations:
p_i_baseline = p_i_baseline + coeff_p * (p_i - p_i_baseline)
Above, p refers to the position estimate and i refers to its components. Since this process effectively produces a zero mean, the natural references for the position are p_x_ref = 0, p_y_ref = 0 and p_z_ref = 0. The measured Euclidean distances sqrt(p_x_ti^2 + p_y_ti^2 + p_z_ti^2) form a time series running from 0 to some maximum, where ti refers to an index in the time series. These maxima are easily detectable. A segment starts at the moment of a maximum and ends at the next maximum (which also starts the next segment). The detection of a maximum may be conditional, i.e. a maximum is accepted as a start/stop mark only when it exceeds a certain level.
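The baseline filtering, double integration and conditional maximum detection described above may be sketched, for example, as follows. The coefficient values, the acceptance level min_dist and the function name are illustrative assumptions, not values prescribed by this description.
```python
import numpy as np

def segment_boundaries(acc, dt, coeff_a=0.01, coeff_v=0.01, coeff_p=0.01, min_dist=0.05):
    """Return indices of relative-trajectory maxima used as segment start/stop marks.

    acc: (N, 3) array of raw accelerometer samples (x, y, z); dt: sample interval in s.
    """
    acc_base = np.zeros(3)
    vel = np.zeros(3); vel_base = np.zeros(3)
    pos = np.zeros(3); pos_base = np.zeros(3)
    boundaries, prev_dist, rising = [], 0.0, False

    for t, a in enumerate(acc):
        acc_base += coeff_a * (a - acc_base)          # baseline removes gravity / bias
        vel += (a - acc_base) * dt                    # integrate linear acceleration
        vel_base += coeff_v * (vel - vel_base)
        pos += (vel - vel_base) * dt                  # integrate de-biased velocity
        pos_base += coeff_p * (pos - pos_base)
        p = pos - pos_base                            # zero-mean position estimate
        dist = float(np.sqrt(np.sum(p ** 2)))         # Euclidean distance from reference 0

        if dist < prev_dist and rising and prev_dist > min_dist:
            boundaries.append(t - 1)                  # conditional local maximum
        rising = dist > prev_dist
        prev_dist = dist
    return boundaries
```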
Furthermore, the above process of calculating the relative trajectory can be made more accurate by using a gyroscope and, for example, complementary filtering.
Other methods for segmenting the data (i.e., deriving the segments) may include, for example, fitting to a periodic model, using a suitably trained artificial neural network, or using a separate segmentation signal provided over a radio or wired interface. The segmentation signal may be associated in time with the sequences of sensor data elements to obtain the segmentation. The segmentation signal may be transmitted or provided by, for example, a video recognition system or a pressure pad system. Such a video recognition system may, for example, be configured to recognize steps.
Once the segments are derived, each segment may be assigned a tag. Assigning the tag may include identifying the segment. The identifying may include comparing the sensor data contained in the segment with a library of reference segments, for example in the least-squares sense, and selecting the reference segment from the library that is most similar to the segment to be tagged. The tag assigned to that segment then becomes the tag associated with the most similar reference segment in the library of reference segments.
In some embodiments, multiple reference segment libraries are used, such that the first stage of identification is the selection of a reference segment library. For example, where two reference segment libraries are used, one of them may be for continuous activity types and the other for non-continuous activity types. Where the sequence of sensor data elements reflects a repetitive motion that repeats a large number of times (e.g., jogging, walking, cycling, or rowing), a continuous activity type is selected. Where the activity is characterized by brief sequences of actions separated in time from one another (e.g., the triple jump described above, or the pole vault), a non-continuous activity type is selected. After the reference segment library has been selected, all segments are tagged with tags from the selected library.
An advantage of first selecting the reference segment library is that tagging can be done more reliably, since there is less risk of assigning a wrong tag to a segment. The chance of selecting the correct reference segment is increased, since the number of reference segments to compare with the sensor data segments is smaller.
Once the segments have been tagged, a syntax check can be performed, in which the tag sequence is evaluated to determine whether it is meaningful. For example, if the tag sequence is consistent with a known activity type, the syntax check is passed. On the other hand, if the sequence of tags includes tags that do not fit together, a syntax error may be generated. As an example, a sequence of jogging steps that includes some rowing actions would produce a syntax error, because the user cannot in practice jog and row at the same time. In some embodiments, if the number of non-fitting tags in the tag sequence is very small, e.g., less than 2%, the syntax error can be resolved by removing the non-fitting tags from the tag sequence.
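A minimal sketch of such a syntax check, assuming the set of mutually fitting tags is available (for example from the reference segment library, as noted below), might look as follows; the 2% limit mirrors the example above and is illustrative only.
```python
def syntax_check(tags, fitting_tags, max_misfit_fraction=0.02):
    """Check a tag sequence for consistency and drop rare non-fitting tags.

    fitting_tags: the set of tags that fit together for the candidate activity type.
    """
    misfits = [t for t in tags if t not in fitting_tags]
    if not misfits:
        return tags                                     # syntax check passed
    if len(misfits) / len(tags) < max_misfit_fraction:
        return [t for t in tags if t in fitting_tags]   # resolve by removing misfits
    raise ValueError(f"syntax error: tags {sorted(set(misfits))} do not fit together")
```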
The reference segment library may include an indication of which tags fit together, to enable handling of syntax error situations.
Different motion primitives may be associated with different characteristic frequencies. For example, when the user is running rather than walking, the acceleration sensor data may reflect a higher characteristic frequency. Thus, in some embodiments, the tagging of segments may be based at least in part on determining which reference segment has a characteristic frequency that most closely matches a characteristic frequency of a portion of the sequence of sensor data elements under study. Alternatively or additionally, acceleration sensor data may be employed to determine a characteristic motion amplitude.
The reference segment library may include reference segments that are multi-sensory in nature, in that each reference segment includes data that can be compared to each type of sensor data available. For example, where the device 110 is configured to compile time series of acceleration and sound sensor data types, the library may include a set of reference segments, each corresponding to a tag, where each reference segment includes data that may be compared to the acceleration data and data that may be compared to the sound data. The tag may be determined as the tag associated with the multi-sensory reference segment that most closely matches the segment stored by the device 110. The device 110 may include, for example, a microphone and a camera. Further, in some cases, a radio receiver may be configured to measure electric or magnetic field characteristics. In general, the device 110 may include a radio receiver, where the device 110 has wireless communication capabilities.
One example of identifying the type of activity by segmentation and tagging is swimming, where device 110 stores sequences of sensor data elements including humidity sensor data elements and magnetic field sensor data elements. A humidity sensor data element indicating the presence of water will result in the use of a water sports reference segment library. Swimming may involve an elliptical motion of the arm to which the device 110 may be attached, which may be detected as periodically varying magnetic field data. In other words, from the point of view of the magnetic field sensor, the direction of the earth's magnetic field changes periodically in the time series. This would cause the segments to be tagged as, for example, breaststroke strokes.
In general, a determined or derived activity type may be considered an estimated activity type until the user confirms that the determination is correct. In some embodiments, some (e.g., two or three) of the most likely activity types may be presented to the user as estimated activity types for the user to select from. Using two or more types of sensor data may increase the likelihood that the estimated type of activity is correct. Once the user confirms or selects a particular activity type, the segments may be tagged to conform to that activity type. For example, this may mean that the set of reference fragments to which the sensor data fragments are compared is limited to only reference data fragments that are consistent with this activity type.
Where the device 110 or personal device assigns tags, the sequence of tags may be sent to a network server, for example, for storage. The device 110, personal device, or server may determine the overall type of activity in which the user is engaged based on the tags. This may be based, for example, on a library of reference marker sequences.
In general, device 110 or a personal device may receive machine-readable instructions, such as an executable program or an executable script, from a server or another network entity. The machine-readable instructions may be for determining the activity type from a sequence of tags and/or for assigning tags to sensor data segments. In the latter case, the machine-readable instructions may be referred to as marker instructions.
The process may adaptively learn how to more accurately assign tags and/or determine activity types based on the machine-readable instructions. The server may have access to information from multiple users and may have high processing capability, for example, which places it in a more advantageous position than the device 110 to update the machine-readable instructions.
The machine-readable instructions may be adapted by the server. For example, in response to a message sent from device 110, a user who has newly obtained device 110 may first be provided with machine-readable instructions reflecting an average user. Thereafter, as the user engages in activity sessions, the machine-readable instructions may be adapted to more accurately reflect the usage of that particular user. For example, limb length may affect the periodic nature of sensor data captured while the user is swimming or running. To enable the adaptation, the server may, for example, periodically request sensor data from the device 110 and compare the sensor data so obtained with the machine-readable instructions, to train the instructions for future use by the particular user. The resulting benefits are that fewer errors are made in tagging the segments and that more efficient and accurate compression of the sensor data is possible.
FIG. 2A illustrates one example of a plurality of sequences of sensor data elements. On the upper axis 201, a sequence 210 of humidity sensor data elements is shown, while on the lower axis 202, a time sequence 220 of the deviation of the magnetic north pole with respect to the axis of the device 110, i.e. a sequence of magnetic sensor data elements, is shown.
The humidity sequence 210 shows an initial portion of low humidity, followed by a rapid increase in humidity, which is then held at a relatively constant elevated level and thereafter begins to decrease, at a rate slower than the increase, as the device 110 dries.
The magnetic bias sequence 220 shows an initial, irregular sequence of bias variations, for example due to a user moving while operating the locker of the locker room, followed by a substantially periodic movement period, and then the irregular sequence is started again. The wavelength of the periodically repeating motion is exaggerated in fig. 2A to make the illustration clearer.
The swimming activity type may be determined as the estimated activity type, starting at point 203 of the sequences and ending at point 205. In detail, the sequences can be divided into two segments, the first from point 203 to point 204 and the second from point 204 to point 205. Since the humidity sensor indicates water sports, the segments are tagged as freestyle stroke segments, for example using a water sports reference segment library. Thus, the sequence of tags may be {freestyle stroke, freestyle stroke}. Of course, in actual swimming the number of segments would be much higher, but for simplicity only two segments are shown in fig. 2A. In general, both sensor data segments, from 203 to 204 and from 204 to 205, include time-aligned sub-sequences of sensor data elements from sequences 210 and 220.
Fig. 2B shows a second example of a plurality of sequences of sensor data elements. In fig. 2B, the same reference numerals denote the same elements as in fig. 2A. Unlike in fig. 2A, not one but two activity sessions are determined in the time series of fig. 2B. That is, it is determined that a riding session begins at starting point 207 and ends at point 203, at which point a swimming session begins. The composite activity session may thus relate, for example, to a triathlon. During riding, the humidity remains low and the magnetic deviation changes only slowly, for example because the user is riding on a race track. The segments thus include two segments between points 207 and 203, and three segments between points 203 and 205. The sequence of tags may be {riding, riding, freestyle stroke, freestyle stroke, freestyle stroke}. Here too, the number of segments has been reduced significantly for clarity of illustration.
Whether the user is interested in viewing information shown on the display of the device may depend at least on the type of activity. A situation in which the user wishes to see the display content may be associated with a change in the received sensor data (e.g., acceleration sensor data and/or altimeter data).
Riding is considered as an activity type. The device may recognize that the user is riding, or the user may provide the activity type as input. For example, riding may be identified based on acceleration sensor data and/or velocity data (e.g., derived from the acceleration sensor data or from positioning data such as GPS data). While riding, the user may be interested in viewing information, such as heart rate, on the display when riding uphill. The changing altitude may be detected based on sensor data such as altimeter data. The display may be activated in response to detecting a predetermined change in the sensor data, such as a change in the altitude of the user. On a flat road, on the other hand, the user is not necessarily interested in viewing the display, and it may be deactivated to save power. The device that includes the display and/or the sensors (e.g., acceleration sensors) need not be worn on the user's wrist. The device may be attached to the user's bicycle (e.g., to the handlebar of the bicycle).
For example, the predetermined change may be detected by comparing the sensor data, or a measurement value determined based on the sensor data, with a threshold value. The threshold value may be a predetermined reference value. For example, if the measured quantity is altitude, the measurement value is an altitude value; if there is only a small change in altitude (e.g., 2 m), the heart rate may not change much and the user may not want to see it. However, if the change in altitude is above a predetermined threshold (e.g., 10 m), the display may be activated. The change may be defined as a relative change, for example compared to the average altitude over a predetermined distance before the measurement time point. As another example, the threshold may be set as a time during which the altitude increases continuously. It may be defined, for example, that the display is activated when the altitude has been rising for 5 seconds.
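As an illustrative sketch, the altitude-based activation rule described above might be expressed as follows; the 10 m threshold and 5 s rise time mirror the examples given and are not prescribed values, and the function name is hypothetical.
```python
def should_activate_display(altitudes, dt, threshold_m=10.0, rise_time_s=5.0):
    """Decide whether to switch the display on while riding uphill.

    altitudes: recent altimeter samples (m); dt: sampling interval (s).
    Activate if the latest altitude exceeds the recent average by threshold_m,
    or if the altitude has been rising continuously for rise_time_s.
    """
    baseline = sum(altitudes[:-1]) / max(len(altitudes) - 1, 1)   # recent average
    if altitudes[-1] - baseline > threshold_m:
        return True

    needed = int(rise_time_s / dt)
    recent = altitudes[-(needed + 1):]
    continuously_rising = len(recent) > needed and all(
        later > earlier for earlier, later in zip(recent, recent[1:]))
    return continuously_rising
```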
The rules for activating the display may be predetermined, for example. Alternatively, the system may adapt to and learn the situations in which the user wants to see the display. For example, a user may provide an input to activate the display, such as through a button or a touch screen. The system may detect the beginning of an uphill grade based on sensor data (e.g., altimeter data) and an increase in altitude within a predetermined time. During uphill riding, the user may provide an input to activate the display. The system may detect that the user has activated the display multiple times in the same situation (i.e., during uphill riding). A threshold may be set for the number of times the system must detect the same behavior in order to learn a rule for activating the display. For example, if the same behavior is repeated, e.g., 3 times, the system may learn that the display is to be activated in these situations. Thus, activity type specific changes may be learned in advance.
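The learning of an activation rule from repeated manual activations might be sketched, for example, as follows; the situation labels and the threshold of 3 repetitions are illustrative assumptions, and the class name is hypothetical.
```python
class ActivationRuleLearner:
    """Learn that the display should be activated automatically in a situation.

    A 'situation' is a label such as 'uphill_start' detected from sensor data.
    After the user has manually activated the display repeat_threshold times in
    the same situation, the rule becomes automatic (threshold of 3 as above).
    """

    def __init__(self, repeat_threshold=3):
        self.repeat_threshold = repeat_threshold
        self.counts = {}          # situation -> number of manual activations
        self.learned = set()      # situations with a learned activation rule

    def on_manual_activation(self, situation):
        self.counts[situation] = self.counts.get(situation, 0) + 1
        if self.counts[situation] >= self.repeat_threshold:
            self.learned.add(situation)

    def should_auto_activate(self, situation):
        return situation in self.learned
```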
There may also be user specific rules for activating the display. For example, one user (e.g., the first user) may want to view the display during an uphill slope as described above. Another user (e.g., a second user) may wish to view the display at the end of an uphill slope. If the system detects that the user has activated the display a predetermined number of times by providing an input when the uphill slope has ended, the system may learn user-specific rules to activate the display. The end of an uphill slope may be detected based on sensor data (e.g. altimeter data) no longer changing or starting to decline (if a downhill slope occurs after the uphill slope).
Swimming is considered as an activity type. The device may recognize that the user is swimming, or the user may provide the type of activity as input. When the user is swimming, the user may not be able to view a display, such as a display on a wrist-worn device. When the user stops swimming, the user may be interested in viewing the display. It may be detected that the user has stopped based on sensor data (e.g., acceleration sensor data and/or humidity sensor data). In response to detecting a predetermined change in sensor data (e.g., the repetitive elliptical motion of the hand has ended, or the velocity has become approximately zero), the display may be activated. During swimming, the display may be deactivated.
The rules for activating the display may be predetermined or learned by the system, for example. For example, the user may provide an input to activate the display at the end of the lane, after swimming has stopped. The system may detect that the user has stopped swimming based on sensor data (e.g., acceleration data). The system may detect that the user has activated the display multiple times in the same situation (i.e., after a swimming session). A threshold may be set for the number of times the system must detect the same behavior in order to learn a rule for activating the display. For example, if the same behavior is repeated, e.g., 3 times, the system may learn to activate the display in these situations. Thus, activity type specific changes may be learned in advance.
User-specific activation rules may be set for swimming. For example, a first user may wish to view the display after a predetermined swimming distance (e.g., after every 100 m). The user may wish to glance at the display without actually stopping, and therefore a rule based on acceleration sensor data and stopping swimming does not necessarily meet the first user's needs. In this example, the predetermined change in the measurement value corresponds to a certain distance being achieved (i.e., a change in distance that exceeds a predetermined threshold). The threshold value may be a predetermined reference value. For example, the display may be activated for a predetermined period of time every 100 m. A second user may wish to glance at the display after every 200 m without actually stopping. These different rules may be learned by the system and stored as part of a user profile, for example in a database such as a cloud-based database. The swimming distance may be calculated based on the turns at the ends of the pool. A turn may be detected based on the acceleration sensor data, and the length of the pool may be known. GPS data can also be used to determine the swimming distance.
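A minimal sketch of such a distance-based, user-specific activation rule, assuming the pool length is known and turns are detected from the acceleration sensor data, might look as follows; the class and method names and the default values are illustrative only.
```python
class DistanceRule:
    """Activate the display briefly each time another distance interval is completed."""

    def __init__(self, pool_length_m=25.0, interval_m=100.0):
        self.pool_length_m = pool_length_m   # known pool length
        self.interval_m = interval_m         # user-specific, e.g. 100 m or 200 m
        self.turns = 0
        self.next_activation_m = interval_m

    def on_turn_detected(self):
        """Call when a turn at the pool end is detected from acceleration data."""
        self.turns += 1
        distance = self.turns * self.pool_length_m
        if distance >= self.next_activation_m:
            self.next_activation_m += self.interval_m
            return True                      # activate the display for a short period
        return False
```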
Cross-country skiing is considered as an activity type. The device may recognize that the user is skiing, or the user may provide the activity type as input. When the user is working up an uphill section on cross-country skis, the user may not wish to view the display on the wrist-worn device, because it may be difficult to do so while keeping up the poling rhythm. On the other hand, it is easier to check the display when going downhill. Thus, it may be detected based on sensor data (e.g., acceleration sensor data and/or altimeter data) that the skier is skiing downhill. The skier may, for example, be in a typical downhill skiing posture, i.e. a tuck position. The display may be activated in response to a predetermined change in the sensor data. The change in the sensor data may be, for example, that the hand movement typical of skiing has stopped, and/or a decreasing altitude. While skiing uphill, the display may be deactivated to save power.
The rules for activating the display may be predetermined or learned by the system, for example. For example, the user may provide an input to activate the display after assuming a tuck position for a downhill section. The system may detect, based on the sensor data, that the user has assumed a tuck position for a downhill section. The system may detect that the user has activated the display multiple times in the same situation. A threshold may be set for the number of times the system must detect the same behavior in order to learn a rule for activating the display. For example, if the same behavior is repeated, e.g., 3 times, the system may learn that the display is to be activated in these situations. Thus, activity type specific changes may be learned in advance.
During cross-country skiing or other outdoor sports in cold weather, it may be assumed that the user is wearing gloves. Whether the wrist-worn device is under a glove or a sleeve may be determined based on ambient light sensor data. The system may then determine that the display may be deactivated to conserve power while the device is under a glove or other clothing. The ambient light sensor may detect light when the user moves the clothing aside to view the display of the wrist-worn device. In this case, the change in the sensor data is a change in the amount of light. The display may then be activated in response to detecting the change in the sensor data.
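The ambient-light-based rule for a device worn under a glove might be sketched as follows; the lux thresholds and the function name are illustrative assumptions only.
```python
def glove_display_rule(lux_samples, covered_below_lux=5.0, jump_factor=5.0):
    """Deactivate the display while covered by a glove, activate it on uncovering.

    lux_samples: recent ambient-light readings. A very low, stable level suggests
    the device is under a glove or sleeve; a sudden jump in light suggests the
    user has moved the clothing aside to look at the display.
    """
    if len(lux_samples) < 2:
        return "keep"
    previous, current = lux_samples[-2], lux_samples[-1]
    if current < covered_below_lux:
        return "deactivate"                      # covered: save power
    if previous < covered_below_lux and current > jump_factor * max(previous, 0.1):
        return "activate"                        # light jump: clothing moved aside
    return "keep"
```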
Consider a ball game (e.g., cricket or soccer) as an activity type. The device may recognize that the user is playing a ball game, or the user may provide the activity type as input. While playing, the user may need to focus on the ball or on other players in the area, and may therefore not be interested in viewing the display on the wrist-worn device. Thus, the display may be deactivated while the game is in progress. An ongoing game may be detected by the device, for example, based on sensor data (e.g., acceleration sensor data), which may indicate, for example, high-frequency running followed by a hit with a stick. The hit may be identified from a derived hand trajectory, which may be calculated based on the acceleration sensor data. If the player is resting during the game, he or she may be interested in viewing the display. The rest may be detected based on sensor data (e.g., acceleration sensor data and/or a speed derived from the acceleration sensor data).
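The rest detection mentioned above could, for example, look at a speed estimate derived from the acceleration sensor data. The sketch below is only illustrative; the thresholds and sampling assumptions are not taken from the patent:

```python
REST_SPEED_MS = 0.5       # assumed: below this the player is considered resting
REST_DURATION_S = 10      # assumed: how long the speed must stay low

def is_resting(speed_samples_ms, sample_interval_s=1.0):
    """True if the derived speed has stayed below REST_SPEED_MS long enough."""
    needed = int(REST_DURATION_S / sample_interval_s)
    recent = speed_samples_ms[-needed:]
    return len(recent) == needed and all(v < REST_SPEED_MS for v in recent)

# Display control: deactivate while the game is in progress, activate during a rest.
speeds = [3.2, 2.8, 4.1] + [0.2] * 10
print("activate_display" if is_resting(speeds) else "deactivate_display")
```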
The rules for activating the display may be predetermined or learned by the system, for example. For example, a user may provide an input to activate the display while sitting on the substitutes' bench. The user may want to focus on his/her recovery and see how the heart rate decreases. The system may detect that the user has activated the display multiple times under the same circumstances. A threshold may be set for the number of times the system must detect the same behavior in order to learn a rule for activating the display. For example, if the same behavior is repeated, e.g., 3 times, the system may learn that the display is to be activated in these circumstances. Thus, activity-type-specific changes may be learned in advance.
The activation and deactivation of the display may be performed by a processor of the device. Fig. 3 shows, by way of example, a block diagram of a device 300. The device may be, for example, a smart watch. The device includes a display 310. The display may be, for example, a touch screen display. The device 300 may include one or more processing units, for example two or more processing units, such as a low-power (LP) processor 315 and a high-power (HP) processor 320. Each of the two or more processing units may include a processing core. Each processing unit may include one or more uniform or heterogeneous processor cores, and/or different volatile and non-volatile memories. For example, the device 300 may include a microprocessor having at least one processing core and a microcontroller having at least one processing core. The processing cores may be of different types. For example, a processing core in the microcontroller may have more limited processing power and/or more limited memory than a processing core included in the microprocessor. In some embodiments, a single integrated circuit includes two processing cores: a first processing core having weaker processing power and consuming less power, and a second processing core having stronger processing power and consuming more power. In general, a first of the two processing units may have weaker processing power and consume less power, and a second of the two processing units may have stronger processing power and consume more power. Each processing unit (e.g., the LP processor 315 and the HP processor 320) may control the display 310 of the device 300. The more capable processing unit may be configured to provide a richer visual experience via the display. The less capable processing unit may be configured to provide a reduced visual experience via the display. One example of a reduced visual experience is a reduced-color display mode rather than a rich-color display mode. Another example of a reduced visual experience is a black-and-white visual experience. One example of a richer visual experience is the use of color. For example, colors may be displayed with 16 or 24 bits.
Both processing units may include a display interface configured to communicate with the display. For example, where the processing units include a microprocessor and a microcontroller, the microprocessor may include transceiver circuitry coupled to at least one metal pin under the microprocessor, the at least one metal pin being electrically coupled to an input interface of a display control device. The display control device, which may be included in the display, is configured to cause the display to display information in accordance with electrical signals received in the display control device. Likewise, the microcontroller in this example may include transceiver circuitry coupled to at least one metal pin under the microcontroller, the at least one metal pin being electrically coupled to an input interface of the display control device. The display control device may comprise two input interfaces, each coupled to a respective one of the two processing units, or the display control device may comprise a single input interface to which both processing units are capable of providing input via their respective display interfaces. Thus, the display interface in a processing unit may comprise transceiver circuitry that enables the processing unit to send electrical signals to the display.
One of the processing units (e.g., the weaker or the stronger) may be configured to at least partially control the other processing unit. For example, a less capable processing unit (e.g., a less capable processing core) may cause a more capable processing unit (e.g., a more capable processing core) to transition into and out of a sleep state. These transitions may be caused by signaling via an internal processing unit interface, such as an inter-core interface.
The LP processor and the HP processor may be controlled by a power controller 330. The power controller controls the clock frequencies of the processors. When transitioning from an active state to a sleep state or a deactivated state, the transitioning processing unit may store its context, at least in part, in a memory such as pseudo-static random access memory (PSRAM), SRAM, FLASH, or ferroelectric RAM (FRAM). For example, both processors (the LP processor 315 and the HP processor 320) may be shut down and their contexts stored in memories 340, 345. If both the LP processor and the HP processor are turned off, the display is deactivated and power consumption can be reduced, for example, as much as possible. When transitioning out of the sleep state using a context stored in memory, the processing unit may resume processing more quickly, and/or from the point where it entered the sleep state. In this way, the delay experienced by the user can be reduced. Alternative terms occasionally used for the context include state and image. In the sleep state, the clock frequency of the processing unit and/or associated memory may be set to zero, which means that the processing unit is powered down and does not consume energy. Circuitry configured to provide an operating voltage to the at least one processing unit may comprise, for example, a power management integrated circuit (PMIC). Since the device 300 comprises another processing unit, the sleeping processing unit may be completely powered down while maintaining the availability of the device 300.
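A highly simplified, purely illustrative model of this save-context / restore-context flow (real devices do this in firmware and hardware; the class, field names, and clock value below are assumptions):

```python
class ProcessingUnit:
    def __init__(self, name, default_context):
        self.name = name
        self.clock_hz = 0                 # 0 means powered down / sleeping
        self.context = dict(default_context)
        self.saved_context = dict(default_context)   # e.g. in PSRAM/FLASH/FRAM

    def enter_sleep(self):
        self.saved_context = dict(self.context)      # store context before power-down
        self.clock_hz = 0                             # clock to zero -> no energy used

    def wake_up(self, clock_hz=80_000_000):
        self.clock_hz = clock_hz                      # non-zero clock frequency
        self.context = dict(self.saved_context)       # resume from where it left off

hp = ProcessingUnit("HP", {"screen": "blank"})
hp.context["screen"] = "heart_rate_view"
hp.enter_sleep()
hp.wake_up()
print(hp.context["screen"])   # heart_rate_view -> reduced user-visible delay
```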
The device is able to perform measurements during an activity session even if both processors (the LP and HP processors) are switched off. For example, the acceleration sensor 350 includes a processor, and is capable of continuously performing measurements with low power consumption and of controlling the power controller 330. The acceleration sensor 350 may be provided with a battery power supply.
If the activity and/or context is deemed to require a fast wake-up of a core, the processing core may be placed in a reduced operation mode instead of being put to sleep, since waking from a sleep state or a fully powered-off state may take longer.
When transitioning from the sleep state to the active state, the clock frequency of the transitioning processing unit is set to a non-zero value. The transitioning processing unit may read the context from memory, where the context may comprise a previously stored context, e.g., a context stored in association with the transition to the sleep state, or the context may comprise a default state or context of the processing unit stored into the memory at the factory. The memory may include, for example, pseudo-static random access memory (PSRAM), FLASH, and/or FRAM. The memory used by the processing unit when transitioning into and out of the sleep state may comprise, for example, DDR memory.
The context shown on the display may include activity-related or performance-related information concerning the user's activity. The displayed information may include, for example, one or more measurements such as heart rate, distance, speed, location, and the like. The context, e.g., heart rate, distance, speed, location, and the like, may be updated in response to activation of the display. One or more sensors may provide information to the at least one processor. For example, a GPS sensor may provide location information to at least one processing core to enable the device to determine its location.
The information displayed on the display may include a theme map, such as a heat map, which may be compiled to cover a geographic area. The user may participate in an activity session in the geographic area. The activity types of such activity sessions may include, for example, jogging, swimming, cycling, and the like. When a user wishes to participate in his or her own activity session, his or her device may determine a route for the activity session based at least in part on a theme map database. Determining the route may include designing the route, optionally based in part on user settings, according to where other users have engaged in the same type of activity session in the past. For example, a jogging route may be determined based at least in part on an indication of where other users have jogged in the past. The theme map may be downloaded from the server 370 through the communication interface 322.
The processor may generate a reduced version of the theme map, or may download the reduced map from the server 370 as needed. The need may be based on the type of device, the user's preferences, and/or the user's location. The server may provide an appropriate selection for download. A "reduced" map refers herein to a reduced version of the theme map. For example, this may mean one or several of the following: fewer or no colors, lower display resolution, slower display updates, reduced content, and the like. The reduced theme map may be downloaded from the server 370, or may be generated by the first processing core 320 and stored in its memory 345. In a two-processing-core embodiment, the image of the reduced theme map may be copied (arrow 380) to the memory 340 of the low-power second processing core 315 to be provided as a reduced visual experience through the display.
The device may include a real-time clock (RTC) module 360. The RTC module may still be running when the processors (e.g., the LP processor and the HP processor) have been shut down. The RTC module or RTC unit may be a stand-alone unit provided with a battery power supply. Thus, the processing cores (e.g., the LP processor and the HP processor) may be completely shut down. The RTC unit may also be integrated into one of the processors 315 or 320, or into both processors, but this requires power to be supplied to at least some of the hardware surrounding the processor in question, with a power consumption of a few microamperes. Which RTC unit alternative is used is a design choice. The RTC unit may trigger a processor at predetermined time intervals to update a context such as location. As another example, the RTC unit may start a processing core if a relatively long time has elapsed since the user last attempted to view the display. In those cases, the user may no longer be interested in seeing a display image previously stored in memory, which may no longer show the correct context, such as the correct location and/or activity of the user. The time delay since the last display action may be used as an indication that the context may have changed, rather than simply displaying stored content that may be outdated. When the RTC unit indicates this time delay, the information can be used, for example, to activate a GPS sensor in order to check the location and to start at least one processor, e.g., the LP processor, to update the user's context. The context-related image may be obtained from memory using a low-dropout (LDO) regulator as a power source for the sleeping processing core. The LDO regulator can provide a fast wake-up. The stored image may be transferred directly from the memory to the display. The memory may be an internal memory 340, 345 or an external memory.
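A minimal sketch of the staleness check driven by the RTC time delay; the 15-minute threshold and the callback names are assumptions for illustration only:

```python
STALE_AFTER_S = 15 * 60    # assumed: 15 minutes without display actions -> refresh

def on_display_request(now_s, last_display_action_s, start_gps, start_lp_processor):
    """Decide whether the stored image can be shown or the context must be updated."""
    if now_s - last_display_action_s > STALE_AFTER_S:
        start_gps()            # re-check the location
        start_lp_processor()   # wake e.g. the LP processor to update the context
        return "show_updated_context"
    return "show_stored_image"

print(on_display_request(
    now_s=10_000, last_display_action_s=8_000,
    start_gps=lambda: None, start_lp_processor=lambda: None))
```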
Night time, during which the user is typically sleeping, may be determined based on the current time provided by the RTC module 360. Sleep may be considered an activity type. The user may also provide as input to the device that the current activity type is sleep. It may be possible to confirm, based on the sensor data, for example, that it is dark (based on ambient light sensor data) and/or that the user is not moving (based on motion sensor data) or is moving very little, for example changing posture in bed. It may then be determined that the display may be deactivated and remain deactivated during sleep time. The display may remain deactivated even if motion is detected from the sensor data (e.g., acceleration sensor data). For example, if the detected motion is below a predetermined threshold, the display may be kept deactivated. The threshold may be a predetermined reference value. It may further be provided that during sleep time the display remains deactivated even if the user stands up and walks. In this way, no disturbing light is emitted from the display towards the user's eyes. Furthermore, power is saved since the display is not unnecessarily activated.
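For illustration, the sleep-time behaviour could be summarized as in the following sketch, where the motion threshold is an assumed reference value:

```python
SLEEP_MOTION_THRESHOLD = 2.0   # assumed reference value for motion magnitude

def sleep_display_state(activity_type, motion_magnitude):
    if activity_type == "sleep" and motion_magnitude < SLEEP_MOTION_THRESHOLD:
        return "deactivated"     # e.g. turning over in bed does not light the display
    return "normal_rules"        # outside sleep, the usual activation rules apply

print(sleep_display_state("sleep", motion_magnitude=0.7))   # deactivated
```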
However, the system may learn user-specific rules for activation during sleep. The user may wish to view the display during sleep time. The user may provide an input to activate the display. The user may activate the display by a user input, such as pressing a button or touching a touch screen. The system may detect that the user has activated the display when waking up at night and has tilted the wrist so that the display is facing the user's face. Then, a user specific rule may be determined: even at night or during sleep time, the display is activated if a wrist tilt or a trajectory corresponding to pointing the watch towards the face is detected based on sensor data (e.g., acceleration sensor data).
The triggering event for causing the power controller 330 to control a processor to activate and/or deactivate the display may be based at least on sensor data (e.g., data of the acceleration sensor 350) and/or the like. In addition, current time data may be used to control the display. Different situations for activating and/or deactivating the display have been described above in the context of different activity types. Common to these examples is that a change in a measurement value derived from the sensor data is detected. For example, an activity-type-specific change may be detected. To expedite wake-up of sleeping or shut-down processing cores, their power supplies (e.g., switched-mode power supplies, SMPS) may remain on. Another exemplary approach is to shut down the SMPS and connect a low-dropout (LDO) regulator as a fast power supply in parallel with the SMPS of the sleeping or shut-down processing core.
The display may be deactivated if the device is placed on a table. It may be detected that the device is lying on a table, i.e., on a flat surface, based on sensor data, e.g., acceleration sensor data. When the device is on a table, no motion is detected, and therefore the change in the sensor data is that motion has stopped.
The system may also learn rules for activating the display during daily use, when the user is not engaged in physical activity. Daily use without athletic activity or athletic performance may be considered an activity type. The activity type may be determined to be daily use without athletic activity, for example, based on sensor data, based on user input, or based on knowledge that the user is not currently recording any athletic performance. For example, the user may wear the device on the wrist while the display of the device is deactivated. The processor for controlling the display may be turned off when the display is deactivated. The user may provide an input to activate the display after tilting the wrist to orient the display of the device toward the user's face. The system may detect that the user has activated the display after similarly tilting the device multiple times (e.g., a predetermined number of times). Thus, the system learns to activate the display in response to detecting a predetermined change in the sensor data. The change may be, for example, the tilting of the device. The tilt may be detected based on sensor data (e.g., acceleration data).
When the activity type is walking or jogging, the display of the device may be activated in response to detecting that the wrist is tilted such that the display is facing the face of the user.
Fig. 4 shows, by way of example, a flow chart of a method for activating a display. The method 400 includes receiving 410 an activity type of a user. The method 400 includes receiving 420 sensor data. The method 400 includes determining 430 at least one measurement based on the sensor data. The method 400 comprises detecting 440 at least one first activity type specific change in at least one measurement value. The method 400 includes activating 450 a display in response to detecting at least one first activity type-specific change in at least one measurement.
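A minimal sketch of steps 430-450, assuming simple helper functions for deriving a measurement and detecting an activity-type-specific change; none of the names or threshold semantics below are the patent's own API:

```python
def measurement_from(sensor_data, activity_type):
    # 430: derive a measurement value, e.g. speed for swimming or altitude for cycling
    return sensor_data["speed"] if activity_type == "swimming" else sensor_data["altitude"]

def first_change_detected(activity_type, value, reference):
    # 440: compare the value against an activity-type-specific reference value
    return value < reference if activity_type == "swimming" else value > reference

def control_display(activity_type, sensor_data, reference, activate_display):
    value = measurement_from(sensor_data, activity_type)           # 430
    if first_change_detected(activity_type, value, reference):     # 440
        activate_display()                                         # 450

# Example: swimming, speed has dropped below the reference -> activate the display.
control_display("swimming", {"speed": 0.1, "altitude": 0.0},
                reference=0.3, activate_display=lambda: print("display on"))
```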
The sensor data may include, for example, one or more of acceleration sensor data, gyroscope data, altimeter data, heart rate sensor data, ambient light sensor data, and location sensor data.
The measured values of the measured quantities may be, for example, one or more of acceleration, velocity, distance, inclination, altitude, heart rate, ambient light and position.
The activity type may be physical activity, daily use, or sleep. The physical activity may be, for example, cycling, cross-country skiing, indoor games, jogging, or walking.
This approach provides more accurate display activation in situations where the user is interested in viewing the information displayed on the display. Those situations may be determined based on the sensor data. Additionally, those situations may be determined based on the activity type. This results in more efficient power consumption, since the display is not unnecessarily activated. Battery power is saved because the display is deactivated when the user does not want to view information displayed on the display. The method can prevent the false positives and false negatives associated with certain activation and/or deactivation methods.
Fig. 5A and 5B show, by way of example, flow charts of display activation and deactivation in relation to different activity types (e.g., cycling 510 and swimming 550). For example, the user may be determined to be cycling based on user input or based on sensor data as described above. Sensor data is received during the activity. At least one measured quantity is altitude, and the measurement value is an altitude value 515. A first activity-type-specific change in the measurement value may be detected, e.g., a change in altitude above a predetermined threshold 520 may be detected. In response to detecting the first activity-type-specific change in the measurement value, the display is activated 525. Then, a second activity-type-specific change in the measurement value may be detected, e.g., a change in altitude below a predetermined threshold 530 may be detected. In response to detecting the second activity-type-specific change in the measurement value, the display is deactivated 535.
Referring to fig. 5B, it may be determined that the user is swimming, e.g., based on user input or based on sensor data as described above. Sensor data is received during the activity. At least one measured quantity is speed, and the measurement value is a speed value 555. A first activity-type-specific change in the measurement value may be detected, e.g., a speed drop below a predetermined threshold 560 may be detected. In response to detecting the first activity-type-specific change in the measurement value, the display is activated 565. Then, a second activity-type-specific change in the measurement value may be detected, e.g., a speed above a predetermined threshold 570 may be detected. In response to detecting the second activity-type-specific change in the measurement value, the display is deactivated 575. Further, user-specific rules may be received. For example, the user has determined, or the system has learned, that the user wants to view 580 the display after every 100 m of swimming. Thus, in addition to speed-based display activation, the display may also be activated based on the user-specific display activation rule. In this case, the measured quantity is distance and the measurement value is a distance value 585. A first activity-type-specific change in the measurement value may be detected, for example a distance greater than or equal to n × 100 m, where n = 1, 2, 3, …. This means that a first change in distance is detected 590 every 100 m. In response to detecting the first activity-type-specific change in the measurement value, the display is activated 595, for example for a predetermined time, so that the user has time to view the display without actually stopping.
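As an illustration of the Fig. 5A branch, the following sketch activates the display when the altitude change exceeds a threshold and deactivates it when the change falls back below the threshold; the threshold and sample values are assumptions:

```python
ALTITUDE_CHANGE_THRESHOLD_M = 5.0   # assumed threshold per measurement interval

def cycling_display_state(previous_altitude_m, current_altitude_m, display_on):
    change = current_altitude_m - previous_altitude_m
    if change > ALTITUDE_CHANGE_THRESHOLD_M and not display_on:
        return True      # first activity-type-specific change -> activate (525)
    if change < ALTITUDE_CHANGE_THRESHOLD_M and display_on:
        return False     # second activity-type-specific change -> deactivate (535)
    return display_on

state = False
for prev, cur in [(100, 108), (108, 109)]:   # climbing, then levelling out
    state = cycling_display_state(prev, cur, state)
    print("display on" if state else "display off")
```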
Fig. 6 shows, by way of example, a block diagram of a device. The illustrated device includes a microcontroller 610 and a microprocessor 620. The microcontroller 610 may comprise, for example, a Silabs EFM32 or Renesas RL78 microcontroller, or the like. The microprocessor 620 may comprise, for example, a Qualcomm Snapdragon processor or an ARM Cortex-based processor. In the embodiment of Fig. 6, the microcontroller 610 and the microprocessor 620 are communicatively coupled via an inter-core interface, which may comprise, for example, a serial or parallel communication interface. More generally, the interface disposed between the microcontroller 610 and the microprocessor 620 may be considered an inter-processing-unit interface.
In the example shown, the microcontroller 610 is communicatively coupled with a buzzer 670, a Universal Serial Bus (USB) interface 680, a pressure sensor 690, an acceleration sensor 6100, a gyroscope 6110, a magnetometer 6120, satellite positioning circuitry 6130, a Bluetooth interface 6140, user interface buttons 6150, and a touch interface 6160. For example, the pressure sensor 690 may include an atmospheric pressure sensor.
In response to a triggering event, microcontroller 610 can cause microprocessor 620 to transition from a sleep state to an active state. For example, in the embodiment of fig. 6, since cellular interface 640 is controllable by microprocessor 620 and cannot be used directly by microcontroller 610, microcontroller 610 may transition microprocessor 620 to an active state in the event that the user indicates, e.g., via button 6150, that he wishes to initiate a cellular communication connection. In some embodiments, when the microprocessor 620 is in a sleep state, the cellular interface 640 is also in a sleep state. Cellular interface 640 may include, for example, an electrical interface to a cellular transceiver. The cellular interface 640 may include control circuitry for a cellular transceiver.
In various embodiments, at least two of the elements shown in Fig. 6 may be integrated on the same integrated circuit. For example, the microprocessor 620 and the microcontroller 610 may be provided as processing cores in the same integrated circuit. In this case, the cellular interface 640 may, for example, be a cellular interface included in the integrated circuit, where the cellular interface 640 may be controlled by the microprocessor 620 rather than the microcontroller 610. In other words, various hardware features of the integrated circuit may be controlled by one of the microcontroller 610 and the microprocessor 620, but not both. On the other hand, certain hardware features may be controlled by either processing unit. For example, in such an integrated embodiment, USB interface 660 and USB interface 680 may be the same USB interface of the integrated circuit and may be controlled by either processing core.
Fig. 7 shows, by way of example, a block diagram of an apparatus. The illustrated device 700 may comprise, for example, the wearable device 110 of Fig. 1, such as a sports watch or a smart watch. Included in the device 700 is a processor 710, which may comprise, for example, a single-core or multi-core processor, where a single-core processor comprises one processing core and a multi-core processor comprises more than one processing core. The processor 710 may generally include, or be, a control device. The processor 710 may include more than one processor, such as the LP processor and the HP processor shown in Fig. 3. A processing core may comprise, for example, a Cortex-A8 processing core manufactured by ARM Holdings, or a Steamroller processing core designed by Advanced Micro Devices Corporation. The processor 710 may include at least one Qualcomm Snapdragon and/or Intel Atom processor. The processor 710 may include at least one application-specific integrated circuit (ASIC). The processor 710 may include at least one field-programmable gate array (FPGA). The processor 710 may be a means for performing method steps in the device 700. The processor 710 may be configured, at least in part by computer instructions, to perform actions.
It is to be understood that the disclosed embodiments of the invention are not limited to the particular structures, process steps, or materials disclosed herein, but extend to equivalents thereof as may be recognized by those ordinarily skilled in the pertinent art. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only, and is not intended to be limiting.
Reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. When a numerical value is referred to using terms such as "for example," "approximately," or "substantially," the exact numerical value is also disclosed.
Various items, structural elements, compositional elements, and/or materials may be presented in a common list for convenience. However, these lists should be construed as though each member of the list is individually identified as a separate and unique member. Thus, any single element of the list should not be construed as equivalent to any other element of the same list solely based on their presentation in a common group without indications to the contrary. Additionally, various embodiments and examples of the invention may relate to alternatives for various components thereof. It should be understood that such embodiments, examples, and alternatives are not to be construed as actual equivalents of each other, but are to be considered as independent and autonomous representations of the invention.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the description herein, numerous specific details are provided, such as examples of lengths, widths, shapes, etc., to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention may be practiced without one or more of the specific details, or with other methods, components, materials, and so forth. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of the invention.
While the above examples illustrate the principles of the invention in one or more particular applications, it will be apparent to those of ordinary skill in the art that numerous modifications in form, usage and details can be made without the exercise of inventive faculty, and without departing from the principles and concepts of the invention. Accordingly, the invention is not intended to be limited except as by the appended claims.
The verbs "comprise" and "include" are used herein as open-ended definitions that neither exclude nor require the presence of other, unrecited features. The features recited in the dependent claims may be freely combined with each other unless explicitly stated otherwise. Furthermore, it is to be understood that the use of "a" or "an" (i.e., a singular form) herein does not exclude a plurality.
Claims (19)
1. An apparatus, comprising: at least one processor; and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus at least to perform:
-receiving an activity type of a user;
-receiving sensor data;
-determining at least one measurement value based on the sensor data;
-detecting at least one first activity type specific change in at least one measurement value; and
-activating the display in response to detecting at least one first activity type specific change in the at least one measurement value.
2. The apparatus of claim 1, wherein detecting at least one first activity type-specific change in at least one measurement value comprises:
-detecting a change in the measurement value; and
-comparing the detected change with an activity type specific reference value.
3. The apparatus according to claim 1 or 2, wherein the activity type specific variation is pre-learned.
4. The apparatus according to claim 2 or 3, characterized in that it is able to perform:
-receiving one or more user-specific display activation rules; and
-updating the activity type specific reference value in accordance with the one or more user specific display activation rules.
5. The apparatus according to any of the preceding claims, wherein the apparatus is capable of performing:
-receiving one or more user-specific display activation rules; and
-activating the display also in accordance with the one or more user-specific display activation rules.
6. The apparatus according to any of the preceding claims, wherein the apparatus is capable of performing:
-detecting a second activity type specific change in the measurement value; and
-deactivating the display in response to detecting a second activity type specific change in the measurement value.
7. The apparatus according to any one of the preceding claims, wherein the activity type is an athletic activity, daily use, or sleep.
8. The apparatus of any of the preceding claims, wherein the sensor data comprises one or more of acceleration sensor data, gyroscope data, altimeter data, heart rate sensor data, ambient light sensor data, and location sensor data.
9. The apparatus of any of the preceding claims, wherein the at least one measurement is one or more of acceleration, velocity, distance, inclination, altitude, heart rate, ambient light, and position.
10. A method, comprising:
-receiving an activity type of a user;
-receiving sensor data;
-determining at least one measurement value based on the sensor data;
-detecting at least one first activity type specific change in at least one measurement value; and
-activating the display in response to detecting at least one first activity type specific change in the at least one measurement value.
11. The method of claim 10, wherein detecting at least one first activity type-specific change in at least one measurement value comprises:
-detecting a change in the measurement value; and
-comparing the detected change with an activity type specific reference value.
12. The method according to claim 10 or 11, wherein the activity type specific changes are pre-learned.
13. The method of claim 11 or 12, further comprising:
-receiving one or more user-specific display activation rules; and
-updating the activity type specific reference value in accordance with the one or more user specific display activation rules.
14. The method of any of claims 10 to 13, further comprising:
-receiving one or more user-specific display activation rules; and
-activating the display also in accordance with the one or more user-specific display activation rules.
15. The method of any of claims 10 to 14, further comprising:
-detecting a second activity type specific change in the measurement value; and
-deactivating the display in response to detecting a second activity type specific change in the measurement value.
16. The method according to any one of claims 10 to 15, wherein the activity type is an athletic activity, a daily use, or sleep.
17. The method of any one of claims 10 to 16, wherein the sensor data comprises one or more of acceleration sensor data, gyroscope data, altimeter data, heart rate sensor data, ambient light sensor data, and location sensor data.
18. The method of any one of claims 10 to 17, wherein the at least one measurement is one or more of acceleration, velocity, distance, inclination, altitude, heart rate, ambient light and position.
19. A non-transitory computer readable medium comprising program instructions that when executed by at least one processor enable an apparatus to at least:
-receiving an activity type of a user;
-receiving sensor data;
-determining at least one measurement value based on the sensor data;
-detecting at least one first activity type specific change in at least one measurement value; and
-activating the display in response to detecting at least one first activity type specific change in the at least one measurement value.
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US16/731,104 | 2019-12-31 | ||
| US16/731,104 US11587484B2 (en) | 2015-12-21 | 2019-12-31 | Method for controlling a display |
| US16/731104 | 2019-12-31 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN113126752A true CN113126752A (en) | 2021-07-16 |
| CN113126752B CN113126752B (en) | 2024-03-12 |
Family
ID=74221008
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202011616222.8A Active CN113126752B (en) | 2019-12-31 | 2020-12-30 | Method for controlling display |
Country Status (5)
| Country | Link |
|---|---|
| CN (1) | CN113126752B (en) |
| DE (1) | DE102020007511A1 (en) |
| FI (1) | FI129844B (en) |
| GB (1) | GB2591872B (en) |
| TW (1) | TW202133130A (en) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| GB2619337A (en) * | 2022-06-01 | 2023-12-06 | Prevayl Innovations Ltd | A wearable article, an electronics module for a wearable article and a method performed by a controller for an electronics module for a wearable article |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20130234924A1 (en) * | 2012-03-07 | 2013-09-12 | Motorola Mobility, Inc. | Portable Electronic Device and Method for Controlling Operation Thereof Based on User Motion |
| US9032321B1 (en) * | 2014-06-16 | 2015-05-12 | Google Inc. | Context-based presentation of a user interface |
| US20150286285A1 (en) * | 2014-09-23 | 2015-10-08 | Fitbit, Inc. | Methods, systems, and apparatuses to display visibility changes responsive to user gestures |
| US20160379547A1 (en) * | 2015-06-29 | 2016-12-29 | Casio Computer Co., Ltd. | Portable electronic device equipped with display, display control system, and display control method |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9159294B2 (en) * | 2014-01-31 | 2015-10-13 | Google Inc. | Buttonless display activation |
| US10146297B2 (en) * | 2014-03-06 | 2018-12-04 | Polar Electro Oy | Device power saving during exercise |
| GB2555107B (en) * | 2016-10-17 | 2020-10-21 | Suunto Oy | Embedded Computing Device |
-
2020
- 2020-12-05 DE DE102020007511.8A patent/DE102020007511A1/en active Pending
- 2020-12-14 FI FI20206293A patent/FI129844B/en active IP Right Grant
- 2020-12-22 GB GB2020358.4A patent/GB2591872B/en active Active
- 2020-12-28 TW TW109146437A patent/TW202133130A/en unknown
- 2020-12-30 CN CN202011616222.8A patent/CN113126752B/en active Active
Also Published As
| Publication number | Publication date |
|---|---|
| TW202133130A (en) | 2021-09-01 |
| GB2591872A (en) | 2021-08-11 |
| GB2591872B (en) | 2023-10-11 |
| CN113126752B (en) | 2024-03-12 |
| FI129844B (en) | 2022-09-30 |
| GB202020358D0 (en) | 2021-02-03 |
| FI20206293A1 (en) | 2021-07-01 |
| DE102020007511A1 (en) | 2021-07-01 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20230140011A1 (en) | Learning mode for context identification | |
| US10220258B2 (en) | Method and device for providing workout guide information | |
| KR101944630B1 (en) | System and method for processing video content based on emotional state detection | |
| CN114578948B (en) | Wristwatch apparatus, method and computer program product | |
| US20200110614A1 (en) | Energy management system and method, electronic device, electronic apparatus, and nonvolatile processor | |
| CN104871607B (en) | Low power always on for indoor and outdoor states OK | |
| US20180043212A1 (en) | System, method, and non-transitory computer readable medium for recommending a route based on a user's physical condition | |
| US20140120838A1 (en) | Distributed systems and methods to measure and process sport motions | |
| US10867013B2 (en) | Information processing apparatus, information processing method, and program | |
| US20180048996A1 (en) | Location and activity aware content delivery system | |
| US20170176213A1 (en) | Sensor based context management | |
| JP2013208266A (en) | Pacemaker apparatus, operation method thereof, and program | |
| US11587484B2 (en) | Method for controlling a display | |
| CN113126752B (en) | Method for controlling display | |
| US20170256236A1 (en) | Portable electronic device and display method | |
| CN106030442B (en) | Method and apparatus for selecting interactive equipment | |
| CN106994238B (en) | Data processing method and electronic device for executing the same | |
| US20200151182A1 (en) | Apparatus and method for presenting thematic maps | |
| JP2020017078A (en) | Information processing device, healthcare management system, and program | |
| US20190142307A1 (en) | Sensor data management | |
| TWI729596B (en) | Sensor data management | |
| US20230273048A1 (en) | Information processing apparatus, information processing system, information processing method, and program | |
| CN113127589A (en) | Apparatus and method for presenting theme map | |
| CN113127588A (en) | Apparatus and method for presenting theme map | |
| TWI614683B (en) | Method, system for providing location-based information in response to a speed, and non-transitory computer-readable medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| PB01 | Publication | ||
| TA01 | Transfer of patent application right | ||
| TA01 | Transfer of patent application right |
Effective date of registration: 2022-10-13; Address after: Vantaa, Finland; Applicant after: Suunto Oy; Address before: Vantaa, Finland; Applicant before: Amer Sports Digital Services OY
|
| SE01 | Entry into force of request for substantive examination | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||
| GR01 | Patent grant |