WO2019183223A1 - Audio coach for running injury protection - Google Patents
- Publication number
- WO2019183223A1 (PCT/US2019/023169)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- signal
- sensor
- impact
- feedback
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R1/00—Details of transducers, loudspeakers or microphones
- H04R1/10—Earpieces; Attachments therefor ; Earphones; Monophonic headphones
- H04R1/1091—Details not provided for in groups H04R1/1008 - H04R1/1083
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/112—Gait analysis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/486—Biofeedback
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/6803—Head-worn items, e.g. helmets, masks, headphones or goggles
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient; User input means
- A61B5/7405—Details of notification to user or communication with user or patient; User input means using sound
- A61B5/7415—Sound rendering of measured values, e.g. by pitch or volume variation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2562/00—Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
- A61B2562/02—Details of sensors specially adapted for in-vivo measurements
- A61B2562/0204—Acoustic sensors
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1123—Discriminating type of movement, e.g. walking or running
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B7/00—Instruments for auscultation
Definitions
- one or more of the sensors 202, 206 may be microphones whose signals are processed by the processor 312 to detect the sound of the user’s stride and determine an estimate of the user’s ground reaction force.
- various techniques may be applied to isolate the sound of the user’s feet contacting the ground. For example, array processing of the microphone signals may direct a beam toward the user’s feet, and/or frequency band filtering may be applied to process only portions of the spectrum expected to have content related to the user’s feet interacting with the ground, to name a few.
- one or more of the sensors 202, 206 may be inertial sensors whose signals may be processed by the processor 312 to determine an estimate of the user’s ground reaction force.
- various sensor types may provide signals that are analyzed in parallel, and results combined and/or correlated (e.g., for enhanced precision and/or confirmatory purposes), to determine an estimate of the user’s ground reaction force.
- inertial sensor signals and microphone signals may be processed as discussed above and correlated with each other, to confirm timing and/or impact of the user’s activity, or combined in a manner to increase precision.
- one or more accessory sensors may provide sensor signals to be processed to determine an estimate of the user’s ground reaction force.
- a shoe insert may measure force directly and may transmit a sensor signal to the headphones.
- an inertial sensor worn elsewhere on the body (e.g., ankle-band, wrist-band, waistband)
- a cardiologic monitor (e.g., heart-rate, blood pressure)
- an electromyography sensor (muscle activity)
- Any one or more of such sensor signals, individually or in combination, may be processed by the processor 312 to determine aspects of the user’s physical activity upon which to base feedback and/or instruction to be provided to the user.
- an impact peak may be a peak force indicative of the initial “landing” of the user’s foot, and a slope of a ground reaction force curve may be indicative of what the impact peak will be, prior to the peak occurring. Either of the actual impact peak value, or the slope of the ground reaction force leading up to the peak, or a combination of the two, may be used to evaluate whether the user is moving in such a way as to increase or decrease risk of injury. Evaluation of other portions of the ground reaction force (e.g., pushing off, or an active peak) may also be used to evaluate form and/or risk of injury in various examples. Further in various examples, other measurements not related to ground reaction force may be used to evaluate form and/or risk of injury.
- a controller or processing system such as the controller 310, receives signals from one or more sensors (block 410), and analyzes the signals (block 420) to detect a condition of the user (block 430), e.g., to evaluate or determine an aspect of the user’s physical motion, form, and/or risk of injury.
- the controller or processing system may provide feedback (block 440) or instruction to the user.
- the controller or processing system may inject an audio signal to be rendered by an acoustic driver to provide a message, such as “speed up,” “slow down,” “shift weight forward,” “shift weight back,” or any of various messages as determined by an appropriate program function taking into account various best practices for, e.g., running form.
- an audible feedback may include a beep or tick or other indicator that indicates a proper timing, e.g., for running stride.
- an audible feedback may include a harsh tone when an impact is determined to exceed a threshold, and/or a pleasant tone when an impact is determined to remain below a threshold, for instance.
- a visible feedback may be provided, such as a flashing light source that indicates a proper or improper timing, stride, or impact force (e.g., above or below a threshold).
- a visible feedback may include color to indicate feedback to the user, such as a green light when the user is performing within an acceptable parameter range and a yellow or red light when outside the acceptable range, for example.
- Various color options may indicate to speed up, slow down, or maintain pace across various examples.
- Various color options may indicate to shift weight forward, back, or maintain weight balance across various examples.
- any variation or combination of audible and visible cues may be provided to provide feedback or instruction to the user.
- feedback to be provided may be based upon the detected user’s condition (block 430) in combination with a particular program and/or user profile.
- for example, a user’s physical parameters (e.g., height, weight, body mass index, body mass proportions, gender, and/or user condition, such as shin-splints or susceptibility thereto) may be configurable and stored in a memory associated with a controller, e.g., the controller 310, and may be used to determine what feedback to be provided, given various sensor signals.
- a program may include various settings, such as running surface characteristics, whether the activity is for cardio or other metabolic purposes, training for speed or distance, total time, risk tolerance, and/or other various parameters, in various examples.
- a controller or processing system such as the controller 310, may be programmed or trained to provide proper feedback or instruction to a user through artificial intelligence and/or machine learning processes, and such programming or training may be performed at manufacture for a general or average user, or may be customized by the user and/or performed through a procedure executed by the user.
- a user’s proper form may be exhibited (or performed) while wearing the headphones, and the controller 310 may “learn” the proper sensor signals that indicate the proper performance.
- inertial sensor signals and/or the sound of the user’s feet may be monitored and characterized such that proper form, and improper forms, may be determined in the future by comparison to the “learned,” characterized, or exemplary sensor signals.
- a controller such as the controller 310 may process signals from various sensors (e.g., sensors 202, 206) in a manner to provide enhanced and/or reduced response in certain directions, e.g., via array processing techniques. For example, and as illustrated in FIG. 5, an example processing method 500 is shown.
- the sensors 202, 206 may provide two or more individual signals 502 to be combined with array processing, e.g., by the controller 310, to implement a beam former 510 to produce a signal 512 having enhanced response in a particular beam direction.
- two or more of the individual signals 502 may be combined with array processing, e.g., by the controller 310, to implement null steering 520 to produce a signal 522 having reduced response in a particular null direction.
- the signal 512 may be produced with an enhanced response in a selected direction, such as toward the feet of the user, for example, to enhance the likelihood of detection of running acoustics, e.g., foot impact.
- the signal 522 may be a reference signal.
- the signal 522 may be formed by null steering 520 to have reduced response in the direction of the user's foot. Accordingly, a signal component present in the signal 512 but absent from the signal 522 may provide confidence that the signal component is originating in the direction of the user's foot.
- signals from acoustic sensors may be array processed in a manner illustrated by the example shown in FIG. 5, while in broader examples signals of various types may be array processed in an analogous manner.
- any of the functions or methods, and any components of systems may be implemented or carried out in a digital signal processor (DSP), a microprocessor, a logic controller, logic circuits, and the like, or any combination of these, and may include analog and/or digital circuit components and/or other components with respect to any particular implementation.
- functions and components disclosed herein may operate in the digital domain and certain examples include analog-to-digital (ADC) conversion of analog signals provided, e.g., by microphones, despite the lack of illustration of ADCs in the various figures.
- Any suitable hardware and/or software, including firmware and the like may be configured to carry out or implement components of the aspects and examples disclosed herein, and various implementations of aspects and examples may include components and/or functionality in addition to those disclosed.
- references to “or” may be construed as inclusive so that any terms described using “or” may indicate any of a single, more than one, and all of the described terms. Any references to front and back, left and right, top and bottom, upper and lower, and vertical and horizontal are intended for convenience of description, not to limit the present systems and methods or their components to any one positional or spatial orientation.
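The beam former 510 and null steering 520 discussed above can be illustrated with a minimal delay-and-sum sketch. This Python snippet is illustrative only: the function name, the integer sample delays, and the two-channel impulse data are hypothetical assumptions, not taken from the application. Aligning each channel by its steering delay and summing enhances sound arriving from the steered direction (e.g., toward the user’s feet):

```python
def delay_and_sum(signals, delays):
    """Delay-and-sum beamformer over equal-length sample lists.

    Shifting each channel by its steering delay (in samples) before
    summing reinforces a source arriving from the steered direction.
    Hypothetical sketch; real array processing would use fractional
    delays derived from microphone geometry.
    """
    n = len(signals[0])
    out = [0.0] * n
    for sig, d in zip(signals, delays):
        for i in range(n):
            j = i - d
            if 0 <= j < n:
                out[i] += sig[j]
    return [v / len(signals) for v in out]

# A single impulse reaching the rear sensor one sample after the front
# sensor; steering delays of [1, 0] re-align the two arrivals.
front = [0.0, 0.0, 1.0, 0.0, 0.0]
rear = [0.0, 0.0, 0.0, 1.0, 0.0]
steered = delay_and_sum([front, rear], delays=[1, 0])
```

With the delays chosen to match the arrival offset, the impulse sums coherently at one output sample; a mismatched delay (as in the null-steering case) would instead spread it across samples.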
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Medical Informatics (AREA)
- Animal Behavior & Ethology (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Veterinary Medicine (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Signal Processing (AREA)
- Acoustics & Sound (AREA)
- Biodiversity & Conservation Biology (AREA)
- Physiology (AREA)
- Dentistry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Headphones And Earphones (AREA)
Abstract
Headphone systems and methods include a sensor to provide a sensor signal indicative of an aspect of a user's movement, such as running stride, pace, or impact force with the ground. A detection circuit is configured to receive the sensor signal and to provide feedback to the user based at least upon the aspect of the user's movement.
Description
BACKGROUND
Runners often get injuries due to poor form and/or repetitive stress. In particular, a runner’s stride and “impact” may be correlated with the chances of developing an injury. There exists a need to assess a runner’s style and provide feedback so the runner may alter his/her running style to reduce the risk of injury.
A runner’s ground reaction force varies with the runner, and is a measure of force (over time) as the runner’s foot first makes contact with the ground, as the runner’s weight moves over the foot, and as the runner pushes off the ground for the next step. For example, a first peak in the ground reaction force is an impact peak, and may be the point of the largest stress on the foot, ankle, shin, etc. The magnitude and timing of the first peak may play a significant role in the risk of injury. A second peak in the ground reaction force may represent an active force (e.g., an active peak) of the runner pushing off the ground for the next step, and may also be indicative of the risk of injury.
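The two-peak structure described above can be made concrete with a short sketch. The following Python snippet is illustrative only: the sample data, units, and function names are hypothetical and not taken from the application. It locates an impact peak and an active peak in a sampled ground-reaction-force curve, and approximates the loading-rate slope leading up to the impact peak, which the text notes may predict the peak before it occurs:

```python
def local_maxima(samples):
    """Indices of strict local maxima in a list of force samples."""
    return [i for i in range(1, len(samples) - 1)
            if samples[i - 1] < samples[i] > samples[i + 1]]

def grf_peaks(samples, dt):
    """Return (impact_peak, loading_rate, active_peak) for one stance phase.

    loading_rate approximates the slope of the curve from foot contact
    up to the impact peak. Hypothetical sketch, not the patented method.
    """
    peaks = local_maxima(samples)
    if len(peaks) < 2:
        raise ValueError("expected two peaks in a stance-phase GRF curve")
    impact_i, active_i = peaks[0], peaks[1]
    loading_rate = (samples[impact_i] - samples[0]) / (impact_i * dt)
    return samples[impact_i], loading_rate, samples[active_i]

# Hypothetical stance phase sampled every 10 ms, in units of body weight.
grf = [0.0, 0.8, 1.6, 1.4, 1.3, 1.7, 2.0, 2.2, 1.9, 1.0, 0.2]
impact, rate, active = grf_peaks(grf, dt=0.010)
```

Either the impact peak value or the loading rate (or both) could then be compared against a threshold to flag a stride that may raise injury risk.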
Many runners use headphone systems while running, and for various purposes, such as listening to audio content (e.g., music, talk), communications (e.g., telephone calls), and/or noise reduction. Headphone systems may therefore provide a platform for advantageous use as a feedback device.
SUMMARY OF THE INVENTION
Aspects and examples are directed to headphone systems and methods that detect aspects of a user’s activity, such as running, and may provide feedback or instruction to the user about the user’s performance. Feedback or instruction may be provided through an audio message or alert played through the headphones, or by visual indicators associated with the headphones.
According to one aspect, a headphone system is provided that includes an earpiece, an acoustic driver coupled to the earpiece to render an audio signal, a sensor to provide a sensor signal indicative of an aspect of the user’s movement, and a detection circuit configured to receive the sensor signal and to provide a feedback based at least upon the aspect of the user’s movement.
In various examples, the sensor may be a microphone to detect an environmental acoustic signal indicative of the user’s impact with the ground. The sensor may be an inertial sensor to detect accelerations associated with the user’s impact with the ground. The sensor may be an impact sensor to detect forces associated with the user’s impact with the ground. The sensor may be an electromyography sensor to detect muscle activity associated with the user’s stride.
In various examples, the feedback may include any of an audible cue injected into the audio signal, a positive reinforcement, a negative reinforcement, an instruction, a warning, and/or a pacing signal. The detection circuit may be configured to increase or decrease a rate of a pacing signal based upon the aspect of the user’s movement.
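One way such a pacing-rate adjustment could work is sketched below. The control rule, thresholds, limits, and function name are hypothetical assumptions for illustration, not taken from the application; the idea is simply that the pacing rate (steps per minute) is nudged up while measured impact exceeds a target, and clamped to a sensible range:

```python
def adjust_pacing_rate(current_spm, measured_impact, target_impact,
                       step=2.0, lo=150.0, hi=190.0):
    """Nudge a metronome-style pacing rate (steps/min) up or down.

    Hypothetical control rule: a higher cadence is often associated
    with lower per-step impact, so the rate rises while the measured
    impact exceeds the target, and falls if impact is well below it.
    """
    if measured_impact > target_impact:
        current_spm += step
    elif measured_impact < 0.9 * target_impact:
        current_spm -= step
    # Clamp to a plausible cadence range.
    return max(lo, min(hi, current_spm))
```

Called once per detected stride (or per averaging window), this would drive the audible pacing signal toward a cadence at which the user’s impact stays within the target.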
In certain examples, the feedback includes a visible indication.
According to another aspect, a method of providing feedback to a headphone user is provided that includes receiving an audio signal to be converted to an acoustic signal, receiving a sensor signal indicative of an aspect of the user’s movement, analyzing the sensor signal to detect the aspect of the user’s movement, and providing feedback based upon the aspect of the user’s movement.
Providing feedback may include modifying the audio signal and rendering the modified audio signal into the acoustic signal. Modifying the audio signal may include injecting into the audio signal at least one of a positive reinforcement, a negative reinforcement, an instruction, a warning, or a pacing signal.
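As a hedged illustration of “injecting” a cue into the audio signal, the sketch below mixes a short sine tone into the start of an audio buffer. The tone frequency, gain, duration, sample rate, and function name are arbitrary choices made for this example:

```python
import math

def inject_cue(audio, cue_freq=880.0, rate=16000, duration=0.05, gain=0.3):
    """Mix a short sine-tone cue into the start of an audio buffer.

    Hypothetical sketch of modifying an audio signal with an audible
    cue; a real system would schedule and shape the cue more carefully.
    """
    n = int(rate * duration)  # number of cue samples to mix in
    out = list(audio)
    for i in range(min(n, len(out))):
        out[i] += gain * math.sin(2 * math.pi * cue_freq * i / rate)
    return out

# Demonstration on a buffer of silence.
silence = [0.0] * 1000
mixed = inject_cue(silence)
```

The modified buffer would then be rendered by the acoustic driver as usual, so the cue is heard over the program content rather than interrupting it.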
In various examples, the sensor signal may include one or more of an acoustic signal, a microphone signal, an accelerometer signal, an impact signal, a force indicating signal, and an electromyography signal. The sensor signal may be indicative of the user’s impact, pace, or stride.
Still other aspects, examples, and advantages of these exemplary aspects and examples are discussed in detail below. Examples disclosed herein may be combined with other examples in any manner consistent with at least one of the principles disclosed herein, and references to “an example,” “some examples,” “an alternate example,” “various examples,” “one example” or the like are not necessarily mutually exclusive and are intended to indicate that a particular feature, structure, or characteristic described may be included in at least one example. The appearances of such terms herein are not necessarily all referring to the same example.
BRIEF DESCRIPTION OF THE DRAWINGS
Various aspects of at least one example are discussed below with reference to the accompanying figures, which are not intended to be drawn to scale. The figures are included to provide illustration and a further understanding of the various aspects and examples, and are incorporated in and constitute a part of this specification, but are not intended as a definition of the limits of the invention. In the figures, identical or nearly identical components illustrated in various figures may be represented by a like numeral. For purposes of clarity, not every component may be labeled in every figure. In the figures:
FIG. 1 is a perspective view of an example headphone set;
FIG. 2 is a left-side view of an example headphone set;
FIG. 3 is a schematic diagram of an example headphone set;
FIG. 4 is a flow chart of an example method that may be carried out by a headphone set; and
FIG. 5 is a schematic diagram of a signal processing method that may be carried out by a headphone set.
DETAILED DESCRIPTION
Aspects and examples are directed to headphone systems and methods that detect aspects of a user’s activity, such as running. Aspects of running may include the user’s stride, gait, impact, e.g., vertical ground reaction force, heel-toe characteristics, and the like. Headphone systems and methods disclosed herein may provide feedback or instruction to the user about the user’s performance, and in particular examples may provide feedback or instruction intended to reduce the possibility of injury due to improper form or repetitive stress, for example. Feedback or instruction may be provided through an audio message or alert played through the headphones, or by visual indicators associated with the headphones.
Throughout this disclosure the terms “headset,” “headphone,” and “headphone set” are used interchangeably, and no distinction is meant to be made by the use of one term over another unless the context clearly indicates otherwise. Additionally, aspects and examples in accord with those disclosed herein may be applied to earphone form factors (e.g., in-ear transducers, earbuds) and/or off-ear acoustic devices (e.g., devices that are designed to not contact a wearer’s ears, but are worn in the vicinity of the wearer’s ears, on the head or body, e.g., shoulders) and such are also contemplated by the terms “headset,” “headphone,” and “headphone set.” Accordingly, any on-ear, in-ear, over-ear, or off-ear form-factors of personal acoustic devices are intended to be included by the terms “headset,” “headphone,” and “headphone set.” The term “earpiece” is intended to include any portion of such form factors in proximity to at least one of a user’s ears.
FIG. 1 illustrates one example of a headphone set. The headphones 100 include two earpieces, e.g., a right earcup 102 and a left earcup 104, coupled to a right yoke assembly 108 and a left yoke assembly 110, respectively, and intercoupled by a headband 106. The right earcup 102 and left earcup 104 include a right circumaural cushion 112 and a left circumaural cushion 114, respectively. Visible on the left earcup 104 is a left interior surface 116. While the example headphones 100 are shown with earpieces having circumaural cushions to fit around or over the ear of a user, in other examples cushions may sit on the ear, or may include earbud portions that protrude into a portion of a user’s ear canal, or may include alternate physical arrangements, as discussed above. As discussed in more detail below, either of the earcups 102, 104 may include one or more sensors, such as microphones, inertial sensors (e.g., accelerometer, gyroscope, compass), radio receivers, etc. Although the example headphones 100 illustrated in FIG. 1 include two earpieces, some examples may include only a single earpiece for use on one side of the head only. Additionally, although the example headphones 100 include a headband 106, other examples may include different support structures to maintain one or more earpieces (e.g., earcups, in-ear structures, neckband, etc.) in proximity to a user’s ear, e.g., an earbud may include a shape and/or materials configured to hold the earbud within a portion of a user’s ear, or a personal speaker system may include a neckband to support and maintain acoustic driver(s) near the user’s ears, shoulders, etc.
FIG. 2 illustrates multiple example placements of sensors on the headphones 100, any one or more of which may be included in certain examples. FIG. 2 illustrates the headphones 100 from the left side and shows details of the left earcup 104 including a pair of front sensors 202, which may be nearer a front edge 204 of the earcup, and a rear sensor 206, which may be nearer a rear edge 208 of the earcup. The right earcup 102 may additionally or alternatively have a similar arrangement of front and rear sensors, though in some examples the two earcups may have a differing arrangement in number or placement of sensors. Additionally, various
examples may have more or fewer sensors 202, 206 in various placements about a headphone, which may include sensors on the headband 106, a neckband, chin strap, etc., and in some examples sensors may be provided as an accessory sensor worn elsewhere on the user's body, such as in a shoe or on a waistband, for instance. While not specifically illustrated in FIGS. 1 and 2, one or more acoustic drivers may be provided in the right and/or left earcups 102, 104 to provide audio playback to the user.
The sensors 202, 206 may be of various types, such as acoustic sensors (e.g., microphones, ultrasonic, sonar systems, etc.), inertial sensors, electromagnetic or radio sensors (e.g., radio receivers, radar devices, etc.), and may be coupled to additional sensor components, e.g., a force plate in a shoe may transmit a signal to one of the sensors 202, 206. In some examples, only one sensor may be necessary.
FIG. 3 is a schematic block diagram of an example headphone system 300, such as for the headphones 100. The headphone system 300 includes a controller 310 that provides signals to acoustic drivers 320 for audio playback (e.g., right driver 320a, left driver 320b). The controller 310 includes a processor 312, an audio interface 314, and may include a battery 316 and/or additional components. The audio interface 314, for example, may be a wired input or may be a wireless interface configured, at least in part, to receive program content signals for audio playback. The controller 310 may also receive signals from various sensors, including the sensors 202, 206 (see FIG. 2), for example. In some examples, one or more of the sensors 202, 206 may be acoustic sensors, such as microphones, and may provide microphone signals to be analyzed by the processor 312 to determine aspects of the user's performance. In various examples, one or more of the sensors 202, 206 may be an inertial sensor, to sense movement of the user's head for the processor 312 to determine the user's stride, impact, etc. In various examples, one or more of the sensors 202, 206 may be a radio or other electromagnetic sensor (e.g., antenna, Bluetooth™ receiver, etc.) and may be coupled with a further sensor component, such as a shoe-mounted accelerometer or force sensor, wrist-worn sensor, waistband sensor, and/or other suitable sensors for detecting aspects of the user's movement.
In some examples, one or more of the sensors 202, 206 may be microphones whose signals are processed by the processor 312 to detect the sound of the user’s stride and determine an estimate of the user’s ground reaction force. In some examples, various techniques may be applied to isolate the sound of the user’s feet contacting the ground. For example, array
processing of the microphone signals may direct a beam toward the user’s feet, and/or frequency band filtering may be applied to process only portions of the spectrum expected to have content related to the user’s feet interacting with the ground, to name a few. In some examples, one or more of the sensors 202, 206 may be inertial sensors whose signals may be
processed by the processor 312 to detect motion indicative of the user’s stride (e.g., up and down motion, timing of steps, strength of impact) and to determine an estimate of the user’s ground reaction force. In some examples, various sensor types may provide signals that are analyzed in parallel, and results combined and/or correlated (e.g., for enhanced precision and/or confirmatory purposes), to determine an estimate of the user’s ground reaction force. For example, inertial sensor signals and microphone signals may be processed as discussed above and correlated with each other, to confirm timing and/or impact of the user’s activity, or combined in a manner to increase precision. In some examples, one or more accessory sensors may provide sensor signals to be processed to determine an estimate of the user’s ground reaction force. For example, a shoe insert may measure force directly and may transmit a sensor signal to the headphones. In a further example, an inertial sensor worn elsewhere on the body (e.g., ankle-band, wrist-band, waistband) may transmit a sensor signal to the headphones. In further examples, a cardiologic monitor (e.g., heart-rate, blood pressure), electromyography sensor (e.g., muscle activity), or other physiological monitor may be worn by the user and may provide signals to the headphones. Any one or more of such sensor signals, individually or in combination, may be processed by the processor 312 to determine aspects of the user’s physical activity upon which to base feedback and/or instruction to be provided to the user.
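The confirmatory fusion described above (correlating inertial impact events with microphone events to confirm foot-strike timing) might be sketched as follows. The function names, threshold, and tolerance here are illustrative assumptions, not part of the specification:

```python
import numpy as np

def detect_strikes(signal, fs, threshold):
    """Return sample indices where the signal envelope crosses the threshold upward."""
    above = np.asarray(signal) >= threshold
    # Rising edges: False followed by True
    return np.flatnonzero(above[1:] & ~above[:-1]) + 1

def correlate_strikes(accel_peaks, mic_peaks, fs, tolerance_s=0.05):
    """Keep only accelerometer strike times confirmed by a microphone event
    within +/- tolerance_s seconds (simple confirmatory fusion)."""
    confirmed = []
    for a in accel_peaks:
        if any(abs(a - m) <= tolerance_s * fs for m in mic_peaks):
            confirmed.append(a)
    return confirmed
```

An accelerometer event only counted when echoed by the microphone channel illustrates the "confirmatory purposes" branch; combining the signals for precision would instead average or weight the two timing estimates.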
With reference to ground reaction force, actual or estimated, an impact peak may be a peak force indicative of the initial "landing" of the user's foot, and a slope of a ground reaction force curve may be indicative of what the impact peak will be, prior to the peak occurring. Either of the actual impact peak value, or the slope of the ground reaction force leading up to the peak, or a combination of the two, may be used to evaluate whether the user is moving in such a way as to increase or decrease risk of injury. Evaluation of other portions of the ground reaction force (e.g., pushing off, or an active peak) may also be used to evaluate form and/or risk of injury in various examples. Further, in various examples, other measurements not related to ground reaction force may be used to evaluate form and/or risk of injury.
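As a sketch of the impact-peak and slope evaluation described above, a loading rate can be estimated from the rise of the ground reaction force curve before the peak. The 20-80% rise window and the threshold parameters below are illustrative assumptions:

```python
import numpy as np

def evaluate_grf(grf, fs, peak_limit, rate_limit):
    """Evaluate a single-stance ground reaction force curve.

    Returns (impact_peak, loading_rate, risky), where loading_rate is the
    mean slope from 20% to 80% of the rise to the peak, a common way to
    estimate vertical loading rate before the peak itself occurs.
    """
    grf = np.asarray(grf, dtype=float)
    peak_idx = int(np.argmax(grf))
    impact_peak = grf[peak_idx]
    lo, hi = 0.2 * impact_peak, 0.8 * impact_peak
    rise = grf[:peak_idx + 1]
    i_lo = int(np.argmax(rise >= lo))   # first sample at/above 20% of peak
    i_hi = int(np.argmax(rise >= hi))   # first sample at/above 80% of peak
    dt = max(i_hi - i_lo, 1) / fs
    loading_rate = (rise[i_hi] - rise[i_lo]) / dt
    risky = impact_peak > peak_limit or loading_rate > rate_limit
    return impact_peak, loading_rate, risky
```

Because the slope is available while the force is still rising, an implementation along these lines could flag a risky step before the impact peak is reached, matching the "prior to the peak occurring" point above.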
FIG. 4 illustrates an example method 400 that may be implemented by various headphone systems, such as the headphones 100 and the headphone system 300. A controller or processing system, such as the controller 310, receives signals from one or more sensors (block 410), and analyzes the signals (block 420) to detect a condition of the user (block 430), e.g., to evaluate or determine an aspect of the user’s physical motion, form, and/or risk of injury. Upon evaluation (which may optionally be characterized into various levels of significance, in some examples), the controller or processing system may provide feedback (block 440) or instruction to the user.
For example, the controller or processing system may inject an audio signal to be rendered by an acoustic driver to provide a message, such as "speed up," "slow down," "shift weight forward," "shift weight back," or any of various messages as determined by an appropriate program function taking into account various best practices for, e.g., running form. In some examples, an audible feedback may include a beep or tick or other indicator that indicates a proper timing, e.g., for running stride. In some examples, an audible feedback may include a harsh tone when an impact is determined to exceed a threshold, and/or a pleasant tone when an impact is determined to remain below a threshold, for instance. In some examples, a visible feedback may be provided, such as a flashing light source that indicates a proper or improper timing, stride, or impact force (e.g., above or below a threshold). A visible feedback may include color to indicate feedback to the user, such as a green light when the user is performing within an acceptable parameter range and a yellow or red light when outside the acceptable range, for example. Various color options may indicate to speed up, slow down, or maintain pace across various examples. Various color options may indicate to shift weight forward, back, or maintain weight balance across various examples. In various examples, any variation or combination of audible and visible cues may be provided to provide feedback or instruction to the user.
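The feedback step (block 440) might be sketched as a simple mapping from an impact estimate to a cue. The cue names, message text, and thresholds below are hypothetical, chosen only to mirror the harsh-tone/pleasant-tone examples above:

```python
def select_feedback(impact, low, high):
    """Map an impact estimate to a (cue, message) pair.

    Below `low`: pleasant confirmation tone; above `high`: harsh warning
    tone plus a coaching instruction; in between: a neutral pacing tick.
    """
    if impact > high:
        return ("harsh_tone", "shift weight forward")
    if impact < low:
        return ("pleasant_tone", None)
    return ("tick", None)
```

A visible-feedback variant would return green/yellow/red indications from the same comparisons rather than tones.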
In some examples, feedback to be provided may be based upon the detected user's condition (block 430) in combination with a particular program and/or user profile. For example, a user's height, weight, body mass index, body mass proportions, gender, user condition (e.g., shin-splints, susceptibility, etc.), and/or other physical parameters may be configurable and stored in a memory associated with a controller, e.g., the controller 310, and may be used to determine what feedback to be provided, given various sensor signals. A
program may include various settings, such as running surface characteristics, whether the activity is for cardio or other metabolic purposes, training for speed or distance, total time, risk tolerance, and/or other various parameters, in various examples.
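The profile and program settings described above could be sketched as simple configuration records. Every field name and numeric adjustment below is an illustrative assumption rather than anything stated in the specification:

```python
from dataclasses import dataclass

@dataclass
class UserProfile:
    """User parameters stored in controller memory (illustrative fields)."""
    height_cm: float
    weight_kg: float
    condition: str = "none"   # e.g., "shin-splints"

@dataclass
class ProgramSettings:
    """Per-activity settings that shape feedback thresholds."""
    surface: str = "road"     # running surface characteristic
    goal: str = "distance"    # "speed", "distance", or "cardio"
    risk_tolerance: float = 1.0

def impact_threshold(profile: UserProfile, settings: ProgramSettings) -> float:
    """Derive an impact-feedback threshold from profile and program;
    the base value and adjustments are purely illustrative."""
    base = 2.5 * settings.risk_tolerance
    if profile.condition == "shin-splints":
        base *= 0.8   # be more conservative for susceptible users
    if settings.surface == "trail":
        base *= 1.1   # allow slightly higher variability off-road
    return base
```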
In some examples, a controller or processing system, such as the controller 310, may be programmed or trained to provide proper feedback or instruction to a user through artificial intelligence and/or machine learning processes, and such programming or training may be performed at manufacture for a general or average user, or may be customized by the user and/or performed through a procedure executed by the user. In some examples, a user's proper form may be exhibited (or performed) while wearing the headphones, and the controller 310 may "learn" the proper sensor signals that indicate the proper performance. For example, inertial sensor signals and/or the sound of the user's feet (via microphone, acoustic sensors) may be monitored and characterized such that proper form, and improper forms, may be determined in the future by comparison to the "learned," characterized, or exemplary sensor signals.
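The learn-by-example behavior could be sketched as storing a feature summary during a calibration run and comparing later sensor windows against it. The toy features and tolerance below stand in for whatever characterization the controller actually uses:

```python
import numpy as np

def extract_features(window):
    """Summarize a sensor window as (RMS, peak magnitude), a toy feature vector."""
    w = np.asarray(window, dtype=float)
    return np.array([np.sqrt(np.mean(w ** 2)), np.max(np.abs(w))])

def learn_baseline(windows):
    """Average features over calibration windows exhibiting proper form."""
    return np.mean([extract_features(w) for w in windows], axis=0)

def form_ok(window, baseline, tolerance=0.25):
    """Flag the window as proper form if its features stay within a
    relative tolerance of the learned baseline."""
    f = extract_features(window)
    return bool(np.all(np.abs(f - baseline) <= tolerance * np.abs(baseline)))
```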
In some examples, a controller such as the controller 310 may process signals from various sensors (e.g., sensors 202, 206) in a manner to provide enhanced and/or reduced response in certain directions, e.g., via array processing techniques. For example, and as illustrated in FIG. 5, an example processing method 500 is shown. The sensors 202, 206 may provide two or more individual signals 502 to be combined with array processing, e.g., by the controller 310, to implement a beam former 510 to produce a signal 512 having enhanced response in a particular beam direction. Additionally or alternatively, two or more of the individual signals 502 may be combined with array processing, e.g., by the controller 310, to implement null steering 520 to produce a signal 522 having reduced response in a particular null direction. The signal 512 may be produced with an enhanced response in a selected direction, such as toward the feet of the user, for example, to enhance the likelihood of detection of running acoustics, e.g., foot impact. In some examples, the signal 522 may be a reference signal. For example, to confirm that an acoustic sound is coming from a certain direction (e.g., the user’s foot, as directed by the beam former 510), the signal 522 may be formed by null steering 520 to have reduced response in the direction of the user's foot. Accordingly, a signal component present in the signal 512 but absent from the signal 522 may provide confidence that the signal component is originating in the direction of the user's foot.
In various examples, signals from acoustic sensors (microphones) may be array processed in a manner illustrated by the example shown in FIG. 5, while in broader examples signals of various types may be array processed in an analogous manner.
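In the spirit of FIG. 5, a minimal delay-and-sum beamformer and a two-element null can be sketched over sampled signals. The geometry and sample delays are illustrative, and real implementations would use fractional delays and many more channels:

```python
import numpy as np

def delay_and_sum(signals, delays):
    """Steer a beam by delaying each channel (in samples) and averaging.

    A source arriving with the given inter-channel delays adds coherently
    (enhanced response); sources from other directions partially cancel.
    """
    n = min(len(s) - d for s, d in zip(signals, delays))
    aligned = [np.asarray(s)[d:d + n] for s, d in zip(signals, delays)]
    return np.mean(aligned, axis=0)

def null_steer_two(sig_a, sig_b, delay):
    """Two-element null: delay one channel and subtract, cancelling a
    source whose inter-channel delay matches `delay` exactly."""
    n = len(sig_a) - delay
    return np.asarray(sig_a)[delay:delay + n] - np.asarray(sig_b)[:n]
```

Comparing the beamformed signal against the null-steered reference, as described for signals 512 and 522, indicates whether energy is actually arriving from the steered direction (e.g., the user's foot).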
In various examples, any of the functions or methods, and any components of systems (e.g., the controller 310), described herein may be implemented or carried out in a digital signal processor (DSP), a microprocessor, a logic controller, logic circuits, and the like, or any combination of these, and may include analog and/or digital circuit components and/or other components with respect to any particular implementation. Functions and components disclosed herein may operate in the digital domain and certain examples include analog-to- digital (ADC) conversion of analog signals provided, e.g., by microphones, despite the lack of illustration of ADC’s in the various figures. Any suitable hardware and/or software, including firmware and the like, may be configured to carry out or implement components of the aspects and examples disclosed herein, and various implementations of aspects and examples may include components and/or functionality in addition to those disclosed.
Examples disclosed herein may be combined with other examples in any manner consistent with at least one of the principles disclosed herein, and references to "an example,"
"some examples," "an alternate example," "various examples," "one example," or the like are not necessarily mutually exclusive and are intended to indicate that a particular feature, structure, or characteristic described may be included in at least one example. The appearances of such terms herein are not necessarily all referring to the same example.
It is to be appreciated that examples of the methods and apparatuses discussed herein are not limited in application to the details of construction and the arrangement of components set forth in the following description or illustrated in the accompanying drawings. The methods and apparatuses are capable of implementation in other examples and of being practiced or of being carried out in various ways. Examples of specific implementations are provided herein for illustrative purposes only and are not intended to be limiting. Also, the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use herein of "including," "comprising," "having," "containing," "involving," and variations thereof is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. References to "or" may be construed as inclusive so that any terms described using "or" may indicate any of a single, more than one, and all of the described
terms. Any references to front and back, left and right, top and bottom, upper and lower, and vertical and horizontal are intended for convenience of description, not to limit the present systems and methods or their components to any one positional or spatial orientation.
Having described above several aspects of at least one example, it is to be appreciated various alterations, modifications, and improvements will readily occur to those skilled in the art. Such alterations, modifications, and improvements are intended to be part of this disclosure and are intended to be within the scope of the invention. Accordingly, the foregoing description and drawings are by way of example only, and the scope of the invention should be determined from proper construction of the appended claims, and their equivalents.
Claims
1. A headphone system, comprising:
an earpiece;
an acoustic driver coupled to the earpiece to render an audio signal;
a sensor to provide a sensor signal indicative of an aspect of the user's movement; and
a detection circuit configured to receive the sensor signal and to provide a feedback based at least upon the aspect of the user's movement.
2. The headphone system of claim 1 wherein the sensor is a microphone to detect an environmental acoustic signal indicative of the user’s impact with the ground.
3. The headphone system of claim 1 wherein the sensor is an inertial sensor to detect accelerations associated with the user’s impact with the ground.
4. The headphone system of claim 1 wherein the sensor is an impact sensor to detect forces associated with the user’s impact with the ground.
5. The headphone system of claim 1 wherein the sensor is an electromyography sensor to detect muscle activity associated with the user’s stride.
6. The headphone system of claim 1 wherein the feedback is at least one of a positive reinforcement, a negative reinforcement, an instruction, or a warning.
7. The headphone system of claim 1 wherein the feedback is an audible cue injected into the audio signal.
8. The headphone system of claim 7 wherein the audible cue includes a pacing signal, the detection circuit configured to increase or decrease a rate of the pacing signal based upon the aspect of the user's movement.
9. The headphone system of claim 1 wherein the feedback includes a visible indication.
10. A method of providing feedback to a headphone user, the method comprising:
receiving an audio signal to be converted to an acoustic signal;
receiving a sensor signal indicative of an aspect of the user’s movement;
analyzing the sensor signal to detect the aspect of the user’s movement;
providing feedback based upon the aspect of the user's movement.
11. The method of claim 10 wherein providing feedback includes modifying the audio signal and rendering the modified audio signal into the acoustic signal.
12. The method of claim 11 wherein modifying the audio signal includes injecting into the audio signal at least one of a positive reinforcement, a negative reinforcement, an instruction, a warning, or a pacing signal.
13. The method of claim 10 wherein the sensor signal includes at least one of an acoustic signal, a microphone signal, an accelerometer signal, an impact signal, a force indicating signal, and an electromyography signal.
14. The method of claim 10 wherein the sensor signal is indicative of the user’s impact with the ground.
15. The method of claim 10 wherein the sensor signal is indicative of at least one of the user’s impact, pace, or stride.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201862646515P | 2018-03-22 | 2018-03-22 | |
| US62/646,515 | 2018-03-22 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019183223A1 true WO2019183223A1 (en) | 2019-09-26 |
Family
ID=66223797
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2019/023169 Ceased WO2019183223A1 (en) | 2018-03-22 | 2019-03-20 | Audio coach for running injury protection |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2019183223A1 (en) |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5221088A (en) * | 1991-01-22 | 1993-06-22 | Mcteigue Michael H | Sports training system and method |
| US20020107649A1 (en) * | 2000-12-27 | 2002-08-08 | Kiyoaki Takiguchi | Gait detection system, gait detection apparatus, device, and gait detection method |
| US20060251334A1 (en) * | 2003-05-22 | 2006-11-09 | Toshihiko Oba | Balance function diagnostic system and method |
| JP2013215220A (en) * | 2012-04-04 | 2013-10-24 | Asahi Kasei Corp | Walking condition detecting device |
| US20160077547A1 (en) * | 2014-09-11 | 2016-03-17 | Interaxon Inc. | System and method for enhanced training using a virtual reality environment and bio-signal data |
| US20160073951A1 (en) * | 2011-07-13 | 2016-03-17 | Philippe Richard Kahn | Sleep Monitoring System |
| US20160263437A1 (en) * | 2014-08-26 | 2016-09-15 | Well Being Digital Limited | A gait monitor and a method of monitoring the gait of a person |
| CN106166071A (en) * | 2016-07-04 | 2016-11-30 | 中国科学院计算技术研究所 | The acquisition method of a kind of gait parameter and equipment |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19718481 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 19718481 Country of ref document: EP Kind code of ref document: A1 |