
WO2025183907A1 - Smart footwear and system for identifying body events - Google Patents

Smart footwear and system for identifying body events

Info

Publication number
WO2025183907A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
event
pressure
features
subject
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2025/015687
Other languages
French (fr)
Inventor
Cameron A. MERTZ
Paul Knight
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
WL Gore and Associates Inc
Original Assignee
WL Gore and Associates Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by WL Gore and Associates Inc filed Critical WL Gore and Associates Inc
Publication of WO2025183907A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/1036 Measuring load distribution, e.g. podologic studies
    • A61B5/1038 Measuring plantar pressure during gait
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A61B5/112 Gait analysis
    • A61B5/1116 Determining posture transitions
    • A61B5/1117 Fall detection
    • A61B5/1123 Discriminating type of movement, e.g. walking or running
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802 Sensor mounted on worn items
    • A61B5/6804 Garments; Clothes
    • A61B5/6807 Footwear
    • A61B2562/00 Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02 Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0247 Pressure sensors

Definitions

  • the present disclosure relates generally to garments with sensors and associated computer systems. More specifically, the disclosure relates to footwear and computer systems for identifying body events of users wearing the footwear.
  • Wearable electronics and smart garments or apparel are increasingly popular. These smart garments, which include sensors and other electronic components, can be used to collect a wide range of information about the user wearing the garment. Examples of such information include physiologic information, such as the pulse rate and oxygen saturation of a wearer. There remains, however, a continuing need for improved smart garments and associated systems for processing data collected by the smart garments. Garments and systems of these types that are capable of accurately identifying a wider range of information about the wearer would be especially desirable.
  • Smart garments and associated computer systems and methods in accordance with the disclosed examples may provide a number of advantages. For example, they are capable of efficiently and accurately providing useful insights into a wide range of activities and physiologic conditions of a subject.
  • One example is a method for operating a computing system including one or more processors to determine body events of a subject.
  • Embodiments of this example may comprise: receiving, by the one or more processors, data from a plurality of sensors, including data from a plurality of pressure sensors mounted to at least one article positioned on a bottom of a foot of the subject, wherein the data includes first pressure data from a first pressure sensor at a front inside foot position, second pressure data from a second pressure sensor at a front outside foot position, and third pressure data from a third pressure sensor at a rear foot position; processing the data, by the one or more processors, to determine the body event of the subject, including: identifying one or more features associated with the body of the subject based upon the data from the plurality of sensors, including the plurality of pressure sensors mounted to the at least one article; and classifying the one or more features as a body event of the subject, wherein the body event is at least one of a standing event, a weight carrying event, a motion event, a body position event, or a safety event.
  • identifying the one or more features may include one or more of: (i) determining a sum of two or more of the first pressure data, the second pressure data, and the third pressure data, for example to determine a total pressure on the foot of the subject; and/or (ii) determining a difference between two or more of the first pressure data, the second pressure data, and the third pressure data, for example to determine a distribution of pressure on the foot of the subject. Identifying the one or more features may also include one or more of: determining (i) and/or (ii) above at a single point in time, for example a static pressure determination; and/or determining (i) and/or (ii) above over a non-zero period of time, for example a dynamic or temporal-change pressure determination.
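  • As an illustration of the feature calculations above, the Python sketch below computes sum and difference features from the three pressure channels, both at a single point in time and over a window of samples. The function and field names, units and the specific feature choices are assumptions for illustration and are not taken from the disclosure.

```python
import numpy as np

def pressure_features(p_front_inside, p_front_outside, p_rear):
    """Illustrative feature extraction from the three plantar pressure channels.

    Each argument is a 1-D array of samples from one pressure sensor (front
    inside, front outside, rear/heel). The field names, units and the specific
    features computed here are assumptions for illustration only.
    """
    p1, p2, p3 = (np.asarray(p, float) for p in (p_front_inside, p_front_outside, p_rear))

    total = p1 + p2 + p3                 # (i) sum: total pressure on the foot
    front_rear = (p1 + p2) - p3          # (ii) difference: front vs. rear distribution
    inside_outside = p1 - p2             # (ii) difference: inside vs. outside distribution

    return {
        # static features: values at a single point in time (here, the latest sample)
        "static_total": total[-1],
        "static_front_rear": front_rear[-1],
        "static_inside_outside": inside_outside[-1],
        # dynamic/temporal features: behaviour over a non-zero period of time
        "mean_total": total.mean(),
        "total_rate_of_change": np.gradient(total).mean(),
        "front_rear_range": front_rear.max() - front_rear.min(),
    }
```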
  • the method may further comprise receiving, by the one or more processors, weight data representative of a weight of the subject; identifying the one or more features comprises determining a weight, optionally including summing the first, second and third pressure data; and classifying the one or more features comprises classifying the one or more features as a standing event, optionally including a leaning event, when the determined weight is representative of a weight that is at least as great as the weight of the subject.
  • the method may further comprise receiving, by the one or more processors, weight data representative of a weight of the subject, and identifying the one or more features comprises determining a weight, optionally including summing the first, second and third pressure data; and classifying the one or more features comprises classifying the one or more features as a weight carrying event when the determined weight is representative of a weight that is greater than the weight of the subject.
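  • A minimal sketch of the comparisons described in the two embodiments above, assuming the summed pressure data from one foot can be converted to kilograms with a single calibration factor; the calibration constant, tolerance and returned labels are hypothetical.

```python
def classify_weight_event(p1, p2, p3, kg_per_unit, subject_weight_kg, tolerance_kg=2.0):
    """Toy rule sketching the weight comparisons described above.

    The three pressure values are assumed to come from the sensors on one foot
    and to be convertible to kilograms with a single hypothetical calibration
    factor `kg_per_unit`; `tolerance_kg` and the returned labels are likewise
    illustrative choices, not values from the disclosure.
    """
    measured_kg = (p1 + p2 + p3) * kg_per_unit
    if measured_kg > subject_weight_kg + tolerance_kg:
        return "weight carrying event"              # measured weight greater than subject weight
    if measured_kg >= subject_weight_kg - tolerance_kg:
        return "standing event (possibly leaning)"  # at least as great as subject weight
    return "weight partially supported elsewhere (e.g., sitting or using an aid)"
```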
  • receiving data from a plurality of sensors may comprise: receiving first article data from a plurality of pressure sensors mounted to an article positioned on a bottom of a first foot of the subject; and receiving second article data from a plurality of pressure sensors mounted to an article positioned on a bottom of a second foot of the subject; identifying the one or more features comprises identifying temporal changes between the first article data and the second article data; and classifying the one or more features comprises classifying the one or more features as a motion event based on the identified temporal changes between the first article data and the second article data.
  • classifying the one or more features may comprise classifying the one or more features as a walking event, a jogging/running event, a biking event, or a dancing event.
  • identifying the one or more features may comprise identifying one or more gait parameters, wherein the one or more gait parameters includes one or more of velocity or speed, stride length, stride duration, stance time/percentage, swing time/percentage, step time, step width, step asymmetry, or ground reaction force/pressure.
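  • As a sketch of how one such gait parameter might be derived, the following assumes heel strikes appear as peaks in the rear (heel) pressure channel and estimates stride duration and cadence from them; step time, step width and asymmetry would additionally require data from the other foot. The peak-detection settings are illustrative assumptions.

```python
import numpy as np
from scipy.signal import find_peaks  # heel strikes approximated as pressure peaks

def stride_timing_from_heel(heel_pressure, sample_rate_hz, min_peak_height=None):
    """Estimate stride duration and cadence from the rear (heel) pressure channel.

    A heel strike is approximated as a local maximum in the heel pressure
    signal; the peak-detection settings are illustrative assumptions rather
    than details from the disclosure.
    """
    heel = np.asarray(heel_pressure, float)
    peaks, _ = find_peaks(heel, height=min_peak_height,
                          distance=max(1, int(sample_rate_hz // 4)))
    if len(peaks) < 2:
        return None
    stride_s = np.diff(peaks) / float(sample_rate_hz)  # successive heel strikes, same foot
    return {
        "mean_stride_duration_s": stride_s.mean(),
        "cadence_strides_per_min": 60.0 / stride_s.mean(),
        "stride_time_variability": stride_s.std() / stride_s.mean(),
    }
```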
  • identifying one or more features may comprise determining differences between two or more of the first pressure data, the second pressure data, and the third pressure data; and classifying the one or more features comprises classifying the one or more features as a body position event based on the determined differences between the two or more of the first pressure data, the second pressure data, and the third pressure data.
  • classifying the one or more features as a body position event may comprise classifying the one or more features as a forward leaning position, a rearward leaning position, or a sideway leaning position.
  • classifying the one or more features may comprise classifying the one or more features as a safety event.
  • receiving the data may comprise receiving the data from a plurality of sensors mounted to at least one of a sock, a footwear insole, or a footwear article such as a shoe or boot.
  • receiving the data further comprises receiving data from one or more additional sensors, wherein the one or more additional sensors comprises one or more of an accelerometer, a PPG sensor, a light sensor, a heart sensor, a location sensor, a bioimpedance sensor, an EMG sensor, a GPS sensor, an environmental sensor, a bend sensor, or a stretch sensor mounted to the article.
  • identifying features and/or classifying features comprises identifying and/or classifying features based upon the data received from the one or more additional sensors.
  • processing the data may comprise processing the data by a trained model.
  • processing the data comprises processing the data by a trained artificial neural network.
  • Embodiments may further comprise training the trained model using body event calibration data.
  • Another example comprises one or more processors configured to perform the method of any or all of the exemplary embodiments described above.
  • Yet another example is one or more garments such as a sock, a footwear insole, or a footwear article such as a shoe or boot, configured to provide the sensor data of any or all of the exemplary embodiments described above.
  • FIG. 1 is a diagrammatic illustration of a system including a footwear-type smart garment, shown for purposes of example as a sock, and a computer system, in accordance with embodiments.
  • FIG. 2 illustrates exemplary pressure data provided by pressure sensors during a walking event, in accordance with embodiments.
  • FIG. 3 illustrates exemplary pressure data provided by pressure sensors during a jogging/running event, in accordance with embodiments.
  • FIG. 4 illustrates exemplary pressure data provided by pressure sensors during a bicycle riding event, in accordance with embodiments.
  • FIG. 5 illustrates exemplary pressure data provided by pressure sensors during walking, jogging/running and bicycle riding events, in accordance with embodiments.
  • FIGs. 6A-6C illustrate exemplary acceleration and pressure data provided by sensors during a walking event, in accordance with embodiments.
  • FIGs. 7A-7C illustrate exemplary acceleration and pressure data provided by sensors during a jogging/running event, in accordance with embodiments.
  • FIGs. 8A-8C illustrate exemplary acceleration and pressure data provided by sensors during a bicycle riding event, in accordance with embodiments.
  • FIGs. 9A-9C illustrate exemplary acceleration and pressure data provided by sensors during steps of a walking event, in accordance with embodiments.
  • FIG. 10 is a diagrammatic illustration of a method for training a model to identify body events, in accordance with embodiments.
  • FIG. 11 is a diagrammatic illustration of a subject during a standing or leaning body event, in accordance with embodiments.
  • FIG. 12 is a diagrammatic illustration of a subject during a dancing body event, in accordance with embodiments.
  • FIG. 13 is a diagrammatic illustration of a subject during a standing, weight carrying body event, in accordance with embodiments.
  • FIG. 14 is a diagrammatic illustration of a subject during a walking, leaning body event, in accordance with embodiments.
  • FIG. 15 is a diagrammatic illustration of a subject during a squatting or crouching body event, in accordance with embodiments.
  • FIG. 16 is a diagrammatic illustration of a subject during a bicycle riding body event, in accordance with embodiments.
  • FIG. 17 is a diagrammatic illustration of a subject during a sitting body event, in accordance with embodiments.
  • FIG. 18 is a diagrammatic illustration of a subject during a walking or jogging/running body event, in accordance with embodiments.
  • FIG. 19 is a diagrammatic illustration of a subject during an upright standing body event, in accordance with embodiments.
  • FIG. 20 is a diagrammatic illustration of a method for using a model to identify body events, in accordance with embodiments.
  • FIG. 21 is a detailed diagrammatic illustration of a computer system such as that shown in FIG. 1 , in accordance with embodiments.
  • FIG. 1 is a diagrammatic illustration of a system 10 for identifying and classifying body events of a subject, in accordance with embodiments.
  • system 10 includes a footwear-type smart garment 12 coupled to a computer system 14.
  • Smart garment 12 includes an article, shown for example as a sock 16, that is configured to be worn or otherwise positioned adjacent to a foot on the body of a person or other subject (not shown in FIG. 1 ), and a plurality of sensors 18 mounted or otherwise coupled to the sock or foot of the subject.
  • in other embodiments, the article may include a shoe, boot, sandal or other footwear, or an insole or other member configured to be inserted into or otherwise attached to such footwear.
  • the sensors 18 generate sensor data associated with the subject wearing the sock 16.
  • sensors 18 include a plurality of pressure sensors, such as for example pressure sensors 20A, 20B and 20C (collectively referred to as pressure sensors 20).
  • sensors 18, including the pressure sensors 20, are coupled to a data transfer device 22 via transmission channels 24.
  • Transmission channels 24 may be wired or wireless channels.
  • Data transfer device 22, which may be a wired or wireless data transmission device, transfers the sensor data from sensors 18, including the pressure data from the pressure sensors 20, to the computer system 14 via a network 26.
  • electrical conductors (not shown) can couple one or more of the plurality of sensors 18 to one or more others of the plurality of sensors and/or to the computer system 14.
  • although the data transfer device 22 is shown as a component of the sock 16 in FIG. 1, in other embodiments the data transfer device can be mounted to the subject’s body at other locations (e.g., on other elements of clothing), incorporated into the associated sensors 18, and/or can be located off of the subject.
  • Computer system 14 processes the sensor data, including the pressure data from pressure sensors 20, to identify and classify certain physical and/or physiologic activity of the subject wearing the smart garment 12.
  • Embodiments of the computer system 14 can determine certain body events of the subject. Examples of such body events include one or more of a standing event, a weight carrying event, a motion event, a body position event, or a safety event.
  • Standing events may, for example, include leaning (e.g., forward, rearward or sideway) events.
  • Motion events may, for example, include walking, jogging/running, bicycle riding or dancing events.
  • Body position events may, for example, include sitting, leaning, crouching or squatting events.
  • Safety events may include carrying excessive amounts of weight and/or possibly harmful body positions. Body events may also include several different body events (e.g., leaning while standing).
  • the body events can be determined by identifying features associated with the body of the subject based upon the data from sensors 18, including the pressure data from pressure sensors 20, and classifying those features as one or more different types of body events.
  • the computer system 14 processes the sensor data using trained machine learning models to identify and/or classify the body events of the subject.
  • the computer system 14 compares the sensor data to stored body event data representative of the body events to identify and classify, or otherwise determine the body events.
  • Sensor data provided by the smart garment 12 can also be used by the computer system 14 for certain set-up operations, such as to generate calibration and other data used to train the models and/or to generate the stored body event data used to determine the body events. Calibration data and trained models of these types effectively provide digital representations or models of the associated body events or other physical or physiological activities of the subject.
  • the sock 16 has portions located adjacent to a number of different portions or zones of the foot of the subject.
  • Sensor 20A, for example, is located on the inside (e.g., right side of a left sock) and front portion of the sock 16 to provide pressure data on the inside of the front or ball of the subject’s foot.
  • Sensor 20B is shown located on the outside (e.g., left side of a left sock) and front portion of the sock 16 to provide pressure data on the outside of the front or ball of the subject’s foot.
  • Sensor 20C is shown located on the rear portion of the sock 16 to provide pressure data on the rear or heel of the subject’s foot.
  • Embodiments of the sock 16 may include additional sensors 18. The embodiments shown in FIG. 1, for example, include motion sensors 30A and 30B, a galvanic sensor 32, ambient condition sensor 34, electrocardiogram (ECG or EKG) heart sensor 36, photoplethysmography (PPG) sensor 38, impedance sensor 40 and a stretch sensor 42.
  • Other embodiments include alternative or additional sensors, such as for example an implantable sensor, a light sensor, a location (e.g., GPS) sensor, an electromyography (EMG) sensor, and/or a bend sensor.
  • the sensors 18 may be positioned on the inside of the sock 16 so as to contact the skin of the subject, on an outside of the sock, and/or embedded in the material or other substrate of the sock.
  • Motion sensors 30A and 30B are located to provide data representative of the movement of different portions of the subject’s foot and leg.
  • Motion sensors 30A and 30B (collectively referred to as motion sensors 30), which may include Inertial Measurement Units (IMUs) that may have an accelerometer, a gyroscope and/or a magnetometer, may provide motion data or information such as acceleration data, x-axis, y-axis, and/or z-axis movement data and direction or heading data.
  • motion sensor 30A is positioned below the subject’s ankle
  • motion sensor 30B is positioned above the ankle.
  • the galvanic sensor 32 may provide data representative of skin conditions.
  • Ambient condition sensor 34 may provide data representative of conditions such as temperature or humidity.
  • Electrocardiogram sensor 36 may provide data regarding the electrical performance or characteristics of the subject’s heart.
  • Photoplethysmography sensor 38 may provide data representative of physiologic characteristics such as blood flow, heart rate, blood oxygenation and blood pressure.
  • Impedance sensor 40 may provide information representative of the bioimpedance of the subject’s skin or other tissue.
  • Stretch sensor 42, which is shown for example adjacent the heel and lower back portions of the sock 16, may provide data representative of a change in length or bend angle.
  • the sensors 18 may be commercially available or otherwise known devices.
  • Although only one article such as sock 16 (a left foot sock) is shown in FIG. 1, other embodiments include a second article or smart garment such as 12 (e.g., a right foot sock, not shown) coupled to the computer system 14 by the network 26.
  • a second sock or other article can, for example, be substantially the same as or similar to sock 16 and include sensors 20, such as pressure sensors, configured to provide data or information representative of pressures at various foot locations including the inside of the front or ball, the outside of the front or ball, and the rear or heel of the associated foot. Sensor data or information provided by such a second smart garment may enhance the efficiency and accuracy of body event determinations.
  • sensors 18 are positioned at other locations of the sock or body of the subject in other embodiments.
  • the locations of the sensors 18 may be determined based on factors such as optimization of signal strength provided, relevance of sensor data to the events, such as body events desired to be identified and classified, and comfort and fit with respect to the sock 16.
  • although one sensor of each of the various different types, or more than one in the case of pressure sensors 20 and motion sensors 30, is shown for purposes of example, other embodiments include more or fewer sensors 18.
  • sensors 18 may be incorporated onto the sock 16 by approaches and structures such as pockets, adhesive, sewing and/or hook and loop fasteners.
  • the sensors 18 can be incorporated into a sensor harness such as that described in copending U.S. provisional application no. 63/442,886 filed on February 2, 2023, and entitled Electronics Harness for Smart Garments, or co-pending U.S. application no. 17/940,507, filed on September s, 2022, and entitled Garment including Electronic Devices, both of which are incorporated herein by reference in their entirety and for all purposes.
  • one or more, or all of the sensors 18 are wireless devices configured to communicate their associated sensor data to the data transfer device 22 on the sock 16 via the communication channels 24.
  • embodiments of systems in accordance with this disclosure may include a 0.5 inch diameter FSR 400 series force sensor available from Interlink Electronics (e.g., powered by a 3.3v supply potential).
  • a voltage divider circuit may be incorporated between a microprocessor and the sensor so that the sensor’s output voltage may be read by an analog input of the microprocessor.
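  • For illustration, and assuming a common divider arrangement that is not spelled out in the text (the FSR in series with a fixed resistor, with the ADC sampling the voltage across the fixed resistor), the sensor resistance can be recovered from the ADC reading as follows; the fixed-resistor value and ADC range are hypothetical.

```python
def fsr_resistance_from_adc(adc_counts, adc_max=4095, v_supply=3.3, r_fixed=10_000.0):
    """Recover the force-sensing resistor (FSR) resistance from a divider reading.

    Assumed circuit (not specified in the text): the FSR is in series with a
    fixed resistor `r_fixed`, the divider is driven by `v_supply`, and the
    microprocessor's ADC samples the voltage across the fixed resistor, so
    V_out = V_supply * R_fixed / (R_fsr + R_fixed). The ADC range and the
    10 kOhm fixed resistor are hypothetical values.
    """
    v_out = v_supply * adc_counts / adc_max
    if v_out <= 0:
        return float("inf")  # no measurable force: FSR resistance is effectively open
    return r_fixed * (v_supply - v_out) / v_out

# Example: a mid-scale ADC reading implies R_fsr roughly equal to the fixed resistor
print(round(fsr_resistance_from_adc(2048)))  # ~10000 ohms
```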
  • Laminated pressure/force sensors that can be incorporated into systems of these types are generally known and commercially available. Devices of these types may, for example, change resistance based upon changes in shape and/or pressure. Additionally and alternatively, capacitive type force/pressure sensors, which are also known and commercially available, can be incorporated into systems in accordance with this disclosure.
  • the data transfer device 22 may be in close proximity to the sensors 18, such as for example a mobile phone or device.
  • one or more, or all of the sensors 18 are wireless devices configured to communicate the associated sensor data directly to the computer system 14 via the communication channels 24.
  • the data transfer device 22 may include an electronic component configured to be coupled to one or more, or all of the sensors 18, for example by a releasable connector plug (not shown).
  • Such an electronic component may be configured to be coupled to the sensors 18 so as to facilitate electrical and mechanical connection of the electronic component and the disconnection of the electronic component from the sensors 18.
  • the electronic component may for example include a wireless transmitter to transmit sensor data from the sensors 18 to the computer system 14 via the network 26.
  • the electronic component may include electronic memory that stores the sensor data from the sensors 18, for download or other transfer to the computer system 14.
  • Exemplary connector plugs and electronic components of these types are disclosed, for example, in the above identified U.S. provisional application no. 63/442,886 that is incorporated herein.
  • Data transfer device 22 may also transfer data from the computer system 14 to one or more, or all of the sensors 18 in embodiments.
  • the data transfer device 22 may include an electronics module comprising processor, battery, antenna and/or memory, and may be configured to provide all or portions of the processing functionality of the computer system 14 (e.g., to perform the methods 100 and 300 described below).
  • the electronic components may be housed in an enclosure that is waterproof, and releasably attached to the sock 16 by, for example, one or more pockets, hook and loop patches, adhesive or fasteners.
  • U.S. provisional application no. 63/442,886, which is incorporated herein, discloses, for example, structures and approaches, including pockets on bands, for releasably retaining data transfer devices such as 22 on the sock 16.
  • Advantages of releasably retaining all or portions of the data transfer device 22 on the sock 16 include wash isolation and reuse of the sock.
  • although sensors 18 are described above as being mounted to, located on or otherwise configured for use in connection with the sock 16, other embodiments of system 10 include sensors that are configured for use with other structures.
  • auxiliary sensors may be mounted to or located on a removable ankle band or leg band, or pants worn by or positioned on the subject (not shown).
  • FIG. 2 illustrates exemplary pressure data or information provided by pressure sensors such as 20A, 20B and 20C on a foot of a subject (e.g., at locations corresponding to front inside, front outside and heel) while the subject is walking.
  • the data of FIG. 2, for example, may be representative of a normal walking pace over some uneven ground.
  • FIG. 3 illustrates exemplary pressure data or information provided by pressure sensors such as 20A, 20B and 20C on a foot of a subject while the subject is running/jogging.
  • the data of FIG. 3 may be representative of a normal or typical jogging pace on pavement, on level, uphill and downhill slopes.
  • FIG. 4 illustrates exemplary pressure data or information provided by pressure sensors such as 20A, 20B and 20C on a foot of a subject while the subject is riding a bicycle.
  • FIG. 5 illustrates exemplary pressure data or information provided by pressure sensors such as 20A, 20B and 20C on a foot of a subject while the subject is walking, running/jogging and riding a bicycle.
  • the pressure data from sensors 20A, 20B and 20C are distinct and different from one another.
  • the pressure data from the sensors 20 are also distinct and different across the walking, running/jogging and bicycle riding events (e.g., the pressure data during the running/jogging event and the bicycle riding event are different than the pressure data during the walking events).
  • FIGs. 6A-6C illustrate exemplary and corresponding ankle acceleration data from motion sensors such as 30A and/or 30B, and pressure data from pressure sensors 20A, 20B and/or 20C, during a walking event.
  • FIG. 6A represents the ankle accelerations by axis (without particular orientation of the sensors).
  • FIG. 6B represents the combination or average, such as sum of squares, of the ankle accelerations shown in FIG. 6A.
  • FIG. 6C represents the corresponding pressure data from each of the sensors 20A, 20B and 20C.
  • the data of FIGs. 6A-6C may be representative of normal walking pace over some uneven ground. Among other information, a clear signal for steps, and flat foot landings, are evident from the data shown in FIGs. 6A-6C.
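  • A minimal sketch of one such combination of the per-axis accelerations, of the kind used for the combined traces in FIGs. 6B, 7B and 8B, is the root of the sum of squares, which yields a magnitude that does not depend on sensor orientation; whether the figures use exactly this quantity or a scaled or averaged variant is not specified.

```python
import numpy as np

def acceleration_magnitude(ax, ay, az):
    """Combine per-axis ankle accelerations into one orientation-independent
    signal: the root of the sum of squares of the three axes."""
    ax, ay, az = (np.asarray(v, float) for v in (ax, ay, az))
    return np.sqrt(ax ** 2 + ay ** 2 + az ** 2)
```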
  • FIGs. 7A-7C illustrate exemplary and corresponding ankle acceleration data from motion sensors such as 30A and/or 30B, and pressure data from pressure sensors 20A, 20B and/or 20C, during a running/jogging event.
  • FIG. 7A represents the ankle accelerations by axis (without particular orientation of the sensors).
  • FIG. 7B represents the combination or average, such as sum of squares, of the ankle accelerations shown in FIG. 7A.
  • FIG. 7C represents the corresponding pressure data from each of the sensors 20A, 20B and 20C.
  • the data of FIGs. 7A-7C may be representative of a normal or typical jogging pace on pavement, on level, uphill and downhill slopes. Among other information, a clear signal for steps is evident from the data shown in FIGs. 7A-7C.
  • FIGs. 8A-8C illustrate exemplary and corresponding ankle acceleration data from motion sensors such as 30A and/or 30B, and pressure data from pressure sensors 20A, 20B and/or 20C, during a biking event.
  • FIG. 8A represents the ankle accelerations by axis (without particular orientation of the sensors).
  • FIG. 8B represents the combination or average, such as sum of squares, of the ankle accelerations shown in FIG. 8A.
  • FIG. 8C represents the corresponding pressure data from each of the sensors 20A, 20B and 20C.
  • the data of FIGs. 8A-8C may be representative of a relatively slow pace or effort. Among other information, a clear signal for cadence is evident from the data shown in FIGs. 8A-8C.
  • FIGs. 9A-9C illustrate detailed exemplary and corresponding ankle acceleration data from motion sensors such as 30A and/or 30B, and pressure data from pressure sensors 20A, 20B and/or 20C, during several steps of a running/jogging event.
  • FIG. 9A represents the ankle accelerations by axis (without particular orientation of the sensors).
  • FIG. 9B represents the combination or average, such as sum of squares, of the ankle accelerations shown in FIG. 9A.
  • FIG. 9C represents the corresponding pressure data from each of the sensors 20A, 20B and 20C.
  • a clear signal for landings on the outside of the foot is evident from the data shown in FIGs. 9A-9C.
  • gait or stride parameters/positions can be determined and used in connection with the system and method described herein. These gait parameters may include one or more of velocity, step counting, stride length, stride duration, stance time/percentage, swing time/percentage, step time, step width, step asymmetry, and/or ground reaction force/pressure.
  • a subject’s stride can, for example, be characterized by a sequence of positions of one of their legs and feet during a gait cycle, where those positions include, in sequence, a heelstrike, footflat, midstance, pushoff, acceleration, midswing and deceleration.
  • the length of the gait cycle of that foot can be defined as the distance between the locations of the heel at two sequential heelstrike positions.
  • Step length can be defined as the distance between the locations of the heels of the left and right feet at sequential heelstrike positions (e.g., approximately one- half of a gait cycle).
  • Step angle of each foot can be defined as the angle of a midline through the subject’s foot (e.g., from toes to heel) with respect to a centerline of the path they are moving along.
  • Step width for each of the feet can be defined as the distance between the centerline of the path, and a location on the midline of their foot (e.g., at the toes).
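  • The geometric definitions above can be sketched as follows, assuming 2-D ground-plane heel coordinates and path directions are available; a real system would have to estimate these positions from the sensor data, and the inputs shown here are hypothetical.

```python
import numpy as np

def stride_and_step_geometry(heel_right_a, heel_left, heel_right_b, foot_midline_vec, path_dir):
    """Geometric sketch of the gait definitions above, in 2-D ground-plane coordinates.

    heel_right_a, heel_right_b : right-heel positions at two sequential right heelstrikes
    heel_left                  : left-heel position at the intervening left heelstrike
    foot_midline_vec           : vector from the heel to the toes of one foot
    path_dir                   : direction of the path the subject is moving along
    """
    heel_right_a, heel_left, heel_right_b = (np.asarray(p, float) for p in
                                             (heel_right_a, heel_left, heel_right_b))
    path_dir = np.asarray(path_dir, float) / np.linalg.norm(path_dir)
    midline = np.asarray(foot_midline_vec, float)

    gait_cycle_length = np.linalg.norm(heel_right_b - heel_right_a)  # stride of the right foot
    step_length = np.linalg.norm(heel_left - heel_right_a)           # roughly half a gait cycle
    cos_angle = np.dot(midline, path_dir) / np.linalg.norm(midline)
    step_angle_deg = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))  # foot midline vs. path

    return gait_cycle_length, step_length, step_angle_deg
```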
  • Embodiments of computer system 14 use one or more models to identify features of body events associated with sensor data provided by the sensors 18 of the subject wearing the smart garment 12, and to classify the identified features as representing one or more event types such as particular types of body events.
  • one or more of the models are machine learning models.
  • one or more of the models are artificial neural networks.
  • one or more of the models are statistical models.
  • the models may effectively be digital representations of the body events based upon sensor values that correspond to or characterize the body events.
  • Sensor data received from the sensors 18 is applied to and/or compared to the models to identify and classify features as being associated with one or more body events.
  • computer system 14 includes models characterizing body events and associated body event features for (1) a standing event, (2) a weight carrying event, (3) a motion event, (4) a body position event, and/or (5) a safety event.
  • the computer system 14 uses body event features in connection with the models to identify and classify the body events.
  • Body event features may be portions of the sensor data from one or more of the sensors 18 that effectively define particular types or aspects of body events. For example, the magnitudes or intensities of the sensor data may define different body events.
  • the relative timing of activity defined by the sensor data either alone (e.g., from one sensor), or in combination with activity or lack of activity defined by other sensors, may define different features of body events.
  • one or more of the body event features may be defined by the activities of a plurality of the sensors 18, including for example pressure sensors 20.
  • FIG. 10 is a diagrammatic illustration of a method 100 for training a machine learning model to identify body events.
  • the model uses body event features to classify and describe certain body events in accordance with embodiments.
  • the method 100 includes process 102 for collecting sets of calibration or training data, process 104 to filter the sets of training data, process 106 for contextualizing the sets of training data, process 108 for analyzing the sets of training data to determine body event features, process 110 for providing the sets of training data and features to a machine learning model for training, process 112 for generating predicted body events, process 114 for comparing the predicted body events with the stored, actual, and/or labeled body event associated with the training data, process 116 for adjusting parameters of the machine learning model, and process 118 for determining whether training of the machine learning model has been completed.
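  • A compact sketch of how processes 102-118 might look for a classical supervised model (here a KNN classifier via scikit-learn, one of the model types named below); the data layout, feature set and the collapse of the iterative adjustment into a single fit are assumptions for illustration, not details from the disclosure.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier  # a "classical" model type mentioned below

def extract_features(window):
    """Process 108 (sketch): summarize one labeled recording as a feature vector."""
    p1, p2, p3 = (np.asarray(window[k], float) for k in
                  ("p_front_inside", "p_front_outside", "p_rear"))
    total = p1 + p2 + p3
    return [total.mean(), total.std(), (p1 + p2 - p3).mean(), np.gradient(total).mean()]

def train_body_event_model(training_windows):
    """Processes 102-118 (sketch): build a classifier from labeled calibration data.

    `training_windows` is assumed to be a list of dicts, each holding the three
    pressure channels for one recording plus a "label" naming the body event
    (e.g., "walking event", "standing event"). For a KNN-style model the
    iterative adjustment of processes 112-118 collapses into a single fit; a
    neural network would instead loop over predictions, a loss function and
    parameter updates until an accuracy criterion is met.
    """
    X = np.array([extract_features(w) for w in training_windows])  # processes 104-108
    y = np.array([w["label"] for w in training_windows])           # labeled body events
    model = KNeighborsClassifier(n_neighbors=3)
    model.fit(X, y)                                                # process 110
    accuracy = (model.predict(X) == y).mean()                      # processes 112-114 sanity check
    return model, accuracy
```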
  • although FIG. 10 illustrates a selected group of processes for the method 100, there can be many alternatives, modifications, and variations.
  • some of the processes may be expanded and/or combined. Other processes may be inserted. Some of the processes may be optional and need not be performed. Depending upon the embodiment, the sequence of the processes may be interchanged, with others replaced.
  • some or all of the processes 102, 104, 106, 108, 110, 112, 114, 116 and 118 are performed by a computing device or a processor directed by instructions stored in memory.
  • some or all processes of the method 100 are performed according to instructions stored in a non-transitory computer-readable medium.
  • while processes 108 and 110 are shown in an order that describes training classical supervised machine learning model types, such as KNN or decision trees, the order may be switched for an artificial neural network or deep learning approach. While process 108 determines body event features as variables for classical models, in the case of a convolutional neural network the contextualized data may be presented directly to the classifier, which determines features via, for example, hidden or pooling layers.
  • each of the sets of training data includes sensor data relating to body event features of the associated body events, including data from pressure sensors 20, and knowledge of the types of those body events (e.g., the training data sets are labeled with the associated type of body event).
  • the one or more sets of training data can be stored in a memory element as stored body event data for later comparison with a body event of a subject.
  • the calibration or training data collected at process 102 may be sensor data from the sensors 18 when the sensors are at one or more known states or body poses or positions corresponding to the body events.
  • the known states or body poses may be static states corresponding to predetermined positions and associated physiologic data of the body of the subject wearing one or more socks 16 during the body events.
  • the known states may be dynamic or time-changing states corresponding to movement of the body of the subject and associated physiologic data during the associated body events.
  • For example, a subject’s left and right foot at each of a plurality of gait or stride stages through a walking event can be identified.
  • the right foot heelstrike, flatfoot, midstance, pushoff, acceleration, midswing and deceleration stages are identified.
  • Sensor data from sensors 18, including data from each of pressure sensors 20A, 20B and 20C from one or two socks 16 can be determined at these and other stages of the walking event (e.g., a type of motion event) for a subject, and used as training data.
  • Sensor data collected from the sensors 18 when the subject is at gait or stride positions may be considered baseline position sensor data, and may be used for calibration or reference.
  • training data may also be collected from the sensors 18 when the subject is in other static positions or poses, such as for example (1) when they are standing upright, and optionally not moving (e.g., a standing event), (2) when they are holding or otherwise carrying an item (a weight carrying event), (3) when they are jogging/running or riding a bicycle (other examples of motion events), (4) when they are leaning (e.g., at their waist), such as for example forward, backward or sideways, or sitting or lying (body position events), and/or (5) combinations of one or more of (1) - (4), such as for example a weight carrying event simultaneously with a body position event (safety events).
  • training data can also be collected from the sensors 18 when the subject is at a plurality of static positions or poses for each of the body events that the system 10 is configured to identify and classify.
  • the calibration data can be collected from the sensors 18 when the subject is at two or more of a sequence or series of positions or poses they typically have during the associated body events (e.g., the body event is represented by a sequence of static poses corresponding to poses of an actual body event).
  • gait or stride parameter characteristics such as those described above can be determined from one or more of the socks 16 of a subject, and used to identify features of the subject’s body events.
  • the training data collected during the process 102 may be used by the computer system 14 to characterize or “understand” the orientation of sensors 18 with respect to unique geometry of the subject or other subject providing the training or calibration information.
  • the training data may be used to adjust or compensate for particular or unique fits of the sock 16 on the subject.
  • the subject may orient themselves during the process 102 in a known direction, such as for example north. Accuracy of the training data generated during process 102 may be enhanced, for example to compensate for drift in the sensors 18, if the subject periodically recalibrates to the known orientation.
  • Training data collected during the process 102 may also include the weight of the subject.
  • Certain body events such as for example weight carrying events and safety events, can be determined when the sensors 18, including the pressure sensors 20, provide information used by the computer system 14 to determine that the weight represented by the pressure data is greater than the weight of the subject.
  • training data can be collected from one or more of the pressure sensors 20, including pressure sensors 20A-20C, when the subject is at a first stage such as the completion of a walking event gait cycle, at the time of a right foot deceleration.
  • Training data can be collected from one or more of the pressure sensors 20, including pressure sensors 20A-20C, during a subsequent stage or stages of the walking event, such as at the time of the left foot heelstrike, flatfoot, midstance, pushoff, acceleration and midswing.
  • Sensor data from the sensors 18, including pressure data from the pressure sensors 20 may differ, for example in magnitude and timing sequence, for each of the different types of motion events. For example, running/jogging motion events and biking events may be distinguishable by different sensor data from pressure sensors 20.
  • pressure data from the pressure sensors 20 can be collected when the subject is instructed to perform each of one or more of the different types of body events.
  • FIGs. 11-19 illustrate a subject in a variety of different body events.
  • FIG. 11 shows the subject in a body position event such as a leaning event.
  • the pressure data collected by pressure sensors 20 on each of the two socks 16 may be different because the subject is leaning more heavily on one foot.
  • the pressure data collected by the pressure sensors 20 on each sock 16 may also be different because the subject may not be applying evenly distributed weight across either foot.
  • FIG. 12 shows the subject in a motion event such as dancing.
  • the dynamic pressure data collected by the pressure sensors 20 on each of the two socks will be different, and will likely vary in a somewhat cyclic time-varying manner.
  • the pressure data collected by the pressure sensors 20 on each sock 16 may vary in a somewhat cyclic time-varying manner.
  • FIG. 13 shows the subject in a standing, weight carrying event.
  • the pressure data collected by the pressure sensors 20 on each of the two socks may be generally the same since the subject may have their weight relatively evenly distributed across their two feet.
  • the pressure data collected by the pressure sensors 20 on each sock 16 may for similar reasons be distributed in a manner representative of an upright and standing weight distribution on each associated foot.
  • the weight carrying nature of the event can be determined by the weight measured by the pressure sensors 20 being greater than the known weight of the subject.
  • the forward-rearward weight distribution and/or side-to-side weight distribution with respect to centered or standing weight distribution, as determined from the pressure sensor data, may indicate that the weight being carried by the subject is in front, behind, or to the side of the subject (e.g., forward leaning, rearward leaning or sideway leaning weight carrying events).
  • in other examples, the subject’s weight may not be relatively evenly distributed across their two feet. Garments such as socks and systems in accordance with this disclosure may be beneficial under these circumstances.
  • weight can be determined by summing information from all the sensors on each foot, and then summing the information from both feet to correlate to a predetermined starting weight.
  • Calibration may include the use of a known weight, and then standing on each foot independently.
  • the predetermined or known weight may be determined by the calibration.
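  • A minimal sketch of the weight determination just described, assuming a single calibration factor converts summed pressure readings to kilograms; calibration with a known weight while standing on each foot independently, as described above, could instead yield per-foot or per-sensor factors.

```python
def estimate_total_and_carried_weight(left_foot_pressures, right_foot_pressures,
                                      kg_per_unit, subject_weight_kg):
    """Sketch of the weight determination described above.

    Sums the readings from all pressure sensors on each foot, sums the two
    feet, converts to kilograms with a hypothetical calibration factor
    `kg_per_unit`, and reports how much the total exceeds the subject's
    known weight.
    """
    total_reading = sum(left_foot_pressures) + sum(right_foot_pressures)
    measured_kg = total_reading * kg_per_unit
    carried_kg = max(0.0, measured_kg - subject_weight_kg)
    return measured_kg, carried_kg
```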
  • Garments such as socks and systems in accordance with this disclosure can also be configured to determine the degree or amount to which the subject is carrying weight in front, behind, to the side, and/or close to their body. Similarly, whether the subject’s arms are extended may also be determined. As an example, the garment and system can determine a situation where a two hundred pound subject is carrying a twenty five pound weight with their arms fully extended. Per the description below, information of this type can be used to provide safety and/or risk or injury assessments associated with activities of the subject.
  • Safety events may also be identified from data collected by a subject in a standing, weight carrying event such as that shown and described above in connection with FIG. 13.
  • risk scores can be stored by the computer system 14.
  • the risk scores, which may represent a range of severity or other materiality criterion related to safety, can be characterized by the amount of weight being carried (e.g., with respect to the weight and/or other characteristics such as age and gender of the subject), and/or the amount of leaning (e.g., in one or more of the forward, rearward or sideway directions) as determined by the distribution of weight on the subject’s foot or feet from the pressure sensors 20.
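  • Purely as an illustration of how such a risk score might be composed from the quantities named above (carried weight and amount of leaning), the sketch below uses an invented 0-100 scale and weighting; none of the thresholds or weights come from the disclosure.

```python
def risk_score(carried_weight_kg, subject_weight_kg, lean_fraction):
    """Illustrative 0-100 risk score from carried weight and amount of leaning.

    `lean_fraction` is assumed to be a 0-1 measure of how far the plantar
    pressure distribution is shifted (forward/rearward or sideway) from a
    centered standing distribution. The 25%-of-body-weight cap and the
    0.6/0.4 weighting are invented for illustration; the disclosure only
    states that risk scores may reflect the amount of weight carried, the
    amount of leaning, and subject characteristics such as age and gender.
    """
    load_component = min(1.0, carried_weight_kg / (0.25 * subject_weight_kg))
    lean_component = min(1.0, max(0.0, lean_fraction))
    return 100.0 * (0.6 * load_component + 0.4 * lean_component)
```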
  • FIG. 14 shows the subject in a combination walking and leaning event, while walking with the help of a walking aid, for example, a walker or a cane.
  • Characteristics of pressure data from pressure sensors 20 during a walking event such as those described above in combination with pressure data characteristics of a leaning event such as those described above can be used to identify the walking and leaning event.
  • the subject’s use of the aid, such as the walker, may be determined from the feature that the total weight determined from the pressure sensor data is less than the known weight of the subject, since the subject is partially supporting their weight on the walker.
  • a walking event with the subject using a cane can be determined in a similar manner.
  • the characteristics of the pressure sensor data representative of the walking event will be asymmetrical and cyclic since the subject may have the cane in only one hand, resulting in the use of the cane to support weight otherwise carried on the foot on the side of the subject with the cane.
  • FIG. 15 shows the subject in a squatting or crouching body position event.
  • Pressure sensors 20 may collect relatively static data representative of a somewhat uneven weight distribution between the two feet, with the weight asymmetrically distributed on each foot.
  • FIG. 16 shows the subject in a bicycle riding motion event.
  • Sensor data from pressure sensors 20 representative of the bike riding event is described above in connection with FIGs. 8A-8C.
  • the sensor data provided by pressure sensors 20 will be cyclic, and may be representative of higher pressures toward the front of the foot if the rider places the front of their foot on the bicycle pedals.
  • FIG. 17 shows the subject in a sitting body position event.
  • Sensor data from the pressure sensors 20 representative of the sitting event may be relatively static, and representative of weight less than the weight of the subject, and in some examples substantially less than the weight of the subject, since the subject is sitting, and the chair is supporting their weight.
  • FIG. 18 shows the subject in a walking or jogging/running motion event.
  • Sensor data from pressure sensors 20 representative of the walking and jogging/running events is described above in connection with FIGs. 6A-6C and 7A-7C.
  • the sensor data provided by pressure sensors 20 will be cyclic, and may be representative of back-to-front shifting pressures on each foot.
  • FIG. 19, for example, shows the subject in a standing body position event.
  • Sensor data from pressure sensors 20 representative of the standing event may be generally static, and relatively evenly distributed on each foot and between the two feet.
  • Calibration and/or training data may be collected from sensors 18 at process 102 when the sensors are at dynamic states.
  • a subject wearing the sock 16 can move through a range of positions corresponding to any one or more of the body events that the computer system 14 is configured to classify, and the sensor data collected from the sensors 18 through that range of positions during the motion can be used to characterize the body event.
  • Training data for any or all of the body events the computer system 14 is configured to identify and classify can be collected in the manners described above by process 102.
  • all or portions of the training data may be filtered.
  • Conventional or otherwise known filtering approaches can be performed at process 104.
  • outlying data can be removed, and/or smoothing functions can be applied (e.g., to dynamic state data).
  • Other non-limiting examples of filtering that may be performed at process 104 include noise removal and/or band pass filtering.
  • Process 104 is optional, and in embodiments only some of the training data, such as for example certain types, or none of the training data, is filtered at process 104.
  • all or portions of the training data may be contextualized or converted into context data.
  • contextualizing the data at process 106 may include converting all or portions of the training data into other forms that may be useful in connection with the method 100.
  • the training data may be converted to more human relatable values at process 106.
  • training data in the form of acceleration data, or acceleration data and gyroscope data, may be converted into joint angles by approaches such as Kalman calculations.
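  • The sketch below uses a complementary filter, a simpler stand-in for the Kalman-type calculations mentioned above, to convert accelerometer and gyroscope samples into a single segment angle; the axis assignments, units (gyroscope in degrees per second) and blend factor are assumptions for illustration.

```python
import numpy as np

def segment_angle(accel_y, accel_z, gyro_x_dps, sample_rate_hz, alpha=0.98):
    """Estimate a single segment/joint angle (degrees) about one axis.

    This is a complementary filter, used as a compact stand-in for the
    Kalman-type calculations mentioned above: the gyroscope (degrees per
    second) is integrated for short-term accuracy and blended with the
    gravity-derived accelerometer angle for long-term stability.
    """
    accel_y, accel_z, gyro_x_dps = (np.asarray(v, float) for v in (accel_y, accel_z, gyro_x_dps))
    dt = 1.0 / sample_rate_hz
    angle = np.degrees(np.arctan2(accel_y[0], accel_z[0]))  # initial tilt from gravity
    angles = [angle]
    for ay, az, gx in zip(accel_y[1:], accel_z[1:], gyro_x_dps[1:]):
        accel_angle = np.degrees(np.arctan2(ay, az))        # tilt implied by gravity alone
        angle = alpha * (angle + gx * dt) + (1.0 - alpha) * accel_angle
        angles.append(angle)
    return np.array(angles)
```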
  • Process 106 is optional, and in embodiments only some of the training data, such as for example certain types, or none of the training data, is contextualized at process 106.
  • at process 108, the contextualized training data is evaluated to determine one or more features associated with the body event.
  • the magnitude of the pressure data from one or more of pressure sensors 20A-20C may be a body event feature for use with a classical or deep learning machine learning model.
  • the rate of change of the pressure data provided by pressure sensors 20A-20C of one sock 16, or between the pressure sensors of two socks may be a body event feature.
  • Process 108 yields variables likely to predict classifications of body events by the one or more machine learning models.
  • the one or more body event features indicate or define the type of the body event.
  • identified body event features associated with the pressure sensors 20 may represent any of the body events described above.
  • Identified body event features associated with other sensors 18, alone or in combination with those associated with the pressure sensors 20, for body events of these types may also be identified at the process 110.
  • one or more of the sets of training data are provided to train the machine learning model.
  • if the training data was filtered by process 104 and/or contextualized by process 106, the filtered and/or contextualized training data may be provided to the machine learning model at process 110.
  • Labels or other information identifying the type of body event associated with the training data are also provided to the machine learning model at process 110.
  • the machine learning model may be a decision tree network, logistic regression classifier, an artificial neural network, a convolutional neural network, a recurrent neural network, a modular neural network, or any other suitable type of machine learning model.
  • the training data of each set may be analyzed by the machine learning model to determine one or more features associated with the body event at process 110.
  • at process 112, a predicted body event is generated by the machine learning model based upon the event features identified at process 110. For example, in generating the predicted event, one or more parameters related to the one or more event features are calculated by the machine learning model (e.g., weight values associated with various layers of connections).
  • the predicted body event generated at the process 112 may be one of the body events described above. Alternatively, an undetermined type of body event, or no body event, may be the predicted body event generated at the process 112.
  • the predicted body event is compared to the actual type of body event to determine an accuracy of the predicted body event.
  • the accuracy is determined by using a loss function or a cost function of the set of training data.
  • the one or more parameters related to the body event features are adjusted by the machine learning model.
  • the one or more parameters may be adjusted to reduce (e.g., minimize) the loss function or the cost function.
  • if the process 118 determines that training of the machine learning model is not yet completed, then the method 100 returns to the process 108 in an iterative manner until the training is determined to be completed. In embodiments, if the process 118 determines that training of the machine learning model is completed, then the method 100 stops as indicated at process 120.
  • the trained machine learning model effectively possesses existing knowledge of which body event features are desirable or useful in terms of identifying and classifying body events.
  • each of one or more machine learning models may be trained for purposes of identifying one, or more than one but less than all, of the plurality of different types of body events.
  • machine learning models may be trained by the method 100 for purposes of identifying and classifying one, or more than one, type of body events.
  • FIG. 20 is a diagrammatic illustration of an exemplary method 300 for identifying and classifying, or determining body events, in accordance with embodiments.
  • the method 300 includes process 302 for providing the sensor data to one or more models, process 304 for processing the sensor data by the model, and process 306 for determining or identifying a body event by the one or more models.
  • the model is a statistical model.
  • one or more, or all of the models may be a machine learning model, such as for example an artificial neural network trained in accordance with the method 100 described above in connection with FIG. 10.
  • although method 300 is described as using a selected group of processes, there can be many alternatives, modifications, and variations.
  • some of the processes may be expanded and/or combined. Other processes may be inserted. Depending upon the embodiment, the sequence of processes may be interchanged, with others replaced.
  • some or all processes of the method 300 are performed by a computing device or a processor directed by instructions stored in memory. As an example, some or all processes of the method are performed according to instructions stored in a non-transitory computer- readable medium.
  • the sensor data from the smart garment 12 is provided to the model.
  • the sensor data are collected from the sensors 18, including the pressure sensors 20.
  • the sensor data includes at least sensor data from the pressure sensors 20A, 20B and 20C.
  • the sensor data is processed similarly to processes 104, 106 and 108, in that process 304 filters, contextualizes and generates features from the sensor data.
  • the model determines or identifies and classifies body events based upon the processed sensor data. For example, in embodiments the model identifies or classifies the sensor data as being representative of at least one of the body events described herein.
  • body event features or other body event data associated with the different types of body events can be identified (e.g., by trained machine learning models or human labeling), and stored in association with the body events.
  • each of the different types of body events can be defined, characterized or represented by a set of one or more body event features that are unique to the body events.
  • Sensor data received from the sensors 18 on the smart garment 12, including sensor data from pressure sensors 20 such as pressure sensors 20A, 20B and 20C, may then be compared to the stored body event features to identify and classify the body events represented by the sensor data.
  • Feature extraction, pattern recognition and other processing methodologies may be used in connection with such approaches for identifying and classifying body events.
  • FIG. 21 is a block diagram illustrating physical components (e.g., hardware) of an exemplary computer system 14, in accordance with embodiments.
  • the computer system 14 may include at least one processing unit 802 and a system memory 804.
  • the system memory 804 may comprise, but is not limited to, volatile storage (e.g., random access memory), non-volatile storage (e.g., read-only memory), flash memory, or any combination of such memories.
  • the system memory 804 may include an operating system 805 and one or more program modules 806 such as a sensing and processing component 820.
  • the operating system 805 may be suitable for controlling the operation of the computer system 14.
  • embodiments of the disclosure may be practiced in conjunction with a graphics library, other operating systems, or any other application program and are not limited to any particular application or system.
  • This basic configuration is illustrated in FIG. 21 by those components within a dashed line 808.
  • the computer system 14 may have additional features or functionality.
  • the computer system 14 may also include additional data storage devices (removable and/or non-removable). Such additional storage is illustrated in FIG. 21 by a removable storage device 809 and a non-removable storage device 810.
  • program modules 806 may perform processes including, but not limited to, the aspects, as described herein, e.g., the processing of the sensor data from sensors 18 and the methods 100 and 300.
  • embodiments of the disclosure may be practiced in an electrical circuit comprising discrete electronic elements, packaged or integrated electronic chips containing logic gates, a circuit utilizing a microprocessor, or on a single chip containing electronic elements or microprocessors.
  • embodiments of the disclosure may be practiced via a system-on-a-chip (SOC) where each or many of the components illustrated in FIG. 21 may be integrated onto a single integrated circuit.
  • Such an SOC device may include one or more processing units, graphics units, communications units, system virtualization units and various application functionality all of which are integrated (or “burned”) onto the chip substrate as a single integrated circuit.
  • the functionality described herein with respect to the capability of a client to switch protocols may be operated via application-specific logic integrated with other components of the computer system 14 on the single integrated circuit (chip).
  • Embodiments of the disclosure may also be practiced using other technologies capable of performing logical operations such as, for example, AND, OR, and NOT, including but not limited to mechanical, optical, fluidic, and quantum technologies.
  • embodiments of the disclosure may be practiced within a general purpose computer or in any other circuits or systems.
  • the computer system 14 may also have one or more input device(s) 812 such as visual image sensors, audio sensors, a sound or voice input device, a touch or swipe input device, etc.
  • the output device(s) 814 such as a display, speakers, etc. may also be included.
  • the aforementioned devices are examples and others may be used.
  • the computer system 14 may include one or more communication connections 816 allowing communications with other computing devices 850 (e.g., computing devices 128 and/or 130). Examples of suitable communication connections 816 include, but are not limited to, radio frequency (RF) transmitter, receiver, and/or transceiver circuitry; universal serial bus (USB), parallel, and/or serial ports.
  • Computer readable media may include computer storage media.
  • Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, or program modules.
  • the system memory 804, the removable storage device 809, and the non-removable storage device 810 are all computer storage media examples (e.g., memory storage).
  • Computer storage media may include RAM, ROM, electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, optical storage, magnetic storage devices, or any other article of manufacture which can be used to store information, and which can be accessed by the computer system 14. Any such computer storage media may be part of the computer system 14.
  • Computer storage media does not include a carrier wave or other propagated or modulated data signal.
  • Communication media may be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media.
  • modulated data signal may describe a signal that has one or more characteristics set or changed in such a manner as to encode information in the signal.
  • communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media.
  • Smart garments, systems, methods and other aspects of the technology disclosed herein can be used in connection with the systems and methods described in U.S. Provisional Application No. 63/539,182, filed on September 19, 2023 and entitled Physical Demands Characterization System and Methods (the “’182 application”), which application is incorporated herein by reference in its entirety and for all purposes.
  • the smart garments, systems and methods of the present disclosure can be used in connection with the generation of an assessment of the physical requirements of a job and its tasks as disclosed in the ’182 application.
  • the incorporation of information of the types described herein that can be generated from the smart garments, and inferences that can be drawn from that information, may enhance the capabilities of the systems described in the ’182 application.
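As an illustrative, non-limiting sketch of the iterative training loop referenced in the list above, the following assumes simple hand-crafted pressure features and a logistic-regression-style classifier; the function names, feature choices, learning rate and epoch count are assumptions rather than requirements of the disclosure, and any of the model types identified above (e.g., a decision tree network or an artificial neural network) could be used instead.

    import numpy as np

    def extract_features(pressure_window):
        # pressure_window: array of shape (samples, 3) for pressure sensors 20A, 20B, 20C
        total = pressure_window.sum(axis=1)                    # total foot pressure over time
        front_minus_rear = pressure_window[:, :2].sum(axis=1) - pressure_window[:, 2]
        return np.array([total.mean(), total.std(), front_minus_rear.mean()])

    def train(windows, labels, epochs=200, lr=0.1):
        X = np.array([extract_features(w) for w in windows])   # features from labeled training data
        y = np.array(labels, dtype=float)                      # 1 = body event present, 0 = absent
        w, b = np.zeros(X.shape[1]), 0.0                       # parameters to be adjusted
        for _ in range(epochs):                                # iterate until training is completed
            p = 1.0 / (1.0 + np.exp(-(X @ w + b)))             # predicted body event probability
            loss = -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))
            w -= lr * (X.T @ (p - y) / len(y))                 # adjust parameters to reduce the loss
            b -= lr * np.mean(p - y)
        return w, b, loss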

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Dentistry (AREA)
  • Physiology (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The present disclosure relates to systems and methods for identifying body events of a subject using footwear-type garments with pressure sensors. Examples of body events that can be identified include standing events, weight carrying events, motion events, body position events or safety events.

Description

SMART FOOTWEAR AND SYSTEM FOR IDENTIFYING BODY EVENTS
CROSS-REFERENCE TO RELATED APPLICATION
[0001 ] This application claims the benefit of Provisional Application No. 63/559,280, filed February 29, 2024, which is incorporated herein by reference in its entirety for all purposes.
FIELD
[0002] The present disclosure relates generally to garments with sensors and associated computer systems. More specifically, the disclosure relates to footwear and computer systems for identifying body events of users wearing the footwear.
BACKGROUND
[0003] Wearable electronics and smart garments or apparel are increasingly popular. These smart garments, which include sensors and other electronic components, can be used to collect a wide range of information about the user wearing the garment. Examples of such information include physiologic information, such as the pulse rate and oxygen saturation of a wearer. There remains, however, a continuing need for improved smart garments and associated systems for processing data collected by the smart garments. Garments and systems of these types that are capable of accurately identifying a wider range of information about the wearer would be especially desirable.
SUMMARY
[0004] Smart garments and associated computer systems and methods in accordance with the disclosed examples may provide a number of advantages. For example, they are capable of efficiently and accurately providing useful insights into a wide range of activities and physiologic conditions of a subject.
[0005] One example is a method for operating a computing system including one or more processors to determine body events of a subject. Embodiments of this example may comprise: receiving, by the one or more processors, data from a plurality of sensors, including data from a plurality of pressure sensors mounted to at least one article positioned on a bottom of a foot of the subject, wherein the data includes first pressure data from a first pressure sensor at a front inside foot position, second pressure data from a second pressure sensor at a front outside foot position, and third pressure data from a third pressure sensor at a rear foot position; processing the data, by the one or more processors, to determine the body event of the subject, including: identifying one or more features associated with the body of the subject based upon the data from the plurality of sensors, including the plurality of pressure sensors mounted to the at least one article; and classifying the one or more features as a body event of the subject, wherein the body event is at least one of a standing event, a weight carrying event, a motion event, a body position event, and a safety event.
[0006] In embodiments of the example, identifying the one or more features may include one or more of: (i) determining a sum of two or more of the first pressure data, the second pressure data, and the third pressure data, for example to determine a total pressure of the foot of the subject; and/or (ii) determining a difference between two or more of the first pressure data, the second pressure data, and the third pressure data, for example to determine a distribution of pressure on the foot of the subject. Identifying the one or more features may also include one or more of: (i) determining i. and/or ii. above at one time, for example a static pressure determination; and/or (ii) determining i. and/or ii. above over a non-zero period of time, for example a dynamic or temporal changes pressure determination.
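As an illustrative sketch of features (i) and (ii) above, the sum and differences of the three pressure readings can be computed at a single instant or over a window of samples; the variable names and sample values below are assumptions used only for illustration.

    import numpy as np

    def pressure_features(p_front_inside, p_front_outside, p_rear):
        total = p_front_inside + p_front_outside + p_rear              # (i) sum: total foot pressure
        front_to_rear = (p_front_inside + p_front_outside) - p_rear    # (ii) difference: fore/aft distribution
        inside_to_outside = p_front_inside - p_front_outside           # (ii) difference: side-to-side distribution
        return total, front_to_rear, inside_to_outside

    # Static determination at one time instant:
    static_features = pressure_features(12.0, 9.5, 20.3)

    # Dynamic determination over a non-zero period of time (arrays of samples):
    fi, fo, r = np.array([12.0, 14.0, 11.0]), np.array([9.0, 10.0, 9.0]), np.array([20.0, 18.0, 22.0])
    total_t, f2r_t, i2o_t = pressure_features(fi, fo, r)
    temporal_change = np.diff(total_t)                                  # change in total pressure over time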
[0007] In any or all of the above embodiments of the example, the method may further comprise receiving, by the one or more processors, weight data representative of a weight of the subject; identifying the one or more features comprises determining a weight, optionally including summing the first, second and third pressure data; and classifying the one or more features comprises classifying the one or more features as a standing event, including optionally a leaning event when the determined weight is representative of a weight that is at least as great as the weight of the subject.
[0008] In any or all of the above embodiments of the example, the method may further comprise receiving, by the one or more processors, weight data representative of a weight of the subject, and identifying the one or more features comprises determining a weight, optionally including summing the first, second and third pressure data; and classifying the one or more features comprises classifying the one or more features as a weight carrying event when the determined weight is representative of a weight that is greater than the weight of the subject.
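A minimal sketch of the two classifications above might compare the weight determined from the summed pressure data with the received weight of the subject; the calibration factor and tolerance below are illustrative assumptions, not values from the disclosure.

    def classify_by_weight(p1, p2, p3, subject_weight_kg, kg_per_unit=1.0, tolerance=0.05):
        # Sum the first, second and third pressure data and convert to a determined weight.
        determined_weight = (p1 + p2 + p3) * kg_per_unit
        if determined_weight > subject_weight_kg * (1.0 + tolerance):
            return "weight carrying event"                      # determined weight greater than subject weight
        if determined_weight >= subject_weight_kg * (1.0 - tolerance):
            return "standing event (optionally leaning)"        # determined weight about the subject weight
        return "other (e.g., partially supported)"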
[0009] In any or all of the above embodiments of the example, receiving data from a plurality of sensors may comprise: receiving first article data from a plurality of pressure sensors mounted to an article positioned on a bottom of a first foot of the subject; and receiving second article data from a plurality of pressure sensors mounted to an article positioned on a bottom of a second foot of the subject; identifying the one or more features comprises identifying temporal changes between the first article data and the second article data; and classifying the one or more features comprises classifying the one or more features as a motion event based on the identified temporal changes between the first article data and the second article data. In embodiments, classifying the one or more features may comprise classifying the one or more features as a walking event, a jogging/running event, a biking event, or a dancing event.
[00010] For example, in embodiments, identifying the one or more features may comprise identifying one or more gait parameters, wherein the one or more gait parameters includes one or more of velocity or speed, stride length, stride duration, stance time/percentage, swing time/percentage, step time, step width, step asymmetry, or ground reaction force/pressure.
[00011] In any or all of the above embodiments of the example, identifying one or more features may comprise determining differences between two or more of the first pressure data, the second pressure data, and the third pressure data; and classifying the one or more features comprises classifying the one or more features as a body position event based on the determined differences between the two or more of the first pressure data, the second pressure data, and the third pressure data. In embodiments, classifying the one or more features as a body position event may comprise classifying the one or more features as a forward leaning position, a rearward leaning position, or a sideway leaning position.
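One hedged way to realize the body position classification above is to compare the front and rear pressure readings, and the two front readings, against simple thresholds; the threshold values are assumptions rather than values from the disclosure.

    def classify_lean(p_front_inside, p_front_outside, p_rear, threshold=0.2):
        total = p_front_inside + p_front_outside + p_rear
        if total <= 0:
            return "no load detected"
        front_share = (p_front_inside + p_front_outside) / total   # fraction of pressure toward the front
        side_difference = (p_front_inside - p_front_outside) / total
        if front_share > 0.5 + threshold:
            return "forward leaning position"
        if front_share < 0.5 - threshold:
            return "rearward leaning position"
        if abs(side_difference) > threshold:
            return "sideway leaning position"
        return "substantially upright"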
[00012] In embodiments of the example, classifying the one or more features may comprise classifying the one or more features as a safety event.
[00013] In any or all of the above embodiments of the example, receiving the data may comprise receiving the data from a plurality of sensors mounted to at least one of a sock, a footwear insole, or a footwear article such as a shoe or boot. In embodiments, for example, receiving the data further comprises receiving data from one or more additional sensors, wherein the one or more additional sensors comprises one or more of an accelerometer, a PPG sensor, a light sensor, a heart sensor, a location sensor, a bioimpedance sensor, an EMG sensor, a GPS sensor, an environmental sensor, a bend sensor, or a stretch sensor, mounted to the article. For example, identifying features and/or classifying features comprises identifying and/or classifying features based upon the data received from the one or more additional sensors.
[00014] In any or all of the above embodiments of the example, processing the data may comprise processing the data by a trained model. In embodiments, for example, processing the data comprises processing the data by a trained artificial neural network. Embodiments may further comprise training the trained model using body event calibration data.
[00015] Another example comprises one or more processors configured to perform the method of any or all of the exemplary embodiments described above.
[00016] Yet another example is one or more garments such as a sock, a footwear insole, or a footwear article such as a shoe or boot, configured to provide the sensor data of any or all of the exemplary embodiments described above.
BRIEF DESCRIPTION OF THE DRAWINGS
[00017] FIG. 1 is a diagrammatic illustration of a system including a footwear-type smart garment, shown for purposes of example as a sock, and a computer system, in accordance with embodiments.
[00018] FIG. 2 illustrates exemplary pressure data provided by pressure sensors during a walking event, in accordance with embodiments.
[00019] FIG. 3 illustrates exemplary pressure data provided by pressure sensors during a jogging/running event, in accordance with embodiments.
[00020] FIG. 4 illustrates exemplary pressure data provided by pressure sensors during a bicycle riding event, in accordance with embodiments.
[00021] FIG. 5 illustrates exemplary pressure data provided by pressure sensors during walking, jogging/running and bicycle riding events, in accordance with embodiments.
[00022] FIGs. 6A-6C illustrate exemplary acceleration and pressure data provided by sensors during a walking event, in accordance with embodiments.
[00023] FIGs. 7A-7C illustrate exemplary acceleration and pressure data provided by sensors during a jogging/running event, in accordance with embodiments.
[00024] FIGs. 8A-8C illustrate exemplary acceleration and pressure data provided by sensors during a bicycle riding event, in accordance with embodiments.
[00025] FIGs. 9A-9C illustrate exemplary acceleration and pressure data provided by sensors during steps of a walking event, in accordance with embodiments.
[00026] FIG. 10 is a diagrammatic illustration of a method for training a model to identify body events, in accordance with embodiments.
[00027] FIG. 11 is a diagrammatic illustration of a subject during a standing or leaning body event, in accordance with embodiments.
[00028] FIG. 12 is a diagrammatic illustration of a subject during a dancing body event, in accordance with embodiments.
[00029] FIG. 13 is a diagrammatic illustration of a subject during a standing, weight carrying body event, in accordance with embodiments.
[00030] FIG. 14 is a diagrammatic illustration of a subject during a walking, leaning body event, in accordance with embodiments.
[00031] FIG. 15 is a diagrammatic illustration of a subject during a squatting or crouching body event, in accordance with embodiments.
[00032] FIG. 16 is a diagrammatic illustration of a subject during a bicycle riding body event, in accordance with embodiments.
[00033] FIG. 17 is a diagrammatic illustration of a subject during a sitting body event, in accordance with embodiments.
[00034] FIG. 18 is a diagrammatic illustration of a subject during a walking or jogging/running body event, in accordance with embodiments.
[00035] FIG. 19 is a diagrammatic illustration of a subject during an upright standing body event, in accordance with embodiments.
[00036] FIG. 20 is a diagrammatic illustration of a method for using a model to identify body events, in accordance with embodiments.
[00037] FIG. 21 is a detailed diagrammatic illustration of a computer system such as that shown in FIG. 1, in accordance with embodiments.
DETAILED DESCRIPTION
[00038] The disclosures of all cited patent and non-patent literature are incorporated herein by reference in their entirety.
[00039] As used herein, the term "embodiment" or "disclosure" is not meant to be limiting, but applies generally to any of the embodiments defined in the claims or described herein. These terms are used interchangeably herein.
[00040] Unless otherwise disclosed, the terms "a" and "an" as used herein are intended to encompass one or more (i.e., at least one) of a referenced feature.
[00041] The features and advantages of the present disclosure will be more readily understood by those of ordinary skill in the art from reading the following detailed description. It is to be appreciated that certain features of the disclosure, which are, for clarity, described above and below in the context of separate embodiments, may also be provided in combination in a single element. Conversely, various features of the disclosure that are, for brevity, described in the context of a single embodiment, may also be provided separately or in any sub-combination. In addition, references to the singular may also include the plural (for example, "a" and "an" may refer to one or more) unless the context specifically states otherwise.
[00042] The use of numerical values in the various ranges specified in this application, unless expressly indicated otherwise, are stated as approximations as though the minimum and maximum values within the stated ranges were both preceded by the word "about". In this manner, slight variations above and below the stated ranges can be used to achieve substantially the same results as values within the ranges. Also, the disclosure of these ranges is intended as a continuous range including each and every value between the minimum and maximum values.
Description of Various Embodiments
[00043] Persons skilled in the art will readily appreciate that various aspects of the present disclosure can be realized by any number of methods and apparatuses configured to perform the intended functions. It should also be noted that the accompanying drawing figures referred to herein are not necessarily drawn to scale but may be exaggerated to illustrate various aspects of the present disclosure, and in that regard, the drawing figures should not be construed as limiting.
[00044] FIG. 1 is a diagrammatic illustration of a system 10 for identifying and classifying body events of a subject, in accordance with embodiments. As shown, system 10 includes a footwear-type smart garment 12 coupled to a computer system 14. Smart garment 12 includes an article, shown for example as a sock 16, that is configured to be worn or otherwise positioned adjacent to a foot on the body of a person or other subject (not shown in FIG. 1), and a plurality of sensors 18 mounted or otherwise coupled to the sock or foot of the subject. Although shown as a sock 16 for purposes of example in FIG. 1, other embodiments of the smart garment 12 may include a shoe, boot, sandal or other footwear, or an insole or other member configured to be inserted into or otherwise attached to such footwear. As described in detail below, the sensors 18 generate sensor data associated with the subject wearing the sock 16. As shown in FIG. 1, sensors 18 include a plurality of pressure sensors, such as for example pressure sensors 20A, 20B and 20C (collectively referred to as pressure sensors 20). In the illustrated embodiments, sensors 18, including the pressure sensors 20, are coupled to a data transfer device 22 via transmission channels 24.
Transmission channels 24 may be wired or wireless channels. Data transfer device 22, which may be a wired or wireless data transmission device, transfers the sensor data from sensors 18, including the pressure data from the pressure sensors 20, to the computer system 14 via a network 26. In some embodiments, electrical conductors (not shown) can couple one or more of the plurality of sensors 18 to one or more others of the plurality of sensors and/or to the computer system 14. Although the data transfer device 22 is shown as a component of the sock 16 in FIG. 1, in other embodiments the data transfer device can be mounted to the subject’s body at other locations (e.g., on other elements of clothing), incorporated into the associated sensors 18, and/or can be located off of the subject.
[00045] Computer system 14 processes the sensor data, including the pressure data from pressure sensors 20, to identify and classify certain physical and/or physiologic activity of the subject wearing the smart garment 12. Embodiments of the computer system 14 can determine certain body events of the subject. Examples of such body events include one or more of a standing event, a weight carrying event, a motion event, a body position event, or a safety event. Standing events may, for example, include leaning (e.g., forward, rearward or sideway) events. Motion events may, for example, include walking, jogging/running, bicycle riding or dancing events. Body position events may, for example, include sitting, leaning, crouching or squatting events. Safety events may include carrying excessive amounts of weight and/or possibly harmful body positions. Body events may also include several different body events (e.g., leaning while standing).
[00046] As described in greater detail below, in embodiments the body events can be determined by identifying features associated with the body of the subject based upon the data from sensors 18, including the pressure data from pressure sensors 20, and classifying those features as one or more different types of body events. In some embodiments the computer system 14 processes the sensor data using trained machine learning models to identify and/or classify the body events of the subject. In other embodiments, the computer system 14 compares the sensor data to stored body event data representative of the body events to identify and classify, or otherwise determine the body events. Sensor data provided by the smart garment 12 can also be used by the computer system 14 for certain set-up operations, such as to generate calibration and other data used to train the models and/or to generate the stored body event data used to determine the body events. Calibration data and trained models of these types effectively provide digital representations or models of the associated body events or other physical or physiological activities of the subject.
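By way of a hedged illustration of the stored-body-event-data approach mentioned above, features computed from incoming sensor data could be compared against stored feature vectors for each body event type and the closest match returned; the stored values, feature layout and distance threshold below are placeholders, not values from the disclosure.

    import numpy as np

    STORED_EVENT_FEATURES = {
        "standing event":        np.array([1.00, 0.02, 0.0]),
        "walking event":         np.array([0.95, 0.30, 0.1]),
        "weight carrying event": np.array([1.30, 0.05, 0.0]),
    }

    def match_body_event(feature_vector, max_distance=0.25):
        # Compare the incoming features to each stored body event feature set.
        distances = {name: float(np.linalg.norm(np.asarray(feature_vector) - ref))
                     for name, ref in STORED_EVENT_FEATURES.items()}
        best = min(distances, key=distances.get)
        return best if distances[best] <= max_distance else "undetermined body event"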
[00047] The sock 16 has portions located adjacent to a number of different portions or zones of the foot of the subject. Sensor 20A, for example, is located on the inside (e.g., right side of a left sock) and front portion of the sock 16 to provide pressure data on the inside of the front or ball of the subject’s foot. Sensor 20B is shown located on the outside (e.g., left side of a left sock) and front portion of the sock 16 to provide pressure data on the outside of the front or ball of the subject’s foot. Sensor 20C is shown located on the rear portion of the sock 16 to provide pressure data on the rear or heel of the subject’s foot. Embodiments of the sock 16 may include additional sensors 18. The embodiments shown in FIG. 1 , for example, include motion sensors 30A and 30B, a galvanic sensor 32, ambient condition sensor 34, electrocardiogram (ECG or EKG) heart sensor 36, photoplethysmography (PPG) sensor 38, impedance sensor 40 and a stretch sensor 42. Other embodiments include alternative or additional sensors, such as for example an implantable sensor, a light sensor, a location (e.g., GPS) sensor, an electromyography (EMG) sensor, and/or a bend sensor. The sensors 18 may be positioned on the inside of the sock 16 so as to contact the skin of the subject, on an outside of the sock, and/or embedded in the material or other substrate of the sock.
[00048] Pressure sensors 20, as noted above, are located to provide data representative of the pressure applied to different regions of the subject’s foot. Motion sensors 30A and 30B (collectively referred to as motion sensors 30) which may include Inertial Measurement Units (IMUs) that may have an accelerometer, a gyroscope and/or a magnetometer, may provide motion data or information such as acceleration data, x-axis, y-axis, and/or z-axis movement data and direction or heading data. In the illustrated embodiments, motion sensor 30A is positioned below the subject’s ankle, and motion sensor 30B is positioned above the ankle. The galvanic sensor 32 may provide data representative of skin conditions. Ambient condition sensor 34 may provide data representative of conditions such as temperature or humidity. Electrocardiogram sensor 36 may provide data regarding the electrical performance or characteristics of the subject’s heart. Photoplethysmography sensor 38 may provide data representative of physiologic characteristics such as blood, heart rate, blood oxygenation and blood pressure. Impedance sensor 40 may provide information representative of the bioimpedance of the subject’s skin or other tissue. Stretch sensor 42, which is shown for example adjacent the heel and lower back portions of the sock 16, may provide data representative of a change in length or bend angle. The sensors 18 may be commercially available or otherwise known devices.
[00049] Although only one article such as sock 16 (a left foot sock) is shown in FIG. 1, other embodiments include a second article or smart garment such as smart garment 12, e.g., a right foot sock (not shown), coupled to the computer system 14 by the network 26. Such a second sock or other article can for example be substantially the same as or similar to sock 16 and include sensors such as pressure sensors 20 configured to provide data or information representative of pressures at various foot locations including the inside of the front or ball, the outside of the front or ball, and the rear or heel of the associated foot. Sensor data or information provided by such a second smart garment may enhance the efficiency and accuracy of body event determinations.
[00050] Although shown at particular locations on the sock 16 and corresponding locations on the body of the subject, sensors 18 are positioned at other locations of the sock or body of the subject in other embodiments. For example, the locations of the sensors 18 may be determined based on factors such as optimization of signal strength provided, relevance of sensor data to the events, such as body events desired to be identified and classified, and comfort and fit with respect to the sock 16. Although one sensor of many of the various different types, or more than one in the case of pressure sensors 20 and motion sensors 30, are shown for purposes of example, other embodiments include more or fewer sensors 18. By way of example, sensors 18 may be incorporated onto the sock 16 by approaches and structures such as pockets, adhesive, sewing and/or hook and loop fasteners. In embodiments, for example, the sensors 18 can be incorporated into a sensor harness such as that described in copending U.S. provisional application no. 63/442,886 filed on February 2, 2023, and entitled Electronics Harness for Smart Garments, or co-pending U.S. application no. 17/940,507, filed on September s, 2022, and entitled Garment including Electronic Devices, both of which are incorporated herein by reference in their entirety and for all purposes.
[00051] In embodiments, one or more, or all of the sensors 18 are wireless devices configured to communicate their associated sensor data to the data transfer device 22 on the sock 16 via the communication channels 24. As an example, embodiments of systems in accordance with this disclosure may include a 0.5 inch diameter FSR 400 series force sensor available from Interlink Electronics (e.g., powered by a 3.3v supply potential). A voltage divider circuit may be incorporated between a microprocessor and the sensor so an optimal voltage may be determined by an analog input of the microprocessor. Laminated pressure/force sensors that can be incorporated into systems of these types are generally known and commercially available. Devices of these types may, for example, change resistance based upon changes in shape and/or pressure. Additionally and alternatively, capacitive type force/pressure sensors, which are also known and commercially available, can be incorporated into systems in accordance with this disclosure.
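As a purely illustrative calculation for the voltage divider arrangement mentioned above (assuming the fixed divider resistor is on the low side of the divider, a 10 kOhm fixed resistor, and a 12-bit analog input, none of which are specified by the disclosure), the force sensor resistance can be recovered from the sampled voltage as follows.

    def fsr_resistance_ohms(adc_count, adc_max=4095, vcc=3.3, r_fixed=10_000.0):
        # Convert the raw analog reading to the voltage at the divider midpoint.
        v_out = vcc * adc_count / adc_max
        if v_out <= 0:
            return float("inf")                      # no applied pressure: sensor resistance is very high
        # Solve Vout = Vcc * R_fixed / (R_fixed + R_fsr) for the sensor resistance.
        return r_fixed * (vcc - v_out) / v_out

    # Example: a mid-scale reading of 2048 counts corresponds to roughly 10 kOhm.
    print(round(fsr_resistance_ohms(2048)))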
[00052] Additionally, or alternatively, the data transfer device 22 may be in close proximity to the sensors 18, such as for example a mobile phone or device. In embodiments, one or more, or all of the sensors 18 are wireless devices configured to communicate the associated sensor data directly to the computer system 14 via the communication channels 24. In embodiments of smart garment 12, the data transfer device 22 may include an electronic component configured to be coupled to one or more, or all of the sensors 18, for example by a releasable connector plug (not shown). Such an electronic component may be configured to be coupled to the sensors 18 so as to facilitate electrical and mechanical connection of the electronic component and the disconnection of the electronic component from the sensors 18. The electronic component may for example include a wireless transmitter to transmit sensor data from the sensors 18 to the computer system 14 via the network 26. Alternatively, or additionally, the electronic component may include electronic memory that stores the sensor data from the sensors 18, for download or other transfer to the computer system 14. Exemplary connector plugs and electronic components of these types are disclosed, for example, in the above identified U.S. provisional application no. 63/442,886 that is incorporated herein.
[00053] Data transfer device 22 may also transfer data from the computer system 14 to one or more, or all of the sensors 18 in embodiments. In embodiments, the data transfer device 22 may include an electronics module comprising processor, battery, antenna and/or memory, and may be configured to provide all or portions of the processing functionality of the computer system 14 (e.g., to perform the methods 100 and 300 described below). The electronic components may be housed in an enclosure that is waterproof, and releasably attached to the sock 16 by, for example, one or more pockets, hook and loop patches, adhesive or fasteners. The above-identified U.S. provisional application no. 63/442,886 that is incorporated herein, for example, discloses structures and approaches, including pockets on bands, for releasably retaining data transfer devices such as 22 on the sock 16. Advantages of releasably retaining all or portions of the data transfer device 22 on the sock 16 include wash isolation and reuse of the sock.
[00054] Although sensors 18 are described above as being mounted to, located on or otherwise configured for use in connection with the sock 16, other embodiments of system 10 include sensors that are configured for use with other structures. For example, auxiliary sensors may be mounted to or located on a removable ankle band or leg band, or pants worn by or positioned on the subject (not shown).
[00055] FIG. 2 illustrates exemplary pressure data or information provided by pressure sensors such as 20A, 20B and 20C on a foot of a subject (e.g., at locations corresponding to front inside, front outside and heel) while the subject is walking. The data of FIG. 2, for example, may be representative of a normal walking pace over some uneven ground. FIG. 3 illustrates exemplary pressure data or information provided by pressure sensors such as 20A, 20B and 20C on a foot of a subject while the subject is running/jogging. The data of FIG. 3 may be representative of a normal or typical jogging pace on pavement, on level, uphill and downhill slopes. FIG. 4 illustrates exemplary pressure data or information provided by pressure sensors such as 20A, 20B and 20C on a foot of a subject while the subject is riding a bicycle. FIG. 5 illustrates exemplary pressure data or information provided by pressure sensors such as 20A, 20B and 20C on a foot of a subject while the subject is walking, running/jogging and riding a bicycle. As is evident from FIGs. 2-5, during each of the walking, running/jogging and bicycle riding events, the pressure data from sensors 20A, 20B and 20C is distinct and different from one another. As is also evident from FIGs. 2-5, the pressure data from the sensors 20 is also distinct and different across the walking, running/jogging and bicycle riding events (e.g., the pressure data during the running/jogging event and the bicycle riding event are different than the pressure data during the walking events).
[00056] FIGs. 6A-6C illustrate exemplary and corresponding ankle acceleration data from motion sensors such as 30A and/or 30B, and pressure data from pressure sensors 20A, 20B and/or 20C, during a walking event. FIG. 6A represents the ankle accelerations by axis (without particular orientation of the sensors). FIG. 6B represents the combination or average, such as sum of squares, of the ankle accelerations shown in FIG. 6A. FIG. 6C represents the corresponding pressure data from each of the sensors 20A, 20B and 20C. The data of FIGs. 6A-6C, for example, may be representative of normal walking pace over some uneven ground. Among other information, a clear signal for steps, and flat foot landings, are evident from the data shown in FIGs. 6A-6C.
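A small sketch of the "combination or average, such as sum of squares" referenced above might combine the per-axis accelerations into a single orientation-independent magnitude (here taken as the root of the sum of squares); the sample values are illustrative only.

    import numpy as np

    def acceleration_magnitude(ax, ay, az):
        # Combine the per-axis ankle accelerations into one orientation-independent signal.
        return np.sqrt(np.square(ax) + np.square(ay) + np.square(az))

    ax = np.array([0.1, 1.2])
    ay = np.array([9.8, 8.5])
    az = np.array([0.2, 3.1])
    print(acceleration_magnitude(ax, ay, az))        # one magnitude value per sample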
[00057] FIGs. 7A-7C illustrate exemplary and corresponding ankle acceleration data from motion sensors such as 30A and/or 30B, and pressure data from pressure sensors 20A, 20B and/or 20C, during a running/jogging event. FIG. 7A represents the ankle accelerations by axis (without particular orientation of the sensors). FIG. 7B represents the combination or average, such as sum of squares, of the ankle accelerations shown in FIG. 7A. FIG. 7C represents the corresponding pressure data from each of the sensors 20A, 20B and 20C. The data of FIGs. 7A-7C, for example, may be representative of a normal or typical jogging pace on pavement, on level, uphill and downhill slopes. Among other information, a clear signal for steps is evident from the data shown in FIGs. 7A-7C.
[00058] FIGs. 8A-8C illustrate exemplary and corresponding ankle acceleration data from motion sensors such as 30A and/or 30B, and pressure data from pressure sensors 20A, 20B and/or 20C, during a biking event. FIG. 8A represents the ankle accelerations by axis (without particular orientation of the sensors). FIG. 8B represents the combination or average, such as sum of squares, of the ankle accelerations shown in FIG. 8A. FIG. 8C represents the corresponding pressure data from each of the sensors 20A, 20B and 20C. The data of FIGs. 8A-8C, for example, may be representative of a relatively slow pace or effort. Among other information, a clear signal for cadence is evident from the data shown in FIGs. 8A-8C.
[00059] FIGs. 9A-9C illustrate detailed exemplary and corresponding ankle acceleration data from motion sensors such as 30A and/or 30B, and pressure data from pressure sensors 20A, 20B and/or 20C, during several steps of a running/jogging event. FIG. 9A represents the ankle accelerations by axis (without particular orientation of the sensors). FIG. 9B represents the combination or average, such as sum of squares, of the ankle accelerations shown in FIG. 9A. FIG. 9C represents the corresponding pressure data from each of the sensors 20A, 20B and 20C. Among other information, a clear signal for landings on the outside of the foot is evident from the data shown in FIGs. 9A-9C.
[00060] Certain gait or stride parameters/positions can be determined and used in connection with the system and method described herein. These gait parameters may include one or more of velocity, step counting, stride length, stride duration, stance time/percentage, swing time/percentage, step time, step width, step asymmetry, and/or ground reaction force/pressure. A subject’s stride can, for example, be characterized by a sequence of positions of one of their legs and feet during a gait cycle, where those positions include, in sequence, a heelstrike, footflat, midstance, pushoff, acceleration, midswing and deceleration. The gait cycle of that foot can be defined as the distance between the locations of the heel at two sequential heelstrike positions. Step length can be defined as the distance between the locations of the heels of the left and right feet at sequential heelstrike positions (e.g., approximately one-half of a gait cycle). Step angle of each foot can be defined as the angle of a midline through the subject’s foot (e.g., from toes to heel) with respect to a centerline of the path they are moving along. Step width for each of the feet can be defined as the distance between the centerline of the path, and a location on the midline of their foot (e.g., at the toes).
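As a hedged sketch of how some of the gait parameters above might be estimated from the rear (heel) pressure signal alone, heelstrikes can be detected as upward threshold crossings, and stride duration taken as the interval between successive heelstrikes of the same foot; the threshold and sampling rate below are assumptions.

    import numpy as np

    def heelstrike_times(heel_pressure, sample_rate_hz=100.0, threshold=5.0):
        p = np.asarray(heel_pressure, dtype=float)
        rising = (p[1:] >= threshold) & (p[:-1] < threshold)       # upward threshold crossings
        return (np.flatnonzero(rising) + 1) / sample_rate_hz       # heelstrike times in seconds

    def stride_parameters(heel_pressure, sample_rate_hz=100.0, threshold=5.0):
        times = heelstrike_times(heel_pressure, sample_rate_hz, threshold)
        if len(times) < 2:
            return None
        stride_durations = np.diff(times)                           # seconds per gait cycle of this foot
        return {"stride_duration_s": float(stride_durations.mean()),
                "strides_per_min": float(60.0 / stride_durations.mean())}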
[00061] Embodiments of computer system 14 use one or more models to identify features of body events associated with sensor data provided by the sensors 18 of the subject wearing the smart garment 12, and to classify the identified features as representing one or more event types such as particular types of body events. In some examples, one or more of the models are machine learning models. In some examples, one or more of the models are artificial neural networks. Alternatively, or additionally, in some examples, one or more of the models are statistical models. The models may effectively be digital representations of the body events based upon sensor values that correspond to or characterize the body events. Sensor data received from the sensors 18 is applied to and/or compared to the models to identify and classify features as being associated with one or more body events. In embodiments, for example, computer system 14 includes models characterizing body events and associated body event features for (1) a standing event, (2) a weight carrying event, (3) a motion event, (4) a body position event, and/or (5) a safety event.
[00062] In embodiments, computer system 14 uses body event features in connection with the models to identify and classify the body events. Body event features may be portions of the sensor data from one or more of the sensors 18 that effectively define particular types or aspects of body events. For example, the magnitudes or intensities of the sensor data may define different body events. The relative timing of activity defined by the sensor data, either alone (e.g., from one sensor), or in combination with activity or lack of activity defined by other sensors, may define different features of body events. In embodiments, one or more of the body event features may be defined by the activities of a plurality of the sensors 18, including for example pressure sensors 20.
[00063] FIG. 10 is a diagrammatic illustration of a method 100 for training a machine learning model to identify body events. The model uses body event features to classify and describe certain body events in accordance with embodiments. As shown, the method 100 includes process 102 for collecting sets of calibration or training data, process 104 to filter the sets of training data, process 106 for contextualizing the sets of training data, process 108 for analyzing the sets of training data to determine body event features, process 110 for providing the sets of training data and features to a machine learning model for training, process 112 for generating predicted body events, process 114 for comparing the predicted body events with the stored, actual, and/or labeled body event associated with the training data, process 116 for adjusting parameters of the machine learning model, and process 118 for determining whether training of the machine learning model has been completed. Although FIG. 10 illustrates a selected group of processes for the method 100, there can be many alternatives, modifications and variations. For example, some of the processes may be expanded and/or combined. Other processes may be inserted. Some of the processes may be optional and need not be performed. Depending upon the embodiment, the sequence of the processes may be interchanged, and some processes may be replaced with others. For example, some or all of the processes 102, 104, 106, 108, 110, 112, 114, 116 and 118 are performed by a computing device or a processor directed by instructions stored in memory. As an example, some or all processes of the method 100 are performed according to instructions stored in a non-transitory computer-readable medium. It should be appreciated that while processes 108 and 110 are presented in an order suited to training classical supervised machine learning model types, such as KNN or decision trees, the order may be switched for an artificial neural network or deep learning approach. While process 108 determines body event features as variables in classical models, in the case of a convolutional neural network, the contextual data may be directly presented to the classifier for it to determine features through hidden or pooling layers.
[00064] At the process 102, one or more sets of calibration or training data for each of the one or more body events is collected according to some embodiments. For example, each of the sets of training data includes sensor data relating to body event features of the associated body events, including data from pressure sensors 20, and knowledge of the types of those body events (e.g., the training data sets are labeled with the associated type of body event). The one or more sets of training data can be stored in a memory element as stored body event data for later comparison with a body event of a subject.
[00065] The calibration or training data collected at process 102 may be sensor data from the sensors 18 when the sensors are at one or more known states or body poses or positions corresponding to the body events. For example, the known states or body poses may be static states corresponding to predetermined positions and associated physiologic data of the body of the subject wearing one or more socks 16 during the body events. Alternatively, or additionally, the known states may be dynamic or time-changing states corresponding to movement of the body of the subject and associated physiologic data during the associated body events.
[00066] For example, a subject’s left and right foot at each of a plurality of gait or stride stages through a walking event can be identified. In particular, the right foot heelstrike, flatfoot, midstance, pushoff, acceleration, midswing and deceleration stages are identified. Sensor data from sensors 18, including data from each of pressure sensors 20A, 20B and 20C from one or two socks 16 can be determined at these and other stages of the walking event (e.g., a type of motion event) for a subject, and used as training data.
[00067] Sensor data collected from the sensors 18 when the subject is at gait or stride positions such as those described above may be considered baseline position sensor data, and may be used for calibration or reference. In embodiments, training data may also be collected from the sensors 18 when the subject is in other static positions or poses, such as for example (1) when they are standing upright, and optionally not moving (e.g., a standing event), (2) when they are holding or otherwise carrying an item (a weight carrying event), (3) when they are jogging/running or riding a bicycle (other examples of motion events), (4) when they are leaning (e.g., at their waist), such as for example forward, backward or sideways, or sitting or lying (body position events), and/or (5) combinations of one or more of (1) - (4), such as for example a weight carrying event simultaneously with a body position event (safety events). At process 102, training data can also be collected from the sensors 18 when the subject is at a plurality of static positions or poses for each of the body events that the system 10 is configured to identify and classify. For example, for each body event, the calibration data can be collected from the sensors 18 when the subject is at two or more of a sequence or series of positions or poses they typically have during the associated body events (e.g., the body event is represented by a sequence of static poses corresponding to poses of an actual body event). As another example, gait or stride parameter characteristics such as those described above can be determined from one or more of the socks 16 of a subject, and used to identify features of the subject’s body events.
[00068] The training data collected during the process 102 may be used by the computer system 14 to characterize or “understand” the orientation of sensors 18 with respect to unique geometry of the subject or other subject providing the training or calibration information. When used for calibration purposes, for example, the training data may be used to adjust or compensate for particular or unique fits of the sock 16 on the subject. For applications where the absolute position of the subject is used, the subject may orientate during the process 102 in a known direction, such as for example north. Accuracy of the training data generated during step 102 may be enhanced, for example to compensate for drift in the sensors 18, if the subject periodically recalculates to the known orientation.
[00069] Training data collected during the process 102 may also include the weight of the subject. Certain body events, such as for example weight carrying events and safety events, can be determined when the sensors 18, including the pressure sensors 20, provide information used by the computer system 14 to determine that the weight represented by the pressure data is greater than the weight of the subject.
[00070] By way of an example in connection with the gait or stride positions described above, training data can be collected from one or more of the pressure sensors 20, including pressure sensors 20A-20C, when the subject is at a first stage such as the completion of a walking event gait cycle, at the time of a right foot deceleration. Training data can be collected from one or more of the pressure sensors 20, including pressure sensors 20A-20C, during a subsequent stage or stages of the walking event, such as at the time of the left foot heelstrike, flatfoot, midstance, pushoff, acceleration and midswing. Sensor data from the sensors 18, including pressure data from the pressure sensors 20, may differ, for example in magnitude and timing sequence, for each of the different types of motion events. For example, running/jogging motion events and biking events may be distinguishable by different sensor data from pressure sensors 20. As another example, pressure data from the pressure sensors 20 can be collected when the subject is instructed to perform each of one or more of the different types of body events.
[00071] FIGs. 11-19 illustrate a subject in a variety of different body events. FIG. 11 , for example, shows the subject in a body position event such as a leaning event. The pressure data collected by pressure sensors 20 on each of the two socks 16 may be different because the subject is leaning more heavily on one foot. The pressure data collected by the pressure sensors 20 on each sock 16 may also be different because the subject may not be applying evenly distributed weight across either foot.
[00072] FIG. 12, for example, shows the subject in a motion event such as dancing. The dynamic pressure data collected by the pressure sensors 20 on each of the two socks will be different, and will likely vary in a somewhat cyclic time-varying manner. Similarly, the pressure data collected by the pressure sensors 20 on each sock 16 may vary in a somewhat cyclic time-varying manner.
[00073] FIG. 13, for example, shows the subject in a standing, weight carrying event. The pressure data collected by the pressure sensors 20 on each of the two socks may be generally the same since the subject may have its weight relatively evenly distributed across its two feet. The pressure data collected by the pressure sensors 20 on each sock 16 may for similar reasons be distributed in a manner representative of an upright and standing weight distribution on each associated foot. The weight carrying nature of the event can be determined by the weight measured by the pressure sensors 20 as being greater than the known weight of the subject. In some examples, the forward-rearward weight distribution and/or side-to-side weight distribution with respect to centered or standing weight distribution, as determined from the pressure sensor data, may indicate that the weight being carried by the subject is in front, behind, or to the side of the subject (e.g., forward leaning, rearward leaning or sideway leaning weight carrying events).
[00074] Additionally and alternatively, the subject’s weight may not be relatively evenly distributed across its two feet. Garments such as socks and systems in accordance with this disclosure may be beneficial under these circumstances. For example, weight can be determined by summing information from all the sensors on each foot, and then summing the information from both feet to correlate to a predetermined starting weight. Calibration may include the use of a known weight, and then standing on each foot independently. The predetermined or known weight may be determined by the calibration.
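A minimal sketch of the calibration and weight determination described above, assuming the subject stands still with a known weight during calibration, might look like the following; the helper names and units are illustrative.

    def calibrate_scale(left_pressures, right_pressures, known_weight_kg):
        # Fit a single scale factor (kg per raw pressure unit) while standing with a known weight.
        raw_total = sum(left_pressures) + sum(right_pressures)
        return known_weight_kg / raw_total

    def estimated_carried_weight(left_pressures, right_pressures, scale, subject_weight_kg):
        # Sum the sensors on each foot, sum both feet, and subtract the subject's own weight.
        measured_kg = (sum(left_pressures) + sum(right_pressures)) * scale
        return max(0.0, measured_kg - subject_weight_kg)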
[00075] Garments such as socks and systems in accordance with this disclosure can also be configured to determine the degree or amount to which the subject is carrying weight in front, behind, to the side, and/or close to their body. Similarly, whether the subject’s arms are extended may also be determined. As an example, the garment and system can determine a situation where a two hundred pound subject is carrying a twenty five pound weight with their arms fully extended. Per the description below, information of this type can be used to provide safety and/or risk or injury assessments associated with activities of the subject.
[00076] Safety events may also be identified from data collected by a subject in a standing, weight carrying event such as that shown and described above in connection with FIG. 13. For example, risk scores can be stored by the computer system 14. The risk scores, which may represent a range of severity or other materiality criterion related to safety, can be characterized by the amount of weight being carried (e.g., with respect to the weight and/or other characteristics such as age and gender of the subject), and/or the amount of leaning (e.g., in one or more of the forward, rearward or sideway directions) as determined by the distribution of weight on the subject’s foot or feet from the pressure sensors 20.
[00077] FIG. 14, for example, shows the subject in a combination walking and leaning event, while walking with the help of a walking aid, for example, a walker or a cane. Characteristics of pressure data from pressure sensors 20 during a walking event such as those described above in combination with pressure data characteristics of a leaning event such as those described above can be used to identify the walking and leaning event. The use of the aid, such as the walker, by the subject may be determined from the feature that the sum of the weight determined from the pressure sensor data is less than the known weight of the subject, since the subject is partially supporting their weight on the walker.
[00078] A walking event with the subject using a cane can be determined in a similar manner. When using a cane as an aid, the characteristics of the pressure sensor data representative of the walking event will be asymmetrical and cyclic, since the subject may have the cane in only one hand, resulting in the cane supporting a portion of the weight on the foot on the side of the subject holding the cane.
[00079] FIG. 15, for example, shows the subject in a squatting or crouching body position event. Pressure sensors 20 may collect relatively static data representative of a somewhat uneven weight distribution between the two feet, with the weight asymmetrically distributed on each foot.
[00080] FIG. 16, for example, shows the subject in a bicycle riding motion event. Sensor data from pressure sensors 20 representative of the bike riding event is described above in connection with FIGs. 8A-8C. Among other characteristics, the sensor data provided by pressure sensors 20 will be cyclic, and may be representative of higher pressures toward the front of the foot if the rider places the front of their foot on the bicycle pedals.
[00081] FIG. 17, for example, shows the subject in a sitting body position event. Sensor data from the pressure sensors 20 representative of the sitting event may be relatively static and representative of a weight less than the weight of the subject, and in some examples substantially less than the weight of the subject, since the subject is sitting and the chair is supporting their weight.
[00082] FIG. 18, for example, shows the subject in a walking or jogging/running motion event. Sensor data from pressure sensors 20 representative of the walking and jogging/running events is described above in connection with FIGs. 6A-6C and 7A-7C. Among other characteristics, the sensor data provided by pressure sensors 20 will be cyclic, and may be representative of back-to-front shifting pressures on each foot.
[00083] FIG. 19, for example, shows the subject in a standing body position event. Sensor data from pressure sensors 20 representative of the standing event may be generally static, and relatively evenly distributed on each foot and between the two feet.
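For illustration, one simple way to separate the relatively static body position events (standing, sitting, squatting) from the cyclic motion events (walking, jogging/running, biking) described above is to examine the variability of the summed pressure signal; the threshold below is an assumption, not a value from the disclosure.

```python
# Minimal sketch distinguishing "static" from "cyclic" events using the
# coefficient of variation of the summed pressure signal over a window.
import numpy as np

def is_cyclic(pressure_sum: np.ndarray, threshold: float = 0.15) -> bool:
    """Return True when the signal varies enough to suggest a motion event."""
    mean = pressure_sum.mean()
    if mean == 0:
        return False
    return pressure_sum.std() / mean > threshold
```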
[00084] Calibration and/or training data may be collected from sensors 18 at process 102 when the sensors are in dynamic states. For example, a subject wearing the sock 16 can move through a range of positions corresponding to any one or more of the body events that the computer system 14 is configured to classify, and the sensor data collected from the sensors 18 through that range of positions during the motion can be used to characterize the body event. Training data for any or all of the body events that the computer system 14 is configured to identify and classify can be collected in the manners described above by process 102.
[00085] Referring back to FIG. 10, at the process 104, all or portions of the training data may be filtered. Conventional or otherwise known filtering approaches can be performed at process 104. As examples, outlying data can be removed, and/or smoothing functions can be applied (e.g., to dynamic state data). Other non-limiting examples of filtering that may be performed at process 104 include noise removal and/or band-pass filtering. Process 104 is optional, and in embodiments only some of the training data (for example, certain types), or none of the training data, is filtered at process 104.
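A minimal sketch of the kinds of filtering named above (outlier removal, smoothing, and band-pass filtering); the window sizes and cutoff frequencies are illustrative assumptions.

```python
# Illustrative filtering helpers for process 104; parameter values are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, medfilt

def remove_outliers(x: np.ndarray, k: float = 3.0) -> np.ndarray:
    """Clip samples more than k standard deviations from the mean."""
    mu, sigma = x.mean(), x.std()
    return np.clip(x, mu - k * sigma, mu + k * sigma)

def smooth(x: np.ndarray, kernel: int = 5) -> np.ndarray:
    """Median-filter smoothing, e.g., for dynamic-state data."""
    return medfilt(x, kernel_size=kernel)

def band_pass(x: np.ndarray, fs: float, low: float = 0.5, high: float = 10.0) -> np.ndarray:
    """Band-pass filter retaining an assumed range of gait frequencies."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x)
```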
[00086] At process 106, all or portions of the training data may be contextualized or converted into context data. Contextualizing the data at process 106 may include converting all or portions of the training data into other forms that may be useful in connection with the method 100. As an example, the training data may be converted to more human-relatable values at process 106. As another example, training data in the form of acceleration data, or acceleration data and gyroscope data, may be converted into joint angles by approaches such as Kalman filter calculations. Process 106 is optional, and in embodiments only some of the training data (for example, certain types), or none of the training data, is contextualized at process 106.
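The disclosure names Kalman calculations for this conversion; the simpler complementary filter sketched below is a stand-in that illustrates the same kind of contextualization of accelerometer and gyroscope data into a joint or tilt angle. The sample interval and blend factor are assumptions.

```python
# Illustrative stand-in for the accelerometer/gyroscope-to-angle conversion;
# a complementary filter is shown instead of a full Kalman filter.
import math

def tilt_angles(accel_xyz, gyro_rate_deg_s, dt: float = 0.01, alpha: float = 0.98):
    """Fuse accelerometer tilt with integrated gyroscope rate (one axis)."""
    angle = 0.0
    angles = []
    for (ax, ay, az), gyro in zip(accel_xyz, gyro_rate_deg_s):
        accel_angle = math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))
        angle = alpha * (angle + gyro * dt) + (1.0 - alpha) * accel_angle
        angles.append(angle)
    return angles
```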
[00087] At the process 108, the contextualized training data is evaluated to determine one or more features associated with the body event. By way of example, the magnitude of the pressure data from one or more of pressure sensors 20A-20C may be a body event feature for use with a classical or deep-learning machine learning model. In addition, the rate of change of the pressure data provided by pressure sensors 20A-20C of one sock 16, or between the pressure sensors of two socks, may be a body event feature. Process 108 yields variables likely to predict classifications of body events by the one or more machine learning models. According to certain embodiments, the one or more body event features indicate or define the type of the body event. For example, identified body event features associated with the pressure sensors 20 may represent any of the body events described above. Body event features associated with other sensors 18, alone or in combination with those associated with the pressure sensors 20, may also be identified for body events of these types at the process 110.
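An illustrative sketch of feature extraction of the kind described for process 108, computing per-sensor magnitude and rate-of-change features over a window of pressure data; the feature names are assumptions.

```python
# Illustrative feature extraction; feature names are assumptions.
import numpy as np

def extract_features(window: np.ndarray, fs: float) -> dict:
    """window: samples x sensors array of pressure readings for one segment."""
    rate = np.diff(window, axis=0) * fs          # per-sensor rate of change
    return {
        "mean_pressure": window.mean(axis=0),
        "peak_pressure": window.max(axis=0),
        "pressure_range": window.max(axis=0) - window.min(axis=0),
        "mean_rate_of_change": np.abs(rate).mean(axis=0),
    }
```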
[00088] At the process 110, one or more of the sets of training data are provided to train the machine learning model. In examples where the training data was filtered by process 104 and/or contextualized by process 106, the filtered and/or contextualized training data may be provided to the machine learning model at process 110. Labels or other information identifying the type of body event associated with the training data are also provided to the machine learning model at process 110. As an example, the machine learning model may be a decision tree network, a logistic regression classifier, an artificial neural network, a convolutional neural network, a recurrent neural network, a modular neural network, or any other suitable type of machine learning model. The training data of each set may be analyzed by the machine learning model to determine one or more features associated with the body event at process 110.
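A minimal, purely illustrative sketch of process 110 using a logistic regression classifier (one of the model types named above); the feature matrix and labels below are random placeholders standing in for real, labeled training data.

```python
# Placeholder training sketch; the data here is synthetic, not from the disclosure.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))          # 200 examples x 8 assumed event features
y = rng.integers(0, 3, size=200)       # 3 hypothetical body event classes

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)            # labels provided alongside the features
print("held-out accuracy:", model.score(X_test, y_test))
```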
[00089] At the process 112, a predicted body event is generated by the machine learning model based upon the event features identified at process 110. For example, in generating the predicted event, one or more parameters related to the one or more event features are calculated by the machine learning model (e.g., weight values associated with various layers of connections). In connection with the embodiments described above, the predicted body event generated at the process 112 may be one of the body events described above. Alternatively, an undetermined type of body event, or no body event, may be the predicted body event generated at the process 112.
[00090] At the process 114, the predicted body event is compared to the actual type of body event to determine an accuracy of the predicted body event. In some embodiments, the accuracy is determined by using a loss function or a cost function of the set of training data.
[00091] At the process 116, based upon the comparison, the one or more parameters related to the body event features are adjusted by the machine learning model. For example, the one or more parameters may be adjusted to reduce (e.g., minimize) the loss function or the cost function.
[00092] At the process 118, a determination is made as to whether the training has been completed. For example, training for one set of training data may be completed when the loss function or the cost function for the set of training data is sufficiently reduced (e.g., minimized). As an example, training of the machine learning model is completed when training yields acceptable accuracy between predicted and known labels for one or more datasets. In embodiments, if the process 118 determines that training of the machine learning model is not yet completed, then the method 100 returns to the process 108 in an iterative manner until the training is determined to be completed. In embodiments, if the process 118 determines that training of the machine learning model is completed, then the method 100 stops, as indicated at process 120. The trained machine learning model effectively embodies knowledge of which body event features are desirable or useful for identifying and classifying body events.
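The predict/compare/adjust/check cycle of processes 112-118 might be sketched as the following illustrative training loop; the network size, learning rate, and stopping threshold are assumptions.

```python
# Illustrative loop for processes 112-118; placeholder data and hyperparameters.
import torch
import torch.nn as nn

n_features, n_classes = 8, 3
X = torch.randn(200, n_features)              # placeholder training features
y = torch.randint(0, n_classes, (200,))       # placeholder body event labels

model = nn.Sequential(nn.Linear(n_features, 32), nn.ReLU(), nn.Linear(32, n_classes))
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(500):
    predictions = model(X)                    # process 112: predicted events
    loss = loss_fn(predictions, y)            # process 114: compare to labels
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()                          # process 116: adjust parameters
    if loss.item() < 0.05:                    # process 118: training complete?
        break
```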
[00093] Although the method 100 described above trains one machine learning model for purposes of identifying a plurality of different types of body events, in other embodiments each of one or more machine learning models may be trained for purposes of identifying one, or more than one but fewer than all, of the plurality of different types of body events. Stated differently, machine learning models may be trained by the method 100 for purposes of identifying and classifying one, or more than one, type of body event.
[00094] FIG. 20 is a diagrammatic illustration of an exemplary method 300 for identifying and classifying, or determining, body events in accordance with embodiments. As shown, the method 300 includes process 302 for providing the sensor data to one or more models, process 304 for processing the sensor data by the model, and process 306 for determining or identifying a body event by the one or more models. In some embodiments, the model is a statistical model. Alternatively or additionally, one or more, or all, of the models may be a machine learning model, such as an artificial neural network trained in accordance with the method 100 described above in connection with FIG. 10. Although the method 300 is described as using a selected group of processes, there can be many alternatives, modifications, and variations. For example, some of the processes may be expanded and/or combined, other processes may be inserted, and, depending upon the embodiment, the sequence of processes may be interchanged or some processes may be replaced with others. In some examples, some or all processes of the method 300 are performed by a computing device or a processor directed by instructions stored in memory. As an example, some or all processes of the method are performed according to instructions stored in a non-transitory computer-readable medium.
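A minimal sketch of method 300, assuming a previously trained classifier with a scikit-learn-style predict() interface; the smoothing and feature steps shown are simplified placeholders for processes 304 and 306.

```python
# Illustrative end-to-end sketch of method 300; the processing steps are
# simplified stand-ins for the filtering, contextualization, and feature
# extraction described elsewhere in this disclosure.
import numpy as np

def determine_body_event(raw_window: np.ndarray, trained_model, fs: float):
    # process 302/304: receive a samples x sensors window, smooth it, build features
    smoothed = np.convolve(raw_window.sum(axis=1), np.ones(5) / 5, mode="same")
    rate = np.diff(smoothed) * fs
    features = [smoothed.mean(), smoothed.std(), np.abs(rate).mean(), smoothed.max()]
    # process 306: classify the feature vector as a body event
    return trained_model.predict([features])[0]
```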
[00095] At the process 302, the sensor data from the smart garment 12 is provided to the model. For example, the sensor data are collected from the sensors 18, including the pressure sensors 20. In some embodiments, the sensor data includes at least sensor data from the pressure sensors 20A, 20B and 20C.
[00096] At the process 304, the sensor data is processed in a manner similar to processes 104, 106 and 108, in that process 304 filters, contextualizes and generates features from the sensor data.
[00097] At the process 306, the model determines or identifies and classifies body events based upon the processed sensor data. For example, in embodiments the model identifies or classifies the sensor data as being representative of at least one of the body events described herein.
[00098] Other embodiments use other approaches for identifying and classifying body events based upon the sensor data. For example, during calibration operations of the types described above, body event features or other body event data associated with the different types of body events can be identified (e.g., by trained machine learning models or human labeling) and stored in association with the body events. In effect, each of the different types of body events can be defined, characterized or represented by a set of one or more body event features that are unique to that body event. Sensor data received from the sensors 18 on the smart garment 12, including sensor data from pressure sensors 20 such as pressure sensors 20A, 20B and 20C, may then be compared to the stored body event features to identify and classify the body events represented by the sensor data. Feature extraction, pattern recognition and other processing methodologies may be used in connection with such approaches for identifying and classifying body events.
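The stored-feature comparison approach described above might be sketched as a nearest-template match; the event names and reference feature values below are assumptions for illustration.

```python
# Illustrative stored-feature matching: each event type is represented by a
# reference feature vector captured during calibration, and new data is
# assigned to the nearest one. Values are assumptions.
import numpy as np

EVENT_TEMPLATES = {
    "standing": np.array([1.00, 0.02, 0.01]),   # [mean load, variability, rate]
    "walking":  np.array([0.95, 0.30, 0.40]),
    "sitting":  np.array([0.25, 0.03, 0.01]),
}

def classify_by_template(feature_vector: np.ndarray) -> str:
    distances = {name: np.linalg.norm(feature_vector - ref)
                 for name, ref in EVENT_TEMPLATES.items()}
    return min(distances, key=distances.get)

print(classify_by_template(np.array([0.97, 0.28, 0.38])))   # -> "walking"
```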
[00099] FIG. 21 is a block diagram illustrating physical components (e.g., hardware) of an exemplary computer system 14, in accordance with embodiments. In a basic configuration, the computer system 14 may include at least one processing unit 802 and a system memory 804. Depending on the configuration and type of computer system 14, the system memory 804 may comprise, but is not limited to, volatile storage (e.g., random access memory), non-volatile storage (e.g., read-only memory), flash memory, or any combination of such memories. The system memory 804 may include an operating system 805 and one or more program modules 806 such as a sensing and processing component 820.
[000100] The operating system 805, for example, may be suitable for controlling the operation of the computer system 14. Furthermore, embodiments of the disclosure may be practiced in conjunction with a graphics library, other operating systems, or any other application program and is not limited to any particular application or system. This basic configuration is illustrated in FIG. 21 by those components within a dashed line 808. The computer system 14 may have additional features or functionality. For example, the computer system 14 may also include additional data storage devices (removable and/or non-removable). Such additional storage is illustrated in FIG. 21 by a removable storage device 809 and a non-removable storage device 810.
[000101] As stated above, a number of program modules and data files may be stored in the system memory 804. While executing on the processing unit 802, the program modules 806 (e.g., the sensing and processing component 820) may perform processes including, but not limited to, the aspects, as described herein, e.g., the processing of the sensor data from sensors 18 and the methods 100 and 300.
[000102] Furthermore, embodiments of the disclosure may be practiced in an electrical circuit comprising discrete electronic elements, packaged or integrated electronic chips containing logic gates, a circuit utilizing a microprocessor, or on a single chip containing electronic elements or microprocessors. For example, embodiments of the disclosure may be practiced via a system-on-a-chip (SOC) where each or many of the components illustrated in FIG. 21 may be integrated onto a single integrated circuit. Such an SOC device may include one or more processing units, graphics units, communications units, system virtualization units and various application functionality all of which are integrated (or “burned”) onto the chip substrate as a single integrated circuit. When operating via an SOC, the functionality, described herein, with respect to the capability of client to switch protocols may be operated via application-specific logic integrated with other components of the computer system 14 on the single integrated circuit (chip). Embodiments of the disclosure may also be practiced using other technologies capable of performing logical operations such as, for example, AND, OR, and NOT, including but not limited to mechanical, optical, fluidic, and quantum technologies. In addition, embodiments of the disclosure may be practiced within a general purpose computer or in any other circuits or systems.
[000103] The computer system 14 may also have one or more input device(s) 812 such as visual image sensors, audio sensors, a sound or voice input device, a touch or swipe input device, etc. The output device(s) 814 such as a display, speakers, etc. may also be included. The aforementioned devices are examples and others may be used. The computer system 14 may include one or more communication connections 816 allowing communications with other computing devices 850 (e.g., computing devices 128 and/or 130). Examples of suitable communication connections 816 include, but are not limited to, radio frequency (RF) transmitter, receiver, and/or transceiver circuitry; universal serial bus (USB), parallel, and/or serial ports.
[000104] The term computer readable media as used herein may include computer storage media. Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, or program modules. The system memory 804, the removable storage device 809, and the non-removable storage device 810 are all computer storage media examples (e.g., memory storage). Computer storage media may include RAM, ROM, electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, optical storage, magnetic storage devices, or any other article of manufacture which can be used to store information, and which can be accessed by the computer system 14. Any such computer storage media may be part of the computer system 14. Computer storage media does not include a carrier wave or other propagated or modulated data signal.
[000105] Communication media may be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” may describe a signal that has one or more characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media.
[000106] Smart garments, systems, methods and other aspects of the technology disclosed herein can be used in connection with the systems and methods described in U.S. Provisional Application No. 63/539,182, filed on September 19, 2023 and entitled Physical Demands Characterization System and Methods (the “’182 application”), which application is incorporated herein by reference in its entirety and for all purposes. As one example, the smart garments, systems and methods of the present disclosure can be used in connection with the generation of an assessment of the physical requirements of a job and its tasks as disclosed in the ’182 application. The incorporation of information of the types described herein that can be generated from the smart garments, and inferences that can be drawn from that information, may enhance the capabilities of the systems described in the ’182 application. Additionally or alternatively, a combination of features described in the ’182 application with the garments, systems and methods of this disclosure may enhance the capabilities of the technology described in this disclosure.
[000107] The invention of this application has been described above both generically and with regard to specific embodiments. It will be apparent to those skilled in the art that various modifications and variations can be made in the embodiments without departing from the scope of the disclosure. Thus, it is intended that the embodiments cover the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.

Claims

1. A method for operating a computing system including one or more processors to determine body events of a subject, comprising: receiving, by the one or more processors, data from a plurality of sensors, including data from a plurality of pressure sensors mounted to at least one article positioned on a bottom of a foot of the subject, wherein the data includes first pressure data from a first pressure sensor at a front inside foot position, second pressure data from a second pressure sensor at a front outside foot position, and third pressure data from a third pressure sensor at a rear foot position; processing the data, by the one or more processors, to determine the body event of the subject, including: identifying one or more features associated with the body of the subject based upon the data from the plurality of sensors, including the plurality of pressure sensors mounted to the at least one article; and classifying the one or more features as a body event of the subject, wherein the body event is at least one of a standing event, a weight carrying event, a motion event, a body position event, and a safety event.
2. The method of claim 1, wherein identifying the one or more features includes one or more of:
(i) determining a sum of two or more of the first pressure data, the second pressure data, and the third pressure data, for example to determine a total pressure of the foot of the subject; and/or
(ii) determining a difference between two or more of the first pressure data, the second pressure data, and the third pressure data, for example to determine a distribution of pressure on the foot of the subject.
3. The method of claim 2, wherein identifying the one or more features includes one or more of: (a) determining (i) and/or (ii) above at one time, for example a static pressure determination; and/or (b) determining (i) and/or (ii) above over a non-zero period of time, for example a dynamic or temporal-changes pressure determination.
4. The method of any of claims 1-3, wherein: the method further comprises receiving, by the one or more processors, weight data representative of a weight of the subject; identifying the one or more features comprises determining a weight, optionally including summing the first, second and third pressure data; and classifying the one or more features comprises classifying the one or more features as a standing event, including optionally a leaning event when the determined weight is representative of a weight that is at least as great as the weight of the subject.
5. The method of any of claims 1-4, wherein: the method further comprises receiving, by the one or more processors, weight data representative of a weight of the subject, and identifying the one or more features comprises determining a weight, optionally including summing the first, second and third pressure data; and classifying the one or more features comprises classifying the one or more features as a weight carrying event when the determined weight is representative of a weight that is greater than the weight of the subject.
6. The method of any of claims 1-5, wherein: receiving data from a plurality of sensors comprises: receiving first article data from a plurality of pressure sensors mounted to an article positioned on a bottom of a first foot of the subject; and receiving second article data from a plurality of pressure sensors mounted to an article positioned on a bottom of a second foot of the subject; identifying the one or more features comprises identifying temporal changes between the first article data and the second article data; and classifying the one or more features comprises classifying the one or more features as a motion event based on the identified temporal changes between the first article data and the second article data.
7. The method of claim 6, wherein classifying the one or more features comprises classifying the one or more features as a walking event, a jogging/running event, a biking event, or a dancing event.
8. The method of any of claims 6-7, wherein identifying the one or more features comprises identifying one or more gait parameters, wherein the one or more gait parameters includes one or more of velocity or speed, stride length, stride duration, stance time/percentage, swing time/percentage, step time, step width, step asymmetry, or ground reaction force/pressure.
9. The method of any of claims 1-8, wherein: identifying one or more features comprises determining differences between two or more of the first pressure data, the second pressure data, and the third pressure data; and classifying the one or more features comprises classifying the one or more features as a body position event based on the determined differences between the two or more of the first pressure data, the second pressure data, and the third pressure data.
10. The method of claim 9, wherein classifying the one or more features as a body position event comprises classifying the one or more features as a forward leaning position, a rearward leaning position, or a sideway leaning position.
11. The method of claim 1, wherein classifying the one or more features comprises classifying the one or more features as a safety event.
12. The method of any of claims 1-11, wherein receiving the data comprises receiving the data from a plurality of sensors mounted to at least one of a sock, a footwear insole, or a footwear article such as a shoe or boot.
13. The method of claim 12, wherein receiving the data further comprises receiving data from one or more additional sensors, wherein the one or more additional sensors comprises one or more of an accelerometer, a PPG sensor, a light sensor, a heart sensor, a location sensor, a bioimpedance sensor, an EMG sensor, a GPS sensor, an environmental sensor, a bend sensor, or a stretch sensor, mounted to the article.
14. The method of claim 13, wherein identifying features and/or classifying features comprises identifying and/or classifying features based upon the data received from the one or more additional sensors.
15. The method of any of claims 1-14, wherein processing the data comprises processing the data by a trained model.
16. The method of claim 15, wherein processing the data comprises processing the data by a trained artificial neural network.
17. The method of any of claims 15-16, further comprising training the trained model using body event calibration data.
18. A computer system comprising one or more processors configured to perform the method of any of claims 1-17.
19. One or more garments such as a sock, a footwear insole, or a footwear article such as a shoe or boot, configured to provide the sensor data of any of claims 1-14.

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202463559280P 2024-02-29 2024-02-29
US63/559,280 2024-02-29

Publications (1)

Publication Number Publication Date
WO2025183907A1 true WO2025183907A1 (en) 2025-09-04

Family

ID=94968924

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2025/015687 Pending WO2025183907A1 (en) 2024-02-29 2025-02-13 Smart footwear and system for identifying body events

Country Status (1)

Country Link
WO (1) WO2025183907A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110054359A1 (en) * 2009-02-20 2011-03-03 The Regents of the University of Colorado , a body corporate Footwear-based body weight monitor and postural allocation, physical activity classification, and energy expenditure calculator
US10194837B2 (en) * 2015-05-18 2019-02-05 Vayu Technology Corp. Devices for measuring human gait and related methods of use
US20200100927A1 (en) * 2017-03-22 2020-04-02 Gryppers, Inc. Grip Enhancement and Protection for the Feet
US20200367823A1 (en) * 2018-01-05 2020-11-26 Myant Inc. Multi-functional tubular worn garment
US20220087364A1 (en) * 2020-09-24 2022-03-24 SportScientia Pte. Ltd. Insole layer for monitoring human lower limb and foot performance


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 25711286

Country of ref document: EP

Kind code of ref document: A1