
WO2025160251A1 - Guiding a physical orientation of a user during sports play - Google Patents

Guiding a physical orientation of a user during sports play

Info

Publication number
WO2025160251A1
WO2025160251A1 (PCT/US2025/012712)
Authority
WO
WIPO (PCT)
Prior art keywords
user
orientation
physical orientation
body portion
physical
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2025/012712
Other languages
French (fr)
Inventor
Tyler Michael KENNEY
Zachary Ryan Smith
Robert Dennis LEONARD
Philip Judson BROCK
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Laced LLC
Original Assignee
Laced LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Laced LLC filed Critical Laced LLC
Publication of WO2025160251A1 publication Critical patent/WO2025160251A1/en
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 19/00: Teaching not covered by other main groups of this subclass
    • G09B 19/003: Repetitive work cycles; Sequence of movements
    • G09B 19/0038: Sports
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103: Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 2503/00: Evaluating a particular growth phase or type of persons or animals
    • A61B 2503/10: Athletes

Definitions

  • This disclosure relates to systems for assisting physical alignment and orientation in sports that involve precision targeting of projectiles, such as, but not limited to, golf, baseball, basketball, football, or archery. Specifically, this disclosure pertains to real-time feedback systems using wearable sensors to guide users in achieving alignment with a predetermined target.
  • the techniques described herein relate to a physical orientation system for guiding a user for aiming a sports projectile at a destination, the system including: a sensor module configured to detect a physical orientation of at least one body portion of a user; a processor operatively coupled to the sensor module; and a memory operatively coupled to the processor, the memory configured to store instructions that, when executed by the processor, cause the processor to: receive a sensor signal indicative of a reference orientation of the user relative to a target destination; receive one or more additional sensor signals indicative of an actual orientation of the user in space, relative to the reference orientation; compare the actual orientation to the reference orientation to identify an angular offset of the actual orientation relative to the reference orientation; and when the angular offset is within a predefined range, output a positive alignment indication, and when the angular offset is outside of the predefined range, output a negative alignment indication.
  • the techniques described herein relate to a computer-implemented method for guiding physical orientation of a user for aiming a sports projectile at a destination, including: receiving a sensor signal indicative of a reference orientation of the user relative to a target destination; receiving one or more additional sensor signals indicative of an actual orientation of the user in space, relative to the reference orientation; comparing the actual orientation to the reference orientation to identify an angular offset of the actual orientation relative to the reference orientation; and when the angular offset is within a predefined range, outputting a positive alignment indication, and when the angular offset is outside of the predefined range, outputting a negative alignment indication.
  • the techniques described herein relate to a non-transitory computer readable medium configured to store computer readable instructions that, when read by a processor, cause the processor to execute operations for guiding physical orientation of a user for aiming a sports projectile at a destination, the operations including: receiving a sensor signal indicative of a reference orientation of the user relative to a target destination; receiving one or more additional sensor signals indicative of an actual orientation of the user in space, relative to the reference orientation; comparing the actual orientation to the reference orientation to identify an angular offset of the actual orientation relative to the reference orientation; and when the angular offset is within a predefined range, outputting a positive alignment indication, and when the angular offset is outside of the predefined range, outputting a negative alignment indication.
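  • To make the claimed logic concrete, the following minimal sketch (Python; the function names, the wrap-around convention, and the example 85 to 95 degree window are illustrative assumptions, not values taken from the claims) compares an actual orientation to a reference orientation and outputs a positive or negative alignment indication:

```python
def angular_offset(actual_deg: float, reference_deg: float) -> float:
    """Smallest signed angle (degrees) from the reference orientation to
    the actual orientation, wrapped into (-180, 180]."""
    return (actual_deg - reference_deg + 180.0) % 360.0 - 180.0

def alignment_indication(actual_deg: float, reference_deg: float,
                         lo: float = 85.0, hi: float = 95.0) -> str:
    """Positive indication when the (unsigned) angular offset falls
    inside the predefined range, negative indication otherwise."""
    offset = abs(angular_offset(actual_deg, reference_deg))
    return "positive" if lo <= offset <= hi else "negative"

# Reference heading 10 degrees; the user squares up at 100 degrees,
# i.e., roughly perpendicular to the target line.
print(alignment_indication(100.0, 10.0))  # -> positive (offset = 90)
```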
  • FIG. 1 shows a schematic of an embodiment of a system for determining a physical orientation of a user during sports play.
  • FIG. 2 shows a schematic of an embodiment of a system for determining a physical orientation of a user during sports play.
  • FIG. 3A shows a schematic of an embodiment of a device of a system for determining a physical orientation of a user during sports play.
  • FIG. 3B shows a schematic of orientation axes relative to a body portion.
  • FIG. 4A shows a schematic of an embodiment of at least a portion of a method of using a device for physical orientation of a user.
  • FIG. 4B shows a schematic of an embodiment of at least a portion of a method of using a device for physical orientation of a user.
  • FIG. 4C shows a schematic of an embodiment of at least a portion of a method of using a device for physical orientation of a user.
  • FIG. 4D shows a schematic of an embodiment of at least a portion of a method of using a device for physical orientation of a user.
  • FIG. 5A shows a schematic of an embodiment of at least a portion of a method of using one or more devices for physical orientation of a user.
  • FIG. 5B shows a schematic of an embodiment of at least a portion of a method of using one or more devices for physical orientation of a user.
  • FIG. 5C shows a schematic of an embodiment of at least a portion of a method of using one or more devices for physical orientation of a user.
  • FIG. 5D shows a schematic of an embodiment of at least a portion of a method of using one or more devices for physical orientation of a user.
  • FIG. 6 shows an embodiment of a user interface for displaying a location of one or more sensor modules.
  • FIG. 7 shows an embodiment of a user interface for displaying various user statistics.
  • FIG. 8 shows an embodiment of a user interface for displaying a reference orientation of the user.
  • FIG. 9 shows an embodiment of a user interface for displaying an actual orientation of the user.
  • FIG. 10 shows an embodiment of a user interface for displaying or receiving an output of a projectile launch.
  • FIG. 11 shows a flow diagram of a physical orientation method for aiming a sports projectile at a destination.
  • Orientation setup in athletic performance is fundamental to an athlete's success, irrespective of the sport. Some factors influencing orientation include identification of the intended target and the athlete's body alignment relative to the intended target. These factors play a role in launching a sports projectile toward a destination. For instance, each athlete's eyesight and dominant eye significantly alter their line of sight, making it challenging to execute physical movements effectively at each aiming opportunity. A general, one-size-fits-all solution fails to address these nuances.
  • the devices and methods described herein solve the above technical problems by using training methods that integrate individualized visual and physical elements to improve aim at a target, acknowledging the individuality of each athlete's physical and perceptual characteristics.
  • waistband-based orientation systems use a single sensor placed in a waistband to measure rotational orientation. These systems provide angular offset feedback parallel to the target using haptic cues.
  • this approach has significant limitations.
  • these waistband-based orientation systems suffer from axis variability, meaning that as the user bends or moves, the axis of the sensor changes, leading to inaccurate rotational readings.
  • conventional systems take a limited approach to the target. In other words, evaluating angular offset parallel to the target does not account for the holistic body alignment that is used for sports like golf.
  • Static alignment tools, like alignment sticks and other manual aids, require setup, provide no real-time feedback, are prone to movement during use, and are dependent on a user's visual perception.
  • Conventional alignment analysis systems lack the ability to calculate a composite orientation, leading to less accurate guidance for users with complex body movements.
  • the devices and systems described herein solve the above technical problems with technical solutions.
  • the devices and systems described herein provide for improved aiming of sports projectile(s) at a target or a destination.
  • the devices and methods described herein offer real-time physical orientation feedback using one or more sensor modules (e.g., an orientation sensor, inertial measurement unit, gyroscope, positional sensor, angular sensor, etc.).
  • the one or more sensor modules may be wearable on one or both feet of a user.
  • the devices and methods may generate an angular offset relative to a target or destination based on measurements from one or more sensor modules.
  • the devices and methods may generate a substantially perpendicular angular offset relative to a target or destination based on measurements from one or more orientation sensor modules on the one or both feet of a user.
  • the angular offset may be individualized such that the offset is non-perpendicular.
  • the angular offset may be individualized based on user preference and/or user history.
  • using an angular offset based on one or more sensor modules coupled, for example, to one or both feet of a user may improve alignment precision by reducing inaccuracies caused by body movements, such as bending (e.g., from a sensor placed on a torso of a user), that can affect rotational accuracy.
  • one or more additional sensors may be used for calibration or confidence level determinations.
  • additional sensors may be in a similar location as the orientation determination sensor(s) (e.g., foot, leg, ankle, etc.) or at a separate location from the orientation determination sensor(s) (e.g., torso, leg, arm, hip, etc.).
  • the devices and methods described herein may also employ two or more, or a plurality of, sensors to calculate or determine a composite orientation.
  • a composite orientation may simplify user feedback, instead of, for example, providing feedback to a user for each sensor of two or more or a plurality of sensors, which in some instances, may confuse a user.
  • users may receive real-time feedback in the form of visual, haptic, or auditory signals, which may guide the user toward achieving alignment with the target.
  • the system may store performance data (e.g., user or individualized performance data or performance data related to desired movement or alignment patterns) or receive performance data (e.g., using an application programming interface (API)), enabling trend analysis, customized training plans, and/or individualized alignment (e.g., angular offset individualized for the user).
  • the devices and methods described herein solve the technical problems experienced with conventional systems, such as errors in rotational data caused by mobile or highly mobile body portions (e.g., torso) during a sport process, by utilizing a dynamic, portable, sensorized device or system coupled to, for example, one or both feet, one or both legs, or the like.
  • the devices and methods described herein may be positioned on one or both feet of a user, such that the impact of swinging, bending, or other torsional movements of the user during setup or game play may have a reduced effect on reference orientation determination, actual orientation determination, and/or angular offset determination.
  • the devices and methods described herein may determine a reference orientation, actual orientation, and/or angular offset of the body portion, for example one or both feet, relative to a ground plane using a single axis (i.e., about a yaw axis as shown in FIG. 3B).
  • the devices and methods described herein may additionally determine a reference orientation, actual orientation, and/or angular offset of the body portion about two or more axes, for example one or both of a pitch axis or tilt axis, as shown in FIG. 3B, to account for a slope or unevenness of a terrain on which the user is positioned (or standing).
  • the devices and methods described herein may be used across sports, for example golf (e.g., target destination may be a location on a fairway, a flag, a cup, a location on a green, etc.), baseball (e.g., target destination may be a left field location, a plate, a right field location, a center field location, a glove, etc.), tennis (e.g., target destination may be a left baseline, a right baseline, a middle centerline, etc.), pickleball, etc.
  • FIG. 1 shows a schematic of an embodiment of an example system 100 for determining a physical orientation of a user during sports play.
  • a system 100 may include a physical orientation device 110, a sensor module 120, a processor 130, a memory 140, and optionally an input device 134.
  • the sensor module 120 may be at least partially within the physical orientation device 110, or the sensor module 120 may be communicatively or electrically (e.g., Bluetooth, Wi-Fi, wireless, near-field, etc.) coupled to the physical orientation device 110.
  • the physical orientation device 110 may be operatively or communicatively coupled (e.g., using a wired or wireless connection) to the processor 130 and the memory 140.
  • the processor 130 and the memory 140 are at least partially within the physical orientation device 110.
  • the processor 130 and the memory 140 may be at least partially within a computing device communicatively coupled to the physical orientation device 110.
  • the physical orientation device 110 may be removably coupled to a body portion or at least a portion of a body of a user.
  • The body portion to which the physical orientation device 110 may be coupled can be displayed on a graphical user interface (GUI) 600. For example, GUI 600 shows two example locations (of a plurality of locations) for placement: a sensor module 610 on the shoes and/or a sensor module 620 on the socks.
  • a status of each device may also be displayed on the GUI 600.
  • a first sensor module 612 and a second sensor module 614 may be shown with a status, for example “ready”, “paired”, “unpaired”, “not ready”, “not detected”, “detected”, “connected”, “not connected”, etc.
  • the GUI may be grayed out or dimmed and the first sensor module 616 and the second sensor module 618 may not show a status or may show a status indicating that these sensor locations are not in use.
  • a physical orientation device 110 may be removably coupled to, or integrated into, a sock, a footwear, an insole, a lace of a footwear, a tongue of a footwear, a knee brace, a portion of a legwear (e.g., shorts, pant leg, etc.), a belt loop, a portion of a belt, a pocket, a headwear (e.g., hat, visor, headband, helmet, etc.), a collar, a sleeve of a clothing item, a portion of a face-worn item (e.g., hearing aid, glasses, sunglasses, etc.), a neckband or necklace, an ear-worn device (e.g., an earpiece, earbud), a portion of a chest strap (like heart rate monitors), a portion of a shoulder harness, etc.
  • a physical orientation device 110 may be removably coupled to a foot of a user, for example, directly to the foot or indirectly using a shoe, bracelet, sock, or other item coupled to the foot of the user.
  • a first physical orientation device 110 may be removably coupled to a first foot of a user and a second physical orientation device 110 may be removably coupled to a second foot of a user.
  • either or both can be directly coupled to the foot or indirectly coupled to the foot using a shoe, bracelet, sock, or other item coupled to the foot of the user.
  • the physical orientation device 110 may be removably couplable to a portion of a back of a user (e.g., between shoulder blades, upper back, etc.), a portion of a sternum of a user, a foot, an ankle, a lower leg portion, a portion of a hip, a portion of a thigh, a portion of a torso, etc.
  • the physical orientation device 110 may be integrated into an adhesive skin patch.
  • a position of a physical orientation device 110 on a user may be dependent on a use case (i.e., what needs to be measured or determined) of the physical orientation device 110. If the physical orientation device 110 is measuring an orientation of a body portion in space, the physical orientation device 110 may be coupled to a foot, a leg, a shoulder, an elbow, etc. of a user. If a first physical orientation device 110a is measuring a confidence level or calibrating a second physical orientation device 110b, the first physical orientation device 110a may be coupled to a same location or a different location as the second physical orientation device 110b (e.g., a foot, a leg, a shoulder, an elbow, a torso, a chest, etc.).
  • the physical orientation device 110 may include a magnetic attachment.
  • the physical orientation device 110 may include a hook-and-loop mounting system.
  • the physical orientation device 110 may include a clip-on mechanism.
  • the physical orientation device 110 may include an attachment point or multiple attachment points.
  • the physical orientation device 110 may be a wearable device, a virtual reality device, a wearable device operatively coupled to a computing device, a wearable device operatively coupled to a server, or a nonwearable device (e.g., camera or image sensor, rangefinder, etc.) manipulated by a user.
  • a physical orientation device 110 is shown, one of skill in the art will appreciate that any number of physical orientation devices 110 may be used. For example, a user may don a physical orientation device 110, two physical orientation devices 110, three physical orientation devices 110, four physical orientation devices 110, one to five physical orientation devices 110, five to ten physical orientation devices 110, etc.
  • a first physical orientation device may be donned by a user and a second physical orientation device may be used by, held by, or otherwise manipulated by a user but not donned by the user.
  • the sensor module 120 may function to sense, detect, or otherwise measure a reference orientation of the user relative to a target destination.
  • the sensor module 120 may function to sense, detect, or otherwise measure an actual orientation of the user relative to a target destination.
  • the measurement may be made on-demand (e.g., based on input at an optional input device 134 or a radio frequency identification (RFID) tag) or automatically, for example based on activation of an application stored on the physical orientation device 110 or on a paired computing device, or based on a sensed location (e.g., using a global positioning system (GPS) sensor, low-energy Bluetooth, etc.).
  • An optional input device 134, for example for receiving an input to activate one or more physical orientation devices or to otherwise determine a reference orientation independent of the one or more physical orientation devices, may be an electronic divot fixer, a rangefinder, an electronic ball marker, an electronic golf glove, a mobile computing device, a wearable device (e.g., watch), a movement detection device (e.g., detection of a foot tap, movement of a finger ring, detection of a club movement, tap of a smart grip, detection of a pattern of movement), a microphone of a computing device (e.g., receiving a verbal command), a club tap device, a specific aim device used to determine a target (e.g., a sensor attached to or integrated into a rangefinder), an electronic marker on a course or field, a course map API that suggests an aim point, etc.
  • An optional input device 134 may also include or be described as a smart device that can perform more than its basic functions and/or is connected to the internet.
  • the optional input device 134 is a mobile computing device.
  • the optional input device 134 is an electronic divot fixer.
  • the optional input device 134 is a rangefinder.
  • the sensor module 120 may include, but not be limited to, one or more of, or a combination of, an inertial measurement unit (IMU), gyroscope, magnetometer, accelerometer, a radar, a laser, an ultra-wideband protocol, a potentiometer, a rotary encoder (e.g., optical, magnetic, etc.), a direct gear-based angle measurement, a ball-and-socket position tracking, a pendulum-based tilt measurement, a physical protractor with position sensor, an electronic inclinometer, a MEMS angular rate sensor, a micro-electro-mechanical system (MEMS), a multi-axis accelerometer, a camera (e.g., for motion tracking), an infrared (IR) sensor, etc.
  • the sensor module 120 includes an IMU. In some embodiments, the sensor module 120 includes an IMU with or without a gyroscope. In some embodiments, the sensor module 120 includes an IMU with or without a magnetometer. In some embodiments, the sensor module 120 includes a multi-axis accelerometer. In some embodiments, the sensor module 120 includes an encoder.
  • the processor 130 may cause activation or inactivation of the sensor module 120 to initiate or halt a sensing activity of the sensor module.
  • the memory 140 may store instructions that are readable and executable by the processor 130. For example, as shown and described in connection with FIG. 3A, the processor 130 may execute a calibration process, using a calibration module 392; an orientation process, using orientation determination module 394; a performance assessment, using optional performance analyzer 396; and/or output an alignment indication 398.
  • processor 130 and/or memory 140 are integrated into physical orientation device 110. In some embodiments of system 100, processor 130 and/or memory 140 are remote from the physical orientation device 110. For example, the processor 130 and/or memory 140 may be located in a communicatively coupled (e.g., wire or wireless connection) mobile computing device (e.g., mobile phone, laptop, etc.), a server, a workstation, a wearable device, etc.
  • FIG. 2 shows an example schematic of system 100 including a first physical orientation device 110a and a second physical orientation device 110b. Although two physical orientation devices are shown in FIG. 2, and one physical orientation device 110 is shown in FIG. 1, one of skill in the art will appreciate that a plurality of physical orientation devices may also be used.
  • the first physical orientation device 110a may include a first sensor module 120a
  • the second physical orientation device 110b may include a second sensor module 120b.
  • the first physical orientation device 110a and the second physical orientation device 110b may be operatively or communicatively coupled to one or more processor(s) 130 and one or more memory(ies) 140.
  • the one or more processors 130 may be an advanced reduced instruction set computer (RISC) machine (ARM) processor, a digital signal processor (DSP), a microcontroller, and the like.
  • the one or more memory(ies) 140 may include a non-transitory computer readable medium that stores computer readable instructions for execution by the one or more processors 130.
  • the computer readable instructions executable by the one or more processors 130 may include a method, for example as shown in FIG. 11 and described elsewhere herein.
  • first physical orientation device 110a may include a processor 130 and a memory 140
  • second physical orientation device 110b may include a processor 130 and a memory 140
  • each physical orientation device 110 may include a processor 130 but share a memory 140
  • each physical orientation device 110 may include a memory 140 but share a processor 130.
  • the memory 140 and/or processor 130 may be integrated into the physical orientation device or remote from the physical orientation device, as described elsewhere herein.
  • the signal processing unit 390 may extract orientation data from the sensor data.
  • the signal processing unit 390 may optionally filter out noise in the sensor data.
  • the signal processing unit 390 may optionally enhance one or more features of the sensor data.
  • the signal processing unit 390 may output a processed signal for processing by processor 330 based on instructions stored in memory 340.
  • the processor 330 may read instructions from a computer-readable medium stored in memory 340, such that any of the processes described herein are computer-implemented processes or methods.
  • a calibration process may include, at block S1110 of FIG. 11, receiving a sensor signal, from the sensor module, indicative of a reference orientation 452 of a user 450 relative to a target destination 454.
  • the target destination may be a physical destination.
  • the target destination may be a landing location (e.g., on a fairway, on a green, in an infield, in an outfield, etc.) for a projectile launched by the user.
  • the target destination may be a retaining destination (e.g., a net, field goal posts, etc.).
  • the target destination may be a relationship to one or more other players in a sport, for example a relationship between a pitcher on a pitching mound to a catcher and their mitt behind home plate.
  • the target destination may be a digital marker, for example a technology integrated golf course flag or yardage marker.
  • the target destination may be a GPS location.
  • the target destination may be determined using a standalone rangefinder or a rangefinder integrated into a club, bat, racket, divot fixer, etc.
  • the reference orientation 452 may be indicative of at least a portion of a user 450 aligned with a target destination 454, as shown in FIG. 4A.
  • a target destination 454 is a golf cup, as shown in FIG. 4A
  • a target destination may also be a landing location of a projectile (e.g., ball), for example, on a court, in a field, on a fairway, etc., a retaining location, a physical location, a digital location, and the like.
  • the target destination 454 may be a scoring or retaining destination (e.g., hoop, cup, hole, etc.) or it may be a longitudinal and/or latitudinal destination indicated by a user (e.g., based on sighting), received by an API (and displayed on a map), or otherwise.
  • the reference orientation 452 may be a vector (i.e., magnitude is incorporated into the determination of the reference orientation) in some embodiments.
  • the reference orientation 452 may be a line (i.e., magnitude is not incorporated into the determination of the reference orientation) in some embodiments.
  • a user may face, orient, or align themselves with a target destination 454 (e.g., a basket, goal, a flag associated with a cup, a cup, location, etc.) and an input device may be activated manually (e.g., by the user) or automatically.
  • a user may face, orient, or align themselves with a target destination 454 (e.g., a basket, goal, a flag associated with a cup, a cup, location, etc.) and a reference orientation 452 may be automatically set when the sensor module 320 detects a cessation or reduction in movement of the user.
  • the device 310 may prompt the user, when it detects a reduction or cessation in movement, to set the reference orientation 452 (e.g., by receiving an input).
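  • A minimal sketch of such automatic reference capture, assuming a 0.5 degree-per-second stillness threshold and a 50-sample dwell window (both invented for illustration; the disclosure does not fix these values):

```python
from collections import deque

class ReferenceCapture:
    """Captures a reference heading once the user stops moving."""

    def __init__(self, threshold_dps: float = 0.5, dwell_samples: int = 50):
        self.threshold = threshold_dps  # below this rate counts as "still"
        self.window = deque(maxlen=dwell_samples)

    def update(self, gyro_rate_dps: float, heading_deg: float):
        """Feed one sensor sample; returns the captured reference heading
        once movement has stayed below the threshold for a full window,
        otherwise None."""
        self.window.append(abs(gyro_rate_dps))
        still = (len(self.window) == self.window.maxlen
                 and max(self.window) < self.threshold)
        return heading_deg if still else None
```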
  • a user 450 may align themselves with a target destination 454, 456 including a flag associated with a cup or align themselves with the cup to generate a reference orientation 452.
  • a physical orientation device (e.g., device 110) may be coupled to a lower extremity 458 of the user, and the lower extremity 458 may be aligned with the target destination 454.
  • the sensor signal may be received based on a manual input or automatically, as described elsewhere herein.
  • the reference orientation may be determined by the calibration module 392 relative to a gravitational ground plane 318 about a yaw axis 312, as shown in FIG. 3B.
  • the calibration module 392 may determine or measure an angle of rotation of the body portion 358 about the yaw axis 312, using for example Euler angles, thereby indicating a heading direction of the body portion 358 in three-dimensional Cartesian space.
  • the heading direction may be used to determine or indicate, or may be substantially the same as, a reference orientation line, in some embodiments.
  • although a foot is shown in FIG. 3B, a body portion 358 may also be other than a foot, for example a leg portion, torso, arm, shoulder, etc.
  • the reference orientation may be determined by the calibration module 392 about a yaw axis 312 and one or both of a pitch axis 316 or a roll axis 314, relative to a gravity vector 311 (i.e., vertical axis), as shown in FIG. 3B.
  • the calibration module 392 may determine or measure an angle of rotation of the body portion 358 about the yaw axis 312, thereby indicating a heading direction of the body portion in three-dimensional Cartesian space.
  • the calibration module 392 may further determine an angle of rotation of the body portion 358 about the pitch axis 316, indicating, for example, a slope of the ground plane 318 about the pitch axis 316.
  • the calibration module 392 may further determine an angle of rotation of the body portion 358 about the roll axis 314, indicating, for example, a slope of the ground plane 318 about the roll axis 314.
  • the rotation about the yaw axis 312 and about one or both of the pitch axis 316 or roll axis 314 may be used to determine or indicate a reference orientation line, in some embodiments.
  • the yaw angle, pitch angle, and/or roll angle can be used either individually, or used in a composite of the localized x (roll), y (pitch), and z (yaw) axes to generate a reference orientation value, as shown in FIG. 3B.
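  • As one hedged illustration of composing the localized axes into a single heading (the Z-Y-X Euler convention below is an assumption; the disclosure does not fix one), the forward axis of the body portion can be rotated by the measured angles and projected onto the ground plane:

```python
import math

def heading_from_euler(yaw_deg: float, pitch_deg: float, roll_deg: float) -> float:
    """Compass heading of the body-frame forward axis (1, 0, 0) after a
    yaw-then-pitch rotation, projected onto the ground plane."""
    _ = roll_deg  # roll spins the body portion about its own forward
                  # axis, so it does not move the projected heading
    y, p = math.radians(yaw_deg), math.radians(pitch_deg)
    fx = math.cos(y) * math.cos(p)   # forward component, ground-plane x
    fy = math.sin(y) * math.cos(p)   # forward component, ground-plane y
    return math.degrees(math.atan2(fy, fx))

# A 5-degree slope (pitch) leaves the 30-degree heading unchanged.
print(heading_from_euler(30.0, 5.0, 2.0))  # -> ~30.0
```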
  • an orientation of a device on a user may be irrelevant such that Cartesian axes may be determined using the device regardless of the orientation of the device on a user.
  • a reference orientation may be determined by the calibration module 392 using GPS.
  • a physical orientation device or a computing device may receive a location of a target destination (e.g., using an API) and a location of the user, such that a connecting line, a reference orientation, may be determined between the two locations.
  • FIG. 8 shows an exemplary, non-limiting GUI 800 displayed when a calibration process is being executed by calibration module 392.
  • GUI 800 may illustrate one or more target destinations 854, 856 with respect to a background 840, for example a field, fairway, green, ballpark, etc.
  • a first reference orientation 852 (and first actual orientation and first angular offset) may be generated with respect to a first target destination 854; and then a second reference orientation 850 (and second actual orientation and second angular offset) may be generated with respect to a second target destination 856.
  • the first reference orientation 852 and/or the second reference orientation 850 may be a composite reference orientation or a reference orientation based on system parameters, user input, number of sensors used, etc., for example.
  • GUI 800 may illustrate a prior or historical projectile launch or consistency representation 848.
  • the reference orientation 852 may be with respect to a user location, for example represented by a body portion, for example a first foot 844 and a second foot 846. Although two body portions are shown, it is contemplated here that a body portion, two body portions, three body portions, a plurality of body portions, etc. may be shown.
  • the processor 330 may execute an orientation determination process using an orientation determination module 394.
  • the orientation determination process may include monitoring an actual orientation 460 of the user 450 relative to the reference orientation 452, as shown in FIG. 4B.
  • the processor 330 may receive one or more additional sensor signals, from the sensor module 320, indicative of an actual orientation 460 of the user 450 in space, relative to the reference orientation 452.
  • the actual orientation may be determined by the orientation determination module 394 relative to a gravitational ground plane 318 about a yaw axis 312, as shown in FIG. 3B.
  • the orientation determination module 394 may determine or measure an angle of rotation of the body portion 358 about the yaw axis 312, using for example Euler angles, thereby indicating a heading direction of the body portion 358 in three-dimensional Cartesian space.
  • the heading direction may be used to determine or indicate, or may be substantially the same as, an actual orientation line, in some embodiments.
  • although a foot is shown in FIG. 3B, a body portion 358 may also be other than a foot, for example a leg portion, torso, arm, shoulder, etc.
  • the actual orientation may be determined by the orientation determination module 394 about a yaw axis 312 and one or both of a pitch axis 316 or a roll axis 314, relative to a gravity vector 311 (i.e., vertical axis), as shown in FIG. 3B.
  • the orientation determination module 394 may determine or measure an angle of rotation of the body portion 358 about the yaw axis 312, thereby indicating a heading direction of the body portion in three-dimensional Cartesian space.
  • the orientation determination module 394 may further determine an angle of rotation of the body portion 358 about the pitch axis 316, indicating a slope of the ground plane 318 about the pitch axis 316.
  • the orientation determination module 394 may further determine an angle of rotation of the body portion 358 about the roll axis 314, indicating, for example, a slope of the ground plane 318 about the roll axis 314, a user lifting one or more toes off the ground plane, etc.
  • the rotation about the yaw axis 312 and about one or both of the pitch axis 316 or roll axis 314 may be used to determine or indicate an actual orientation line, in some embodiments.
  • the yaw angle, pitch angle, and/or roll angle can be used either individually, or used in a composite of the localized x (roll), y (pitch), and z (yaw) axes to generate an actual orientation value, as shown in FIG. 3B.
  • an orientation of a device on a user may be irrelevant such that Cartesian axes may be determined using the device regardless of the orientation of the device on a user.
  • Euler angles may be used.
  • quaternion angles may be used, optionally for example when avoidance of gimbal lock (two axes are parallel to one another) may be advantageous.
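  • For example, the yaw (heading) angle can be recovered directly from a unit quaternion without converting through Euler angles, which sidesteps the gimbal-lock case; this sketch assumes (w, x, y, z) component order, which the disclosure does not specify:

```python
import math

def yaw_from_quaternion(w: float, x: float, y: float, z: float) -> float:
    """Heading about the vertical axis, in degrees, for a unit quaternion."""
    return math.degrees(math.atan2(2.0 * (w * z + x * y),
                                   1.0 - 2.0 * (y * y + z * z)))

# Quaternion for a pure 30-degree yaw: w = cos(15 deg), z = sin(15 deg).
print(yaw_from_quaternion(0.9659, 0.0, 0.0, 0.2588))  # -> ~30.0
```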
  • the processor 330 may determine an angular offset 462 of the actual orientation 460 relative to the reference orientation 452. Said another way, as shown in block S1130 of FIG. 11, the processor 330 may compare the actual orientation 460 to the reference orientation 452 to identify an angular offset 462 of the actual orientation 460 from the reference orientation 452.
  • an angular offset for example, may be decreased when it is determined that a user is on a sloped ground plane (about the roll axis), although the opposite may also be true (angular offset may be increased).
  • an angular offset for example, may be increased when it is determined that a user is on a sloped ground plane (about the pitch axis), although the opposite may also be true (angular offset may be decreased).
  • the actual orientation 460 may be a line (i.e., magnitude is not incorporated into the determination of the actual orientation) nonparallel to the reference orientation, indicating a current alignment of the user relative to the reference orientation 452.
  • the actual orientation 460 may be a vector, such that a magnitude may be incorporated into the determination of the actual orientation.
  • the angular relationship(s) between the actual orientation and the reference orientation may be independent of fixed points in space.
  • the orientation determination process may further include, as shown in block S1140 of FIG. 11, determining whether the angular offset 462 is within a predefined range.
  • the method may include substantially continuously receiving the one or more additional sensor signals until the angular offset is within a predefined range.
  • the predefined range may be based on user preferences, a club selection of a user, a statistical model of the user, or the like. For example, as shown in FIG. 4B, the predefined range may be from about 75 degrees to about 105 degrees, from about 80 degrees to about 100 degrees, from about 85 degrees to about 95 degrees, about 90 degrees, etc.
  • the angular offset 462 may be individualized for the user.
  • an angular offset of 90 degrees may be an idealized angular offset
  • an individualized angular offset may be offset from 90 degrees.
  • an individualized angular offset for a first user may be about 70 degrees to about 75 degrees
  • an individualized angular offset for a second user may be about 85 degrees to about 90 degrees.
  • the calibration module 392 and/or the orientation determination module 394 may receive (e.g., through an API or generated based on user reported outcomes, as shown in FIG. 10) a statistical model of the user.
  • the statistical model of the user may be a Gaussian distribution, a repeated measures (RM) map, a statistically weighted map or model, or a scatter plot, for example, that maps outcomes (e.g., did the projectile hit the target, fade to the left or the right of the target, draw to the right or the left of the target, hook to the right or the left of the target, slice to the left or the right of the target, or did the user miss contact with the projectile, also called a shank, as shown in FIG. 10) versus an actual orientation 460 or an angular offset 462 of the user.
  • the statistical model may further map outcomes based on striking tool selection and angular offset for the selected striking tool.
  • the statistical model may be generated in real-time, over time, during normal play by a user, etc.
  • the statistical model may be generated using a simulator, for example, that the user interacts with to determine a user’s individualized outcomes based on their striking tool selection (e.g., type of club, bat, racket, etc.) and/or relative to their actual orientation and/or their angular offset.
  • the statistical model may be machine-generated based on idealized outcomes or based on another individual, for example a pro-golfer, a coach, an athlete, etc.
  • the calibration module 392 and/or the orientation determination module 394 may receive a statistical model of the user; and determine an angular offset 462, and thereby an actual orientation 460, individualized for the user, based on the received statistical model.
  • the statistical model may be represented by a look up table, such that a striking tool type (e.g., club type) is related to an angular offset or an actual orientation within the look up table.
  • an actual angular offset of a user may differ from the individualized angular offset that is based on the statistical model (and represented in the look up table).
  • the user may be guided (e.g., based on visual cues, haptic cues, auditory cues, or a combination thereof in an application, using the physiological sensor device, or the like) to adjust their actual orientation, and thus their angular offset, so that the actual angular offset approximates or substantially matches the individualized angular offset of the statistical model.
  • each actual angular offset (i.e., from each device) or a composite angular offset may be mapped or normalized to the statistical model, and the user guided to an adjusted actual orientation so that an adjusted actual angular offset approximates or substantially matches the individualized angular offset of the statistical model.
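  • A hedged sketch of the look-up-table idea, with club names, offsets, and cue strings invented for illustration (in practice these values would come from the user's statistical model):

```python
# Hypothetical per-club target offsets learned from the user's history.
INDIVIDUALIZED_OFFSET_DEG = {
    "driver": 88.0,   # e.g., this user tends to slice with the driver
    "7-iron": 90.0,
    "wedge":  91.5,
}

def guide(club: str, actual_offset_deg: float, tolerance_deg: float = 1.5) -> str:
    """Compare the measured offset to the individualized target for the
    selected striking tool and return an alignment cue."""
    target = INDIVIDUALIZED_OFFSET_DEG.get(club, 90.0)
    error = actual_offset_deg - target
    if abs(error) <= tolerance_deg:
        return "positive alignment indication"
    direction = "closed" if error > 0 else "open"
    return f"adjust stance {direction} by {abs(error):.1f} degrees"

print(guide("driver", 92.3))  # -> adjust stance closed by 4.3 degrees
```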
  • a striking tool selection may be based on data from a rangefinder communicatively coupled to, or integrated with, a sensor module.
  • the rangefinder may emit a laser pulse towards a target destination, measure the time it takes for the reflected pulse to return, and determine a distance based on the speed of light and the time interval (time for laser beam to travel to the target and back), thereby enabling or enhancing striking tool selection by a user.
  • sensor data from the sensor module may also be used in striking tool selection, such that the slope may impact the distance to the target and/or a type of striking tool that may be used based on the slope.
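  • The time-of-flight relation above reduces to distance = (speed of light x round-trip time) / 2, as in this minimal sketch:

```python
C = 299_792_458.0  # speed of light in a vacuum, m/s

def distance_m(round_trip_seconds: float) -> float:
    """One-way distance to the target from the laser pulse's
    out-and-back travel time."""
    return C * round_trip_seconds / 2.0

# A pulse returning after ~1 microsecond puts the target ~150 m away.
print(f"{distance_m(1e-6):.1f} m")  # -> 149.9 m
```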
  • FIG. 9 shows an exemplary, non-limiting GUI 900 displayed when an orientation determination process is being executed by orientation determination module 394.
  • GUI 900 may illustrate a target destination 954 and an actual orientation 960 with respect to the target destination 954.
  • GUI 900 may display a composite angular offset 920 and/or an actual angular offset 918, 922 of one or more body portions.
  • a composite angular offset 920 of zero may represent an actual orientation 960 of about 90 degrees relative to a reference orientation.
  • an angular offset 922 of a first body portion 944 may represent an actual orientation 960 of about 92 degrees relative to a reference orientation.
  • an angular offset 918 of a second body portion 946 may represent an actual orientation 960 of about 88 degrees relative to a reference orientation.
  • the actual orientation 960 may be with respect to a user location, for example represented by a body portion 944 including, for example a first foot and/or a body portion 946 including a second foot.
  • the processor 330 may output a positive alignment indication 398 when the angular offset 462 is within a predefined range, as shown in FIG. 4B.
  • the processor 330 may output a negative alignment indication 398 when the angular offset 462 is outside of the predefined range, as shown in FIGs. 4C-4D.
  • the predefined range may be individualized for the user, as described above.
  • the angular offset 462 may be outside the predefined range when the angular offset 462 is greater than about 90 degrees, greater than about 95 degrees, greater than about 100 degrees, greater than about 110 degrees, etc. as shown in FIG. 4D.
  • the angular offset 462 may be outside the predefined range when the angular offset 462 is about 80 degrees to about 90 degrees, about 85 degrees to about 95 degrees, about 90 degrees to about 100 degrees, about 90 degrees to about 95 degrees, about 95 degrees to about 100 degrees, about 100 degrees to about 105 degrees, about 105 degrees to about 110 degrees, etc. as shown in FIG. 4D.
  • the angular offset 462 may be outside the predefined range when the angular offset 462 is less than about 90 degrees, less than about 85 degrees, less than about 80 degrees, less than about 75 degrees, etc. as shown in FIG. 4C.
  • the angular offset 462 may be outside the predefined range when the angular offset 462 is about 80 degrees to about 90 degrees, about 80 degrees to about 85 degrees, about 75 degrees to about 80 degrees, about 70 degrees to about 75 degrees, about 65 degrees to about 70 degrees, etc. as shown in FIG. 4C.
  • the alignment indication 398 may include, but not be limited to, a visual indication, an auditory indication, a haptic indication, or a combination thereof.
  • a visual indication is a numerical indicator on a GUI.
  • a visual indication is a light on the physical orientation device.
  • a haptic indication is a vibration mechanism (e.g., piezoelectric device, etc.) on or in the physical orientation device.
  • a haptic indication is a vibration mechanism (e.g., piezoelectric device, etc.) in another wearable or mobile device (e.g., smartwatch, phone, etc.) communicatively coupled to the physical orientation device.
  • the alignment indication 398 may be configurable to be provided on an individual device, multiple devices (e.g., simultaneously, asynchronously, sequentially, etc.), or a separate computing device when present.
  • one or more additional devices 310 may be worn by the user to calculate a confidence level of the reference orientation, the actual orientation, and/or angular offset.
  • a physical orientation device 310 may be used to determine an angular offset and to determine a confidence level of the measured reference orientation, actual orientation, and/or angular offset.
  • the processor 330 may determine a confidence level by comparing the difference between a primary sensor orientation (e.g., determined by a physical orientation device) and one or more additional or secondary sensor orientations (e.g., determined by one or more additional devices) using a confidence function.
  • This confidence function may use thresholding, linear or cubic scales, or a pass/fail algorithm based upon its configuration.
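  • A minimal sketch of such a configurable confidence function, assuming a 10-degree disagreement limit and the scale shapes named above (the limit and the [0, 1] output range are illustrative choices, not taken from the disclosure):

```python
def confidence(primary_deg: float, secondary_deg: float,
               mode: str = "linear", limit_deg: float = 10.0) -> float:
    """Map the disagreement between the primary and a secondary sensor
    orientation to a confidence value in [0, 1]."""
    # Smallest unsigned angle between the two orientations.
    diff = abs((primary_deg - secondary_deg + 180.0) % 360.0 - 180.0)
    if mode == "pass_fail":
        return 1.0 if diff <= limit_deg else 0.0
    frac = min(diff / limit_deg, 1.0)
    if mode == "cubic":
        return 1.0 - frac ** 3   # forgiving near zero, steep near the limit
    return 1.0 - frac            # linear scale

print(confidence(90.0, 92.5))  # -> 0.75 (2.5 of 10 degrees disagreement)
```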
  • a confidence level may be calculated based on a combination of sensor specifications, real-time measurements, and/or error models.
  • a confidence level may quantify a reliability of the sensor data.
  • One or more factors that may affect sensor confidence levels include a noise density (i.e., random error in measurements) of sensor readings, a standard offset or a variance of outputs from the sensor, a bias (e.g., a constant or consistent offset) in sensor readings, a drift (e.g., gradual changes over time) in sensor readings, a resolution of the sensor, a sensitivity of the sensor, environmental factors (e.g., temperature variations, vibrations, magnetic disturbances, etc.), compensated sensors (e.g., temperature compensation), etc.
  • Confidence level of sensor measurements may be calculated using a covariance matrix (e.g., Kalman Filtering). For example, a Kalman filter estimates the state of a system (e.g., position, orientation) and calculates the covariance matrix, representing the confidence level for each variable. In some embodiments, the smaller the covariance, the higher the confidence in the estimate. Confidence level of sensor measurements may be calculated using measurement residuals or a difference between the sensor's expected output and actual measurement (residual). In some embodiments, the smaller the residuals, the higher the confidence in the actual sensor reading. Confidence level of sensor measurements may be calculated using error models.
  • an Allan Variance Analysis may be used to characterize gyroscope and accelerometer errors, separating noise types like random walk and bias instability. Errors are modeled as functions of time, allowing confidence levels to be adjusted dynamically.
  • Confidence level of sensor measurements may be calculated using a likelihood estimation. Probabilistic models may be used to compute a likelihood of a measurement being correct, based on prior data and current sensor states. Confidence level of sensor measurements may be calculated using a signal-to-noise ratio (SNR). For example, higher SNR may indicate less noise and more reliable measurements.
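  • Two of these measures are sketched below, with the Gaussian residual mapping and the decibel SNR convention chosen here for illustration only:

```python
import math
import statistics

def residual_confidence(expected: float, measured: float, sigma: float) -> float:
    """Gaussian-style confidence: residuals that are small relative to
    the sensor's noise sigma yield confidence near 1."""
    z = (measured - expected) / sigma
    return math.exp(-0.5 * z * z)

def snr_db(samples: list[float]) -> float:
    """Signal-to-noise ratio of a (nonzero-mean) sensor channel, in dB;
    higher values indicate more reliable measurements."""
    mean = statistics.fmean(samples)
    noise = statistics.stdev(samples)
    return 20.0 * math.log10(abs(mean) / noise)

print(residual_confidence(expected=90.0, measured=90.4, sigma=0.5))  # ~0.73
```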
  • processor 330 may further, optionally, receive an input of a user preference.
  • the user preference may adjust a notification type of, for example, a positive alignment indication or a negative alignment indication.
  • processor 330 may execute optional performance analyzer 396.
  • the optional performance analyzer 396 may analyze a user’s angular offset results or history (i.e., performance history) over time.
  • the angular offset results or history may be stored either locally on the device(s) in memory 340, on a separate computing device, or on cloud-based infrastructure (i.e., a remote computing device).
  • the angular offset results or history may be used by the processor 330 to individualize an actual orientation and/or an angular offset of a user.
  • the performance history may be output to a display of the device or of a communicatively coupled computing device. For example, FIG. 7 illustrates a graphical user interface (GUI) 700 that shows a user's history based on the various clubs used, as shown in column 702.
  • the processor has calculated an accuracy (i.e., measurement of error), in column 704; a consistency (e.g., derived from all of the sample points to determine a range between the values, median, mean, and mode), in column 706; a shape, in column 708; and an angular offset, also called aim in this view, in column 710.
  • the shape in column 708 may be based on user reporting, as described elsewhere herein. For example, as shown in FIG. 10, a user may report a resultant shape using GUI 1000.
  • a representation 1010 may be displayed on the GUI 1000 that illustrates each type of possible result.
  • “Lacing” represents a nearly straight or linear shot. “Drawing” and “hooking” to the right or left (depending on side dominance) represents the projectile being offset to the right or left at varying degrees and/or varying shapes. “Fading” or “slicing” to the left or right (depending on side dominance) represents the projectile being offset to the left or right at varying degrees and/or varying shapes.
  • Projectile location representation 1020 may be displayed on GUI 1000 or another GUI for input to be received about whether the projectile was out of bounds 1022 (OB) to the left, OB 1030 to the right, in the rough 1024 to the left, in the rough 1028 to the right, laced 1026 substantially down the center, or shanked 1032 (e.g., shot was such an outlier that it should not count in the dataset).
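  • One hedged way to structure these user-reported outcomes (the enum members and field names below are assumptions mirroring the shapes and locations described above, not an interface defined in the disclosure):

```python
from dataclasses import dataclass
from enum import Enum, auto

class Shape(Enum):
    LACED = auto()   # nearly straight or linear shot
    DRAW = auto()
    HOOK = auto()
    FADE = auto()
    SLICE = auto()

class Location(Enum):
    OB_LEFT = auto()
    ROUGH_LEFT = auto()
    CENTER = auto()       # laced substantially down the center
    ROUGH_RIGHT = auto()
    OB_RIGHT = auto()
    SHANK = auto()        # outlier; excluded from the dataset

@dataclass
class ShotReport:
    club: str
    shape: Shape
    location: Location
    angular_offset_deg: float   # offset measured at launch

report = ShotReport("driver", Shape.FADE, Location.ROUGH_RIGHT, 92.3)
```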
  • device 310 includes a power management unit 360.
  • the power management unit 360 may provide power to one or more components of device 310.
  • the power management unit 360 may provide power to the processor 330, memory 340, calibration module 392, orientation determination module 394, signal processing unit 390, sensor module 320, optional performance analyzer 396, optional UI (user interface) generator 370, optional data transmission unit 380, and optional display 350.
  • the power management unit 360 may be a battery (replaceable or rechargeable), a solar power generator and associated storage module, a kinetic energy generator and associated storage module, and the like.
  • Device 310 may further include an optional UI generator 370 for generating a GUI for display on optional display 350 to the user, for example of any of the GUIs shown in FIGs. 6-10, described elsewhere herein.
  • Device 310 may further include an optional data transmission unit 380, for transmitting sensor data, orientation data, reference orientation data, actual orientation data, etc. to an external device.
  • the external device may be a communicatively or operatively coupled mobile computing device, a wearable device, Cloud infrastructure, server, and the like.
  • in some embodiments, two or more devices 310 may be used.
  • a first device may be worn on a first body portion and a second device may be worn on a second body portion.
  • the first body portion may be a left extremity or a left portion of the body, and the second body portion may be a right extremity or a right portion of the body.
  • processor 330 may execute a calibration process, using calibration module 392, including receiving a composite reference orientation 552 relative to a target destination 554, 556.
  • the composite reference orientation 552 may be based on, or determined by, a first sensor signal from a first device 558a and a second sensor signal from a second device 558b, coupled to a user 550.
  • the composite reference orientation may be an average of the signals received from devices 558a, 558b. The average may be weighted, normalized, a root mean square, or otherwise.
  • a data transmission unit 380 may be included to enable data exchange between two or more devices.
  • the processor 330 may execute an orientation determination process using an orientation determination module 394.
  • the orientation determination process may include monitoring a composite actual orientation 560 of the user 550 relative to the composite reference orientation 552, as shown in FIG. 5B.
  • the processor 330 may receive additional sensor signals, from the sensor modules of the two or more devices 558a, 558b, indicative of a composite actual orientation 560 of the user 550 in space, relative to the reference orientation 552.
  • the composite actual orientation 560 may be an average of the signals received from devices 558a, 558b. The average may be weighted, normalized, or otherwise.
  • weighting may be based on a manual input, for example, from a coach indicating that a user should adjust their actual orientation, place more weight on a right or left foot, square to the projectile, open up relative to the projectile, etc.
  • a composite actual orientation 560 may be weighted or adjusted based on whether a user is on a sloped ground plane, planar ground plane, or an uneven ground plane. For example, if a user is on a steep slope, the more balanced or solid footing foot or leg may be weighted more heavily than the bent or less stabilized foot or leg.
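  • A sketch of one way to form such a weighted composite (the circular, i.e., vector, mean below is one choice among the weighted, normalized, or root-mean-square options mentioned above; the 0.7/0.3 slope weighting is invented for illustration):

```python
import math

def composite_heading(headings_deg: list[float], weights: list[float]) -> float:
    """Weighted circular mean of per-device headings, which remains
    correct across the 359-to-0-degree wrap where a plain arithmetic
    average would not."""
    sx = sum(w * math.cos(math.radians(h)) for h, w in zip(headings_deg, weights))
    sy = sum(w * math.sin(math.radians(h)) for h, w in zip(headings_deg, weights))
    return math.degrees(math.atan2(sy, sx)) % 360.0

# On a steep slope, weight the more planted foot more heavily.
print(composite_heading([88.0, 94.0], weights=[0.7, 0.3]))  # -> ~89.8
```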
  • receiving additional sensor signals may include substantially continuously receiving the additional sensor signals until the angular offset 562 is within the predefined range.
  • the processor 330 may determine an angular offset 562 of the composite actual orientation 560 from the composite reference orientation 552. Said another way, the processor 330 may compare the composite reference orientation 552 to the composite actual orientation 560 to identify an angular offset 562 of the composite actual orientation 560 from the composite reference orientation 552.
  • the composite actual orientation 560 may be a vector or a line nonparallel to the reference orientation, indicating a current alignment of the user 550 relative to the composite reference orientation 552.
  • the angular relationship(s) between the composite actual orientation and the composite reference orientation may be independent of fixed points in space.
  • the orientation determination process may further include determining whether the angular offset 562 is within a predefined range.
  • the predefined range may be between about 75 degrees and about 105 degrees, between about 80 degrees and about 100 degrees, between about 85 degrees and about 95 degrees, about 90 degrees, etc.
  • the processor 330 may output a positive alignment indication 398 when the angular offset 562 is within a predefined range, as shown in FIG. 5B.
  • the processor 330 may output a negative alignment indication 398 when the angular offset 562 is outside of the predefined range, as shown in FIGs. 5C-5D.
  • the angular offset 562 may be outside the predefined range when the angular offset 562 is greater than about 90 degrees, greater than about 95 degrees, greater than about 100 degrees, greater than about 110 degrees, etc. as shown in FIG. 5D.
  • the angular offset 562 may be outside the predefined range when the angular offset 562 is less than about 90 degrees, less than about 85 degrees, less than about 80 degrees, less than about 75 degrees, etc. as shown in FIG. 5C.
  • the alignment indication 398 may include, but not be limited to, a visual indication, an auditory indication, or a haptic indication.
  • the alignment indication 398 may be configurable to be provided on an individual device, multiple devices (e.g., simultaneously, asynchronously, sequentially, etc.), or a separate computing device when present.
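The bullets above describe compositing per-device orientation signals, comparing the composite actual orientation to the composite reference orientation, and testing the resulting angular offset against a predefined range. The following minimal Python sketch illustrates one way such a computation could be performed; the circular averaging, the equal default weights, and the 75-105 degree window are illustrative assumptions drawn from this description, not the disclosed implementation.

```python
import math

def circular_mean_deg(angles_deg, weights=None):
    """Weighted circular mean of headings in degrees (robust near 0/360)."""
    weights = weights or [1.0] * len(angles_deg)
    x = sum(w * math.cos(math.radians(a)) for a, w in zip(angles_deg, weights))
    y = sum(w * math.sin(math.radians(a)) for a, w in zip(angles_deg, weights))
    return math.degrees(math.atan2(y, x)) % 360.0

def angular_offset_deg(actual_deg, reference_deg):
    """Signed smallest angle from reference to actual, in (-180, 180]."""
    return (actual_deg - reference_deg + 180.0) % 360.0 - 180.0

def alignment_indication(offset_deg, low=75.0, high=105.0):
    """'positive' when the offset magnitude falls within the predefined range."""
    return "positive" if low <= abs(offset_deg) <= high else "negative"

# Hypothetical yaw readings from two foot-worn devices (e.g., 558a, 558b).
composite_ref = circular_mean_deg([358.0, 2.0])                  # target line, ~0 deg
composite_actual = circular_mean_deg([85.0, 95.0], [0.5, 0.5])   # stance, ~90 deg
offset = angular_offset_deg(composite_actual, composite_ref)
print(alignment_indication(offset))  # "positive": ~90 deg lies within 75-105 deg
```

A weighted call such as `circular_mean_deg([85.0, 95.0], [0.7, 0.3])` would correspond to weighting the more stable foot more heavily on a slope, as described above.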
  • Example 1 A physical orientation system for guiding a user for aiming a sports projectile at a destination, the system comprising: a sensor module configured to detect a physical orientation of at least one body portion of a user; a processor operatively coupled to the sensor module; and a memory operatively coupled to the processor, the memory configured to store instructions that, when executed by the processor, cause the processor to: receive a sensor signal indicative of a reference orientation of the user relative to a target destination; receive one or more additional sensor signals indicative of an actual orientation of the user in space, relative to the reference orientation; compare the actual orientation to the reference orientation to identify an angular offset of the actual orientation relative to the reference orientation; and when the angular offset is within a predefined range, output a positive alignment indication, and when the angular offset is outside of the predefined range, output a negative alignment indication.
  • Example 2 The physical orientation system of any one of the preceding examples, but particularly Example 1, further comprising receiving an input to set the reference orientation.
  • Example 3 The physical orientation system of any one of the preceding examples, but particularly Example 1, wherein the target destination is a landing location for a projectile launched by the user.
  • Example 4 The physical orientation system of any one of the preceding examples, but particularly Example 1, wherein the sensor module is wearable on the at least one body portion.
  • Example 5 The physical orientation system of any one of the preceding examples, but particularly Example 4, wherein the at least one body portion of the user is a lower extremity of the user.
  • Example 6 The physical orientation system of any one of the preceding examples, but particularly Example 4, wherein the at least one body portion of the user is a foot of the user.
  • Example 7 The physical orientation system of any one of the preceding examples, but particularly Example 4, wherein the at least one body portion of the user is a leg of the user.
  • Example 8 The physical orientation system of any one of the preceding examples, but particularly Example 4, wherein the at least one body portion of the user is a torso of the user.
  • Example 9 The physical orientation system of any one of the preceding examples, but particularly Example 1, wherein the receiving the one or more additional sensor signals comprises substantially continuously receiving the one or more additional sensor signals until the angular offset is within the predefined range.
  • Example 10 The physical orientation system of Example 1, wherein the receiving the one or more additional sensor signals comprises receiving a composite sensor signal comprising a first sensor signal and a second sensor signal from the sensor module and a second sensor module, respectively.
  • Example 11 The physical orientation system of any one of the preceding examples, but particularly Example 1, further comprising a first device comprising the sensor module and a second device comprising a second sensor module, wherein one or both of the first device or the second device are configured for radio frequency ranging, ultrasonic ranging, infrared ranging, or global position system (GPS) positioning.
  • Example 12 The physical orientation system of any one of the preceding examples, but particularly Example 1, wherein one or both of the first device or the second device comprise an inertial measurement unit (IMU).
  • Example 13 The physical orientation system of any one of the preceding examples, but particularly Example 10, wherein the sensor module is worn on a first body portion and the second sensor module is worn on a second body portion.
  • Example 14 The physical orientation system of any one of the preceding examples, but particularly Example 13, wherein the first body portion is a left extremity, and the second body portion is a right extremity.
  • Example 15 The physical orientation system of any one of the preceding examples, but particularly Example 1, wherein the reference orientation comprises a line aligned with the target destination.
  • Example 16 The physical orientation system of any one of the preceding examples, but particularly Example 15, wherein the actual orientation comprises a line nonparallel relative to the reference orientation, indicating a current alignment of the user relative to the reference orientation.
  • Example 17 The physical orientation system of any one of the preceding examples, but particularly Example 1, further comprising receiving an input of a user preference, wherein the user preference is configured to adjust a notification type of one or both of: the positive alignment indication or the negative alignment indication.
  • Example 18 The physical orientation system of any one of the preceding examples, but particularly Example 1, wherein the sensor module comprises at least one of: an accelerometer, a gyroscope, or a magnetometer.
  • Example 19 The physical orientation system of any one of the preceding examples, but particularly Example 1, wherein the positive alignment indication is one of: visual feedback, auditory feedback, haptic feedback, or a combination thereof.
  • Example 20 The physical orientation system of any one of the preceding examples, but particularly Example 1, wherein the negative alignment indication is one of: visual feedback, auditory feedback, haptic feedback, or a combination thereof.
  • Example 21 The physical orientation system of any one of the preceding examples, but particularly Example 1, wherein the sensor module is coupled to the at least one body portion of the user, and the processor and the memory are in a computing device operatively coupled to the sensor module.
  • Example 22 The physical orientation system of any one of the preceding examples, but particularly Example 1, wherein the sensor module, the processor, and the memory are in a wearable device coupled to the at least one body portion of the user.
  • Example 23 The physical orientation system of any one of the preceding examples, but particularly Example 1, further comprising determining the reference orientation by determining an angle of rotation of the at least one body portion relative to a gravitational ground plane about a yaw axis.
  • Example 24 The physical orientation system of any one of the preceding examples, but particularly Example 1, further comprising determining the actual orientation by determining an angle of rotation of the at least one body portion relative to a gravitational ground plane about a yaw axis.
  • Example 25 The physical orientation system of any one of the preceding examples, but particularly Example 1, further comprising determining the reference orientation by determining and compositing a rotation of the at least one body portion about a yaw axis and one or both of: a pitch axis or a roll axis, relative to a gravity vector.
  • Example 26 The physical orientation system of any one of the preceding examples, but particularly Example 1, further comprising determining the actual orientation by determining and compositing a rotation of the at least one body portion about a yaw axis and one or both of: a pitch axis or a roll axis, relative to a gravity vector.
  • Example 27 The physical orientation system of any one of the preceding examples, but particularly Example 1, further comprising individualizing the angular offset for the user.
  • Example 28 The physical orientation system of any one of the preceding examples, but particularly Example 27, wherein the individualization comprises receiving a statistical model of the user; and updating the angular offset, and thereby the actual orientation, based on the statistical model.
  • Example 29 A computer-implemented method for guiding physical orientation of a user for aiming a sports projectile at a destination, comprising: receiving a sensor signal indicative of a reference orientation of the user relative to a target destination; receiving one or more additional sensor signals indicative of an actual orientation of the user in space, relative to the reference orientation; comparing the actual orientation to the reference orientation to identify an angular offset of the actual orientation relative to the reference orientation; and when the angular offset is within a predefined range, outputting a positive alignment indication, and when the angular offset is outside of the predefined range, outputting a negative alignment indication.
  • Example 30 The computer-implemented method of any one of the preceding examples, but particularly Example 29, further comprising receiving an input to set the reference orientation.
  • Example 31 The computer-implemented method of any one of the preceding examples, but particularly Example 29, wherein the target destination is a landing location for a projectile launched by the user.
  • Example 32 The computer-implemented method of any one of the preceding examples, but particularly Example 29, wherein the sensor module is wearable on the at least one body portion.
  • Example 33 The computer-implemented method of any one of the preceding examples, but particularly Example 32, wherein the at least one body portion of the user is a lower extremity of the user.
  • Example 34 The computer-implemented method of any one of the preceding examples, but particularly Example 32, wherein the at least one body portion of the user is a foot of the user.
  • Example 35 The computer-implemented method of any one of the preceding examples, but particularly Example 32, wherein the at least one body portion of the user is a leg of the user.
  • Example 36 The computer-implemented method of any one of the preceding examples, but particularly Example 32, wherein the at least one body portion of the user is a torso of the user.
  • Example 37 The computer-implemented method of any one of the preceding examples, but particularly Example 29, wherein the receiving the one or more additional sensor signals comprises substantially continuously receiving the one or more additional sensor signals until the angular offset is within the predefined range.
  • Example 38 The computer-implemented method of any one of the preceding examples, but particularly Example 29, wherein the receiving the one or more additional sensor signals comprises receiving a composite sensor signal comprising a first sensor signal and a second sensor signal from the sensor module and a second sensor module, respectively.
  • Example 39 The computer-implemented method of any one of the preceding examples, but particularly Example 38, further comprising a first device comprising the sensor module and a second device comprising a second sensor module, wherein one or both of the first device or the second device are configured for radio frequency ranging, ultrasonic ranging, infrared ranging, or global position system (GPS) positioning.
  • Example 40 The computer-implemented method of any one of the preceding examples, but particularly Example 38, wherein one or both of the first device or the second device comprise an inertial measurement unit (IMU).
  • Example 41 The computer-implemented method of any one of the preceding examples, but particularly Example 38, wherein the sensor module is worn on a first body portion and the second sensor module is worn on a second body portion.
  • Example 42 The computer-implemented method of any one of the preceding examples, but particularly Example 41, wherein the first body portion is a left extremity, and the second body portion is a right extremity.
  • Example 43 The computer-implemented method of any one of the preceding examples, but particularly Example 29, wherein the reference orientation comprises a line aligned with the target destination.
  • Example 44 The computer-implemented method of any one of the preceding examples, but particularly Example 43, wherein the actual orientation comprises a line nonparallel relative to the reference orientation, indicating a current alignment of the user relative to the reference orientation.
  • Example 45 The computer-implemented method of any one of the preceding examples, but particularly Example 29, further comprising receiving an input of a user preference, wherein the user preference is configured to adjust a notification type of one or both of: the positive alignment indication or the negative alignment indication.
  • Example 46 The computer-implemented method of any one of the preceding examples, but particularly Example 29, wherein the sensor module comprises at least one of: an accelerometer, a gyroscope, or a magnetometer.
  • Example 47 The computer-implemented method of any one of the preceding examples, but particularly Example 29, wherein the positive alignment indication is one of: visual feedback, auditory feedback, haptic feedback, or a combination thereof.
  • Example 48 The computer-implemented method of any one of the preceding examples, but particularly Example 29, wherein the negative alignment indication is one of: visual feedback, auditory feedback, haptic feedback, or a combination thereof.
  • Example 49 The computer-implemented method of any one of the preceding examples, but particularly Example 29, wherein the sensor module is coupled to the at least one body portion of the user, and the processor and the memory are in a computing device operatively coupled to the sensor module.
  • Example 50 The computer-implemented method of any one of the preceding examples, but particularly Example 29, wherein the sensor module, the processor, and the memory are in a wearable device coupled to the at least one body portion of the user.
  • Example 51 The computer-implemented method of any one of the preceding examples, but particularly Example 29, further comprising determining the reference orientation by determining an angle of rotation of the at least one body portion relative to a gravitational ground plane about a yaw axis.
  • Example 52 The computer-implemented method of any one of the preceding examples, but particularly Example 29, further comprising determining the actual orientation by determining an angle of rotation of the at least one body portion relative to a gravitational ground plane about a yaw axis.
  • Example 53 The computer-implemented method of any one of the preceding examples, but particularly Example 29, further comprising determining the reference orientation by determining and compositing a rotation of the at least one body portion about a yaw axis and one or both of: a pitch axis or a roll axis, relative to a gravity vector.
  • Example 54 The computer-implemented method of any one of the preceding examples, but particularly Example 29, further comprising determining the actual orientation by determining and compositing a rotation of the at least one body portion about a yaw axis and one or both of: a pitch axis or a roll axis, relative to a gravity vector.
  • Example 55 The computer-implemented method of any one of the preceding examples, but particularly Example 29, further comprising individualizing the angular offset for the user.
  • Example 56 The computer-implemented method of any one of the preceding examples, but particularly Example 55, wherein the individualization comprises receiving a statistical model of the user; and updating the angular offset, and thereby the actual orientation, based on the statistical model.
  • Example 57 A non-transitory computer readable medium configured to store computer readable instructions that, when read by a processor, cause the processor to execute operations for guiding physical orientation of a user for aiming a sports projectile at a destination, the operations comprising: receiving a sensor signal indicative of a reference orientation of the user relative to a target destination; receiving one or more additional sensor signals indicative of an actual orientation of the user in space, relative to the reference orientation; comparing the actual orientation to the reference orientation to identify an angular offset of the actual orientation relative to the reference orientation; and when the angular offset is within a predefined range, outputting a positive alignment indication, and when the angular offset is outside of the predefined range, outputting a negative alignment indication.
  • Example 58 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, further comprising receiving an input to set the reference orientation.
  • Example 59 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, wherein the target destination is a landing location for a projectile launched by the user.
  • Example 60 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, wherein the sensor module is wearable on the at least one body portion.
  • Example 61 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 60, wherein the at least one body portion of the user is a lower extremity of the user.
  • Example 62 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 60, wherein the at least one body portion of the user is a foot of the user.
  • Example 63 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 60, wherein the at least one body portion of the user is a leg of the user.
  • Example 64 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 60, wherein the at least one body portion of the user is a torso of the user.
  • Example 65 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, wherein the receiving the one or more additional sensor signals comprises substantially continuously receiving the one or more additional sensor signals until the angular offset is within the predefined range.
  • Example 66 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, wherein the receiving the one or more additional sensor signals comprises receiving a composite sensor signal comprising a first sensor signal and a second sensor signal from the sensor module and a second sensor module, respectively.
  • Example 67 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 66, further comprising a first device comprising the sensor module and a second device comprising a second sensor module, wherein one or both of the first device or the second device are configured for radio frequency ranging, ultrasonic ranging, infrared ranging, or global position system (GPS) positioning.
  • Example 68 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 66, wherein one or both of the first device or the second device comprise an inertial measurement unit (IMU).
  • Example 69 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 66, wherein the sensor module is worn on a first body portion and the second sensor module is worn on a second body portion.
  • Example 70 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 69, wherein the first body portion is a left extremity, and the second body portion is a right extremity.
  • Example 71 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, wherein the reference orientation comprises a line aligned with the target destination.
  • Example 72 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 71, wherein the actual orientation comprises a line nonparallel relative to the reference orientation, indicating a current alignment of the user relative to the reference orientation.
  • Example 73 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, further comprising receiving an input of a user preference, wherein the user preference is configured to adjust a notification type of one or both of the positive alignment indication or the negative alignment indication.
  • Example 74 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, wherein the sensor module comprises at least one of an accelerometer, a gyroscope, or a magnetometer.
  • Example 75 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, wherein the positive alignment indication is one of: visual feedback, auditory feedback, haptic feedback, or a combination thereof.
  • Example 76 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, wherein the negative alignment indication is one of: visual feedback, auditory feedback, haptic feedback, or a combination thereof.
  • Example 77 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, wherein the sensor module is coupled to the at least one body portion of the user, and the processor and the memory are in a computing device operatively coupled to the sensor module.
  • Example 78 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, wherein the sensor module, the processor, and the memory are in a wearable device coupled to the at least one body portion of the user.
  • Example 79 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, further comprising determining the reference orientation by determining an angle of rotation of the at least one body portion relative to a gravitational ground plane about a yaw axis.
  • Example 80 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, further comprising determining the actual orientation by determining an angle of rotation of the at least one body portion relative to a gravitational ground plane about a yaw axis.
  • Example 81 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, further comprising determining the reference orientation by determining and compositing a rotation of the at least one body portion about a yaw axis and one or both of: a pitch axis or a roll axis, relative to a gravity vector.
  • Example 82 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, further comprising determining the actual orientation by determining and compositing a rotation of the at least one body portion about a yaw axis and one or both of: a pitch axis or a roll axis, relative to a gravity vector.
  • Example 83 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, further comprising individualizing the angular offset for the user.
  • Example 84 The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 83, wherein the individualization comprises receiving a statistical model of the user; and updating the angular offset, and thereby the actual orientation, based on the statistical model.
  • the systems and methods of the preferred embodiment and variations thereof can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions.
  • the instructions are preferably executed by computer-executable components preferably integrated with the system and one or more portions of the processor on a physical orientation device and/or computing device.
  • the computer-readable instructions can be stored on any suitable computer-readable media such as RAMs, ROMs, flash memory, EEPROMs, optical devices (e.g., CD or DVD), hard drives, floppy drives, or any other suitable device.
  • the computer-executable component is preferably a general or application-specific processor, but any suitable dedicated hardware or hardware/firmware combination can alternatively or additionally execute the instructions.
  • references in the specification to “one embodiment,” “an embodiment,” “an illustrative embodiment,” “some embodiments,” etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may or may not necessarily include that particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
  • the singular form “a”, “an” and “the” include both singular and plural references unless the context clearly dictates otherwise.
  • the term “device” may include, and is contemplated to include, a plurality of devices, or the term “orientation” may include, and is contemplated to include, a plurality of orientations.
  • the claims and disclosure may include terms such as “a plurality,” “one or more,” or “at least one;” however, the absence of such terms is not intended to mean, and should not be interpreted to mean, that a plurality is not conceived.
  • the term “comprising” or “comprises” is intended to mean that the devices, systems, and methods include the recited elements, and may additionally include any other elements.
  • “Consisting essentially of” shall mean that the devices, systems, and methods include the recited elements and exclude other elements of essential significance to the combination for the stated purpose. Thus, a system or method consisting essentially of the elements as defined herein would not exclude other materials, features, or steps that do not materially affect the basic and novel characteristic(s) of the claimed disclosure.
  • “Consisting of” shall mean that the devices, systems, and methods include the recited elements and exclude anything more than a trivial or inconsequential element or step. Embodiments defined by each of these transitional terms are within the scope of this disclosure.

Abstract

A physical orientation guiding system may include a sensor module for detecting a physical orientation of at least one body portion of a user. A system may receive a sensor signal indicative of a reference orientation of the user relative to a target destination, receive one or more additional sensor signals indicative of an actual orientation of the user in space, relative to the reference orientation, compare the actual orientation to the reference orientation to identify an angular offset of the actual orientation relative to the reference orientation; and when the angular offset is within a predefined range, output a positive alignment indication, and when the angular offset is outside of the predefined range, output a negative alignment indication.

Description

GUIDING A PHYSICAL ORIENTATION OF A
USER DURING SPORTS PLAY
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the priority benefit of U.S. Provisional Patent Application Ser. No. 63/624,726, filed January 24, 2024, which is herein incorporated by reference in its entirety.
INCORPORATION BY REFERENCE
[0002] All publications and patent applications mentioned in this specification are herein incorporated by reference in their entirety, as if each individual publication or patent application was specifically and individually indicated to be incorporated by reference in its entirety.
TECHNICAL FIELD
[0003] This disclosure relates to systems for assisting physical alignment and orientation in sports that involve precision targeting of projectiles, such as, but not limited to, golf, baseball, basketball, football, or archery. Specifically, this disclosure pertains to real-time feedback systems using wearable sensors to guide users in achieving alignment with a predetermined target.
BACKGROUND
[0004] Aiming and executing a physical act towards an intended target is a fundamental component of many sports. This process, however, is more than just visual; it encompasses the whole body and is shaped both by best practices that can be coached and by an individual’s historical activities. Current market offerings that rely solely on visual aiming and alignment practice lack components for developing a holistic and effective skill set for launching a sports projectile at a desired destination.
SUMMARY
[0005] In some aspects, the techniques described herein relate to a physical orientation system for guiding a user for aiming a sports projectile at a destination, the system including: a sensor module configured to detect a physical orientation of at least one body portion of a user; a processor operatively coupled to the sensor module; and a memory operatively coupled to the processor, the memory configured to store instructions that, when executed by the processor, cause the processor to: receive a sensor signal indicative of a reference orientation of the user relative to a target destination; receive one or more additional sensor signals indicative of an actual orientation of the user in space, relative to the reference orientation; compare the actual orientation to the reference orientation to identify an angular offset of the actual orientation relative to the reference orientation; and when the angular offset is within a predefined range, output a positive alignment indication, and when the angular offset is outside of the predefined range, output a negative alignment indication.
[0006] In some aspects, the techniques described herein relate to a computer-implemented method for guiding physical orientation of a user for aiming a sports projectile at a destination, including: receiving a sensor signal indicative of a reference orientation of the user relative to a target destination; receiving one or more additional sensor signals indicative of an actual orientation of the user in space, relative to the reference orientation; comparing the actual orientation to the reference orientation to identify an angular offset of the actual orientation relative to the reference orientation; and when the angular offset is within a predefined range, outputting a positive alignment indication, and when the angular offset is outside of the predefined range, outputting a negative alignment indication.
[0007] In some aspects, the techniques described herein relate to a non-transitory computer readable medium configured to store computer readable instructions that, when read by a processor, cause the processor to execute operations for guiding physical orientation of a user for aiming a sports projectile at a destination, the operations including: receiving a sensor signal indicative of a reference orientation of the user relative to a target destination; receiving one or more additional sensor signals indicative of an actual orientation of the user in space, relative to the reference orientation; comparing the actual orientation to the reference orientation to identify an angular offset of the actual orientation relative to the reference orientation; and when the angular offset is within a predefined range, outputting a positive alignment indication, and when the angular offset is outside of the predefined range, outputting a negative alignment indication.
BRIEF DESCRIPTION OF THE DRAWINGS
[0008] The foregoing is a summary, and thus, necessarily limited in detail. The above-mentioned aspects, as well as other aspects, features, and advantages of the present technology are described below in connection with various embodiments, with reference made to the accompanying drawings.
[0009] FIG. 1 shows a schematic of an embodiment of a system for determining a physical orientation of a user during sports play.
[0010] FIG. 2 shows a schematic of an embodiment of a system for determining a physical orientation of a user during sports play.
[0011] FIG. 3A shows a schematic of an embodiment of a device of a system for determining a physical orientation of a user during sports play.
[0012] FIG. 3B shows a schematic of orientation axes relative to a body portion.
[0013] FIG. 4A shows a schematic of an embodiment of at least a portion of a method of using a device for physical orientation of a user.
[0014] FIG. 4B shows a schematic of an embodiment of at least a portion of a method of using a device for physical orientation of a user.
[0015] FIG. 4C shows a schematic of an embodiment of at least a portion of a method of using a device for physical orientation of a user.
[0016] FIG. 4D shows a schematic of an embodiment of at least a portion of a method of using a device for physical orientation of a user.
[0017] FIG. 5A shows a schematic of an embodiment of at least a portion of a method of using one or more devices for physical orientation of a user.
[0018] FIG. 5B shows a schematic of an embodiment of at least a portion of a method of using one or more devices for physical orientation of a user.
[0019] FIG. 5C shows a schematic of an embodiment of at least a portion of a method of using one or more devices for physical orientation of a user.
[0020] FIG. 5D shows a schematic of an embodiment of at least a portion of a method of using one or more devices for physical orientation of a user.
[0021] FIG. 6 shows an embodiment of a user interface for displaying a location of one or more sensor modules.
[0022] FIG. 7 shows an embodiment of a user interface for displaying various user statistics.
[0023] FIG. 8 shows an embodiment of a user interface for displaying a reference orientation of the user.
[0024] FIG. 9 shows an embodiment of a user interface for displaying an actual orientation of the user.
[0025] FIG. 10 shows an embodiment of a user interface for displaying or receiving an output of a projectile launch.
[0026] FIG. 11 shows a flow diagram of a physical orientation method for aiming a sports projectile at a destination.
[0027] The illustrated embodiments are merely examples and are not intended to limit the disclosure. The schematics are drawn to illustrate features and concepts and are not necessarily drawn to scale.
DETAILED DESCRIPTION
[0028] The foregoing is a summary, and thus, necessarily limited in detail. The above-mentioned aspects, as well as other aspects, features, and advantages of the present technology will now be described in connection with various embodiments. The inclusion of the following embodiments is not intended to limit the disclosure to these embodiments, but rather to enable any person skilled in the art to make and use the claimed subject matter. Other embodiments may be utilized, and modifications may be made without departing from the spirit or scope of the subject matter presented herein. Aspects of the disclosure, as described and illustrated herein, can be arranged, combined, modified, and designed in a variety of different formulations, all of which are explicitly contemplated and form part of this disclosure.
[0029] Orientation setup in athletic performance is fundamental to an athlete’s success, irrespective of the sport. Some factors influencing orientation include identification of the intended target and the athlete's body alignment relative to the intended target. These factors play a role in launching a sports projectile toward a destination. For instance, each athlete’s eyesight and dominant eye significantly alter their line of sight, making it a challenge to execute physical movements effectively at each aiming opportunity. A general, one-size-fits-all solution fails to address these nuances. The devices and methods described herein solve the above technical problems by using training methods that integrate individualized visual and physical elements to improve aim at a target, acknowledging the individuality of each athlete's physical and perceptual characteristics.
[0030] Physical literacy in skill development, which encompasses the development of motor skills and physical competence, is often useful for long-term adherence to physical activity and sport. Research highlights that skill development, particularly during childhood and adolescence, impacts sustained engagement in sports and physical activities throughout adulthood. Current assessment practices have limitations in addressing process-related skills, often overlooking the nuances of individual development. Traditional physical alignment tools, such as rods, markers, and reference lines placed on playing surfaces, have long served as standard positional aids across multiple sports, relying on the user’s subjective visual interpretation. The devices and methods described herein solve the above technical problems by providing tailored assessments and objective, recorded positional feedback that improves alignment of a user with a target, to better fit the needs of athletes and promote skill mastery at early stages.
[0031] Further, while visual training aids are helpful, they fail to replicate the tactile and immersive learning used for mastery. Visceral skill development provides athletes with the tools to internalize movement patterns and adapt to environmental variables, such as surface contour and body orientation, in real time. The devices and methods described herein solve the above technical problems by aiding users in internalizing movement patterns and related adaptations for improved alignment to a target.
[0032] Conventional solutions for alignment in sports often fall short in providing reliable and adaptable feedback. For example, waistband-based orientation systems use a single sensor placed in a waistband to measure rotational orientation. These systems provide angular offset feedback parallel to the target using haptic cues. However, this approach has significant limitations. For example, these waistband-based orientation systems suffer from axis variability, meaning that as the user bends or moves, the axis of the sensor changes, leading to inaccurate rotational readings. Further, conventional systems take a limited approach to the target: evaluating angular offset parallel to the target does not account for the holistic body alignment used in sports like golf. Static alignment tools, like alignment sticks and other manual aids, require setup, provide no real-time feedback, are prone to movement during use, and are dependent on a user’s visual perception. Conventional alignment analysis systems lack the ability to calculate a composite orientation, leading to less accurate guidance for users with complex body movements.
[0033] The devices and systems described herein solve the above technical problems with technical solutions. For example, the devices and systems described herein provide for improved aiming of sports projectile(s) at a target or a destination. In particular, the devices and methods described herein offer real-time physical orientation feedback using one or more sensor modules (e.g., an orientation sensor, inertial measurement unit, gyroscope, positional sensor, angular sensor, etc.). In some embodiments, the one or more sensor modules may be wearable on one or both feet of a user. For example, the devices and methods may generate an angular offset relative to a target or destination based on measurements from one or more sensor modules. For example, in golf, the devices and methods may generate a substantially perpendicular angular offset relative to a target or destination based on measurements from one or more orientation sensor modules on the one or both feet of a user. The angular offset may be individualized such that the offset is non-perpendicular. The angular offset may be individualized based on user preference and/or user history. In at least some implementations of the methods and devices described herein, using an angular offset based on one or more sensor modules coupled, for example, to one or both feet of a user, may improve alignment precision by reducing inaccuracies caused by body movements, such as bending (e.g., from a sensor placed on a torso of a user), that can affect rotational accuracy. However, the devices and methods described herein may be used on any one or more portions of a body, which may be dependent on an intended sport of the user. Further, in some embodiments, as described elsewhere herein, one or more additional sensors may be used for calibration or confidence level determinations. Such additional sensors may be in a similar location as the orientation determination sensor(s) (e.g., foot, leg, ankle, etc.) or at a separate location from the orientation determination sensor(s) (e.g., torso, leg, arm, hip, etc.).
[0034] The devices and methods described herein may also employ two or more, or a plurality of, sensors to calculate or determine a composite orientation. For example, a composite orientation may simplify user feedback by avoiding separate feedback for each of the two or more sensors, which in some instances may confuse a user. In some embodiments, users may receive real-time feedback in the form of visual, haptic, or auditory signals, which may guide the user toward achieving alignment with the target. In some embodiments, the system may store performance data (e.g., user or individualized performance data or performance data related to desired movement or alignment patterns) or receive performance data (e.g., using an application programming interface (API)), enabling trend analysis, customized training plans, and/or individualized alignment (e.g., an angular offset individualized for the user).
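Where performance data or a statistical model of the user is available, the individualized angular offset mentioned above could, for example, be maintained as a running correction learned from observed outcomes. The following sketch is a hypothetical illustration of that idea; the exponential moving average, the field names, and the 90 degree baseline are assumptions, not the disclosed statistical model.

```python
from dataclasses import dataclass

@dataclass
class UserAlignmentModel:
    baseline_offset_deg: float = 90.0  # perpendicular baseline, as in golf
    bias_deg: float = 0.0              # learned per-user aim bias
    alpha: float = 0.1                 # smoothing factor for new observations

    def record_shot(self, miss_angle_deg: float) -> None:
        """Fold an observed miss angle into the user's running bias."""
        self.bias_deg = (1 - self.alpha) * self.bias_deg + self.alpha * miss_angle_deg

    def individualized_offset(self) -> float:
        """Target offset to guide toward: baseline corrected by learned bias."""
        return self.baseline_offset_deg - self.bias_deg
```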
[0035] The devices and methods described herein solve the technical problems experienced with conventional systems, such as errors in rotational data caused by mobile or highly mobile body portions (e.g., torso) during a sport process, by utilizing a dynamic, portable, sensorized device or system coupled to, for example, one or both feet, one or both legs, or the like. In some embodiments, the devices and methods described herein may be positioned on one or both feet of a user, such that the impact of swinging, bending, or other torsional movements of the user during setup or game play may have a reduced effect on reference orientation determination, actual orientation determination, and/or angular offset determination. In some embodiments, the devices and methods described herein may determine a reference orientation, actual orientation, and/or angular offset of the body portion, for example one or both feet, relative to a ground plane using a single axis (i.e., about a yaw axis as shown in FIG. 3B). In some embodiments, the devices and methods described herein may additionally determine a reference orientation, actual orientation, and/or angular offset of the body portion about two or more axes, for example one or both of a pitch axis or tilt axis, as shown in FIG. 3B, to account for a slope or unevenness of a terrain on which the user is positioned (or standing). The devices and methods described herein may be used in sports like golf (e.g., target destination may be a location on a fairway, a flag, a cup, a location on a green, etc.), baseball (e.g., target destination may be a left field location, a plate, a right field location, a center field location, a glove, etc.), tennis (e.g., target destination may be a left baseline, a right baseline, a middle centerline, etc.), pickleball, etc., where proper body alignment, relative to a target, improves consistency and accuracy in performance.
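For the two-axis case described above, one standard way to obtain a yaw angle relative to the gravitational ground plane on sloped terrain is tilt compensation: recover roll and pitch from the gravity vector, then project a magnetometer reading onto the ground plane before computing yaw. The sketch below assumes a particular axis convention and raw accelerometer/magnetometer access; it illustrates the general technique rather than the disclosed algorithm.

```python
import math

def tilt_compensated_yaw(ax, ay, az, mx, my, mz):
    """Yaw (degrees) about the gravity axis, usable on sloped or uneven ground."""
    roll = math.atan2(ay, az)                    # rotation about the roll axis
    pitch = math.atan2(-ax, math.hypot(ay, az))  # rotation about the pitch axis
    # Project the magnetometer vector onto the gravitational ground plane.
    xh = (mx * math.cos(pitch)
          + my * math.sin(pitch) * math.sin(roll)
          + mz * math.sin(pitch) * math.cos(roll))
    yh = my * math.cos(roll) - mz * math.sin(roll)
    return math.degrees(math.atan2(-yh, xh)) % 360.0
```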
[0036] FIG. 1 shows a schematic of an embodiment of an example system 100 for determining a physical orientation of a user during sports play. A system 100 may include a physical orientation device 110, a sensor module 120, a processor 130, a memory 140, and optionally an input device 134. The sensor module 120 may be at least partially within the physical orientation device 110, or the sensor module 120 may be communicatively or electrically (e.g., Bluetooth, Wi-Fi, wireless, near-field, etc.) coupled to the physical orientation device 110. The physical orientation device 110 may be operatively or communicatively coupled (e.g., using a wired or wireless connection) to the processor 130 and the memory 140. In some embodiments, the processor 130 and the memory 140 are at least partially within the physical orientation device 110. Alternatively, the processor 130 and the memory 140 may be at least partially within a computing device communicatively coupled to the physical orientation device 110. The physical orientation device 110 may be removably coupled to a body portion or at least a portion of a body of a user.
[0037] The body portion, to which the physical orientation device 110 may be coupled, can be displayed on a graphical user interface (GUI) 600. For example, as shown in FIG. 6, GUI 600 shows two example locations (of a plurality of locations) for placement of a sensor module 610 on the shoes and/or a sensor module 620 on the socks. Optionally, a status of each device may also be displayed on the GUI 600. For example, a first sensor module 612 and a second sensor module 614 may be shown with a status, for example “ready”, “paired”, “unpaired”, “not ready”, “not detected”, “detected”, “connected”, “not connected”, etc. When a particular location, for example, socks, is not in use, the GUI may be grayed out or dimmed and the first sensor module 616 and the second sensor module 618 may not show a status or may show a status indicating that these sensor locations are not in use.
[0038] Although “shoes” and “socks” are shown in GUI 600 or FIG. 6, a plurality of locations for sensor attachment are contemplated herein. For example, a physical orientation device 110 may be removably coupled to, or integrated into, a sock, a footwear, an insole, a lace of a footwear, a tongue of a footwear, a knee brace, a portion of a legwear (e.g., shorts, pant leg, etc.), a belt loop, a portion of a belt, a pocket, a headwear (e.g., hat, visor, headband, helmet, etc.), a collar, a sleeve of a clothing item, a portion of a face worn item (e.g., hearing aid, glasses, sunglasses, etc.), a neckband or necklace, an ear worn device (e.g., an earpiece, earbud), a portion of a chest strap (like heart rate monitors), a portion of a shoulder harness, a portion of a bra, a portion of a clothing item (e.g., undershirt, compression shirt, etc.), a portion of a handwear (e.g., glove, ring, etc.), a portion of a wrist (e.g., watch-style, etc.), a forearm strap, a bicep band, a portion of an elbow support, a portion of an armband, a heel cup, a toe cap, an arch support, etc. In some embodiments, a physical orientation device 110 may be removably coupled to a foot of a user, for example, directly to the foot or indirectly using a shoe, bracelet, sock, or other item coupled to the foot of the user. In some embodiments, a first physical orientation device 110 may be removably coupled to a first foot of a user and a second physical orientation device 110 may be removably coupled to a second foot of a user. For example, either or both can be directly coupled to the foot or indirectly coupled to the foot using a shoe, bracelet, sock, or other item coupled to the foot of the user.
[0039] The physical orientation device 110 may be removably couplable to a portion of a back of a user (e.g., between shoulder blades, upper back, etc.), a portion of a sternum of a user, a foot, an ankle, a lower leg portion, a portion of a hip, a portion of a thigh, a portion of a torso, etc. The physical orientation device 110 may be integrated into an adhesive skin patch.
[0040] A position of a physical orientation device 110 on a user may be dependent on a use case (i.e., what needs to be measured or determined) of the physical orientation device 110. If the physical orientation device 110 is measuring an orientation of a body portion in space, the physical orientation device 110 may be coupled to a foot, a leg, a shoulder, an elbow, etc. of a user. If a first physical orientation device 110a is measuring a confidence level or calibrating a second physical orientation device 110b, the first physical orientation device 110a may be coupled to a same location or a different location as the second physical orientation device 110b (e.g., a foot, a leg, a shoulder, an elbow, a torso, a chest, etc.).
[0041] The physical orientation device 110 may include a magnetic attachment. The physical orientation device 110 may include a hook-and-loop mounting system. The physical orientation device 110 may include a clip-on mechanism. The physical orientation device 110 may include an attachment point or multiple attachment points. The physical orientation device 110 may be a wearable device, a virtual reality device, a wearable device operatively coupled to a computing device, a wearable device operatively coupled to a server, or a nonwearable device (e.g., camera or image sensor, rangefinder, etc.) manipulated by a user.
[0042] Although a physical orientation device 110 is shown, one of skill in the art will appreciate that any number of physical orientation devices 110 may be used. For example, a user may don a physical orientation device 110, two physical orientation devices 110, three physical orientation devices 110, four physical orientation devices 110, one to five physical orientation devices 110, five to ten physical orientation devices 110, etc. For example, in some embodiments, a first physical orientation device may be donned by a user and a second physical orientation device may be used by, held by, or otherwise manipulated by a user but not donned by the user.
[0043] The sensor module 120 may function to sense, detect, or otherwise measure a reference orientation of the user relative to a target destination. The sensor module 120 may function to sense, detect, or otherwise measure an actual orientation of the user relative to a target destination. The measurement may be taken on-demand (e.g., based on input at an optional input device 134 or a radio frequency identification (RFID)) or automatically, for example based on activation of an application stored on the physical orientation devices 110 or an application stored on a paired computing device, or based on a sensed location (e.g., using a global positioning sensor (GPS), using low-energy Bluetooth, etc.).
[0044] An optional input device 134, for example for receiving an input to activate one or more physical orientation devices or to otherwise determine a reference orientation independent of the one or more physical orientation devices, may be an electronic divot fixer, a rangefinder, an electronic ball marker, an electronic golf glove, a mobile computing device, a wearable device (e.g., watch), a movement detection device (e.g., detection of a foot tap, movement of a finger ring, detection of a club movement, tap of a smart grip, detection of a pattern of movement), a microphone of a computing device (e.g., receiving a verbal command), a club-tap device, a specific aim device used to determine a target (e.g., a sensor attached to or integrated into a rangefinder), an electronic marker on a course or field, a course map API that suggests an aim, etc. An optional input device 134 may also include, or be described as, a smart device that can perform more than its basic functions and/or is connected to the internet. In some embodiments, the optional input device 134 is a mobile computing device. In some embodiments, the optional input device 134 is an electronic divot fixer. In some embodiments, the optional input device 134 is a rangefinder.
[0045] The sensor module 120 may include, but not be limited to, one or more of, or a combination of, an inertial measurement unit (IMU), gyroscope, magnetometer, accelerometer, a radar, a laser, an ultra-wideband protocol, a potentiometer, a rotary encoder (e.g., optical, magnetic, etc.), a direct gear-based angle measurement, a ball-and-socket position tracking, a pendulum-based tilt measurement, a physical protractor with position sensor, an electronic inclinometer, a MEMS angular rate sensor, a micro-electro-mechanical system (MEMS), a multi-axis accelerometer, a camera (e.g., for motion tracking), an infrared (IR) sensor, etc. In some embodiments, the sensor module 120 includes an IMU. In some embodiments, the sensor module 120 includes an IMU with or without a gyroscope. In some embodiments, the sensor module 120 includes an IMU with or without a magnetometer. In some embodiments, the sensor module 120 includes a multi-axis accelerometer. In some embodiments, the sensor module 120 includes an encoder.
[0046] The processor 130 may cause activation or inactivation of the sensor module 120 to initiate or halt a sensing activity of the sensor module. The memory 140 may store instructions that are readable, and executable, by the processor 130. For example, as shown and described in connection with FIG. 3A, the processor 130 may execute a calibration process, using a calibration module 392; an orientation process, using orientation determination module 394; a performance assessment, using optional performance analyzer 396; and/or output an alignment indication 398.
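As a rough illustration of how the modules named above might interact, the following hypothetical control-flow sketch gates the sensor module and routes readings through calibration, orientation determination, and the alignment indication. All class and method names are stand-ins, not the disclosed interfaces.

```python
class OrientationController:
    """Hypothetical glue between a sensor module and modules 392/394/398."""

    def __init__(self, sensor, calibrator, orienter, indicator):
        self.sensor = sensor          # stand-in for sensor module 120/320
        self.calibrator = calibrator  # stand-in for calibration module 392
        self.orienter = orienter      # stand-in for orientation determination module 394
        self.indicator = indicator    # stand-in for alignment indication 398
        self.reference = None

    def calibrate(self):
        """Activate sensing and capture the reference orientation."""
        self.sensor.activate()
        self.reference = self.calibrator.capture(self.sensor.read())

    def step(self):
        """One monitoring cycle: compute the offset and emit an indication."""
        offset = self.orienter.offset(self.sensor.read(), self.reference)
        self.indicator.emit(positive=self.orienter.in_range(offset))
```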
[0047] In some embodiments of system 100, processor 130 and/or memory 140 are integrated into physical orientation device 110. In some embodiments of system 100, processor 130 and/or memory 140 are remote from the physical orientation device 110. For example, the processor 130 and/or memory 140 may be located in a communicatively coupled (e.g., wire or wireless connection) mobile computing device (e.g., mobile phone, laptop, etc.), a server, a workstation, a wearable device, etc.
[0048] FIG. 2 shows an example schematic of system 100 including a first physical orientation device 110a and a second physical orientation device 110b. Although two physical orientation devices are shown in FIG. 2, and one physical orientation device 110 is shown in FIG. 1, one of skill in the art will appreciate that a plurality of physical orientation devices may also be used. The first physical orientation device 110a may include a first sensor module 120a, and the second physical orientation device 110b may include a second sensor module 120b. The first physical orientation device 110a and the second physical orientation device 110b may be operatively or communicatively coupled to one or more processor(s) 130 and one or more memory(ies) 140. The one or more processors 130 may be an advanced reduced instruction set computer (RISC) machine (ARM) processor, a digital signal processor (DSP), a microcontroller, and the like. The one or more memory(ies) 140 may include a non-transitory computer readable medium that stores computer readable instructions for execution by the one or more processors 130. The computer readable instructions executable by the one or more processors 130 may include a method, for example as shown in FIG. 11 and described elsewhere herein.
[0049] Although a processor 130 and a memory 140 are shown, one of skill in the art will appreciate that the first physical orientation device 110a may include a processor 130 and a memory 140, and/or the second physical orientation device 110b may include a processor 130 and a memory 140. Alternatively, each physical orientation device 110 may include a processor 130 but share a memory 140. Alternatively, each physical orientation device 110 may include a memory 140 but share a processor 130. The memory 140 and/or processor 130 may be integrated into the physical orientation device or remote from the physical orientation device, as described elsewhere herein.
[0050] FIG. 3A shows an example schematic of an embodiment of a device 310 for determining a physical orientation of a user during sports play. Device 310 may function to measure or determine and indicate dynamic angular relationships between a user's actual orientation and a reference orientation. Device 310 may be operatively, communicatively, or otherwise electrically coupled to one or more processors 330 and one or more memories 340. Although processor 330 and memory 340 are shown within device 310 in FIG. 3A, one of skill in the art will appreciate that processor 330 and memory 340 may be separate from device 310 but in electrical communication with device 310. Further, as described elsewhere herein, device 310 includes one or more sensor modules 320. The signal processing unit 390 of device 310 may receive digital sensor data. The signal processing unit 390 may extract orientation data from the sensor data. The signal processing unit 390 may optionally filter out noise in the sensor data. The signal processing unit 390 may optionally enhance one or more features of the sensor data. The signal processing unit 390 may output a processed signal for processing by processor 330 based on instructions stored in memory 340. The processor 330 may read instructions from a computer-readable medium stored in memory 340, such that any of the processes described herein are computer-implemented processes or methods.
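By way of non-limiting illustration, the following Python sketch shows one simple way a signal processing unit such as signal processing unit 390 might filter noise from raw orientation samples. The exponential moving average, the function name, and the array layout are assumptions made for illustration only and are not features required by this disclosure.

    import numpy as np

    def low_pass_filter(samples: np.ndarray, alpha: float = 0.2) -> np.ndarray:
        """Exponential moving average to suppress high-frequency sensor noise.

        samples: (N, 3) array of raw roll/pitch/yaw readings in degrees.
        alpha:   smoothing factor in (0, 1]; smaller values filter more heavily.
        """
        filtered = np.empty_like(samples, dtype=float)
        filtered[0] = samples[0]
        for i in range(1, len(samples)):
            # Blend each new sample with the running estimate.
            filtered[i] = alpha * samples[i] + (1.0 - alpha) * filtered[i - 1]
        return filtered

A heavier filter (smaller alpha) trades responsiveness for stability, which may be desirable while a user is holding a stance rather than actively moving.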
[0051] As shown in FIGs. 3A and 11, the processor 330 may execute a calibration process using calibration module 392. For example, a calibration process may include, at block S1110 of FIG. 11, receiving a sensor signal, from the sensor module, indicative of a reference orientation 452 of a user 450 relative to a target destination 454. The target destination may be a physical destination. For example, the target destination may be a landing location (e.g., on a fairway, on a green, in an infield, in an outfield, etc.) for a projectile launched by the user. The target destination may be a retaining destination (e.g., a net, field goal posts, etc.). The target destination may be a relationship to one or more other players in a sport, for example the relationship between a pitcher on a pitching mound and a catcher's mitt behind home plate. The target destination may be a digital marker, for example a technology-integrated golf course flag or yardage marker. The target destination may be a GPS location. The target destination may be determined using a standalone rangefinder or a rangefinder integrated into a club, bat, racket, divot fixer, etc.
[0052] The reference orientation 452 may be indicative of at least a portion of a user 450 aligned with a target destination 454, as shown in FIG. 4A. Although the target destination 454 is a golf cup, as shown in FIG. 4A, a target destination may also be a landing location of a projectile (e.g., a ball), for example, on a court, in a field, on a fairway, etc., a retaining location, a physical location, a digital location, and the like. The target destination 454 may be a scoring or retaining destination (e.g., hoop, cup, hole, etc.) or it may be a longitudinal and/or latitudinal destination indicated by a user (e.g., based on sighting), received from an API (and displayed on a map), or otherwise. The reference orientation 452 may be a vector (i.e., magnitude is incorporated into the determination of the reference orientation) in some embodiments. The reference orientation 452 may be a line (i.e., magnitude is not incorporated into the determination of the reference orientation) in some embodiments. For example, a user may face, orient, or align themselves with a target destination 454 (e.g., a basket, a goal, a flag associated with a cup, a cup, a location, etc.) and an input device may be activated manually (e.g., by the user) or automatically. Alternatively, a user may face, orient, or align themselves with a target destination 454 and a reference orientation 452 may be automatically set when the sensor module 320 detects a cessation or reduction in movement of the user. In some embodiments, the device 310 may prompt the user, when it detects a reduction or cessation in movement, to set the reference orientation 452 (e.g., by receiving an input). In a non-limiting example, a user 450 may align themselves with a target destination 454, 456 including a flag associated with a cup, or align themselves with the cup, to generate a reference orientation 452. In some embodiments, as shown in FIG. 4A, where a physical orientation device (e.g., device 110) may be worn on a lower extremity 458, the lower extremity 458 may be aligned with the target destination 454. The sensor signal may be received based on a manual input or automatically, as described elsewhere herein.
[0053] The reference orientation may be determined by the calibration module 392 relative to a gravitational ground plane 318 about a yaw axis 312, as shown in FIG. 3B. For example, the calibration module 392 may determine or measure an angle of rotation of the body portion 358 about the yaw axis 312, using for example Euler angles, thereby indicating a heading direction of the body portion 358 in three-dimensional Cartesian space. The heading direction may be used to determine or indicate, or may be substantially the same as, a reference orientation line, in some embodiments. Although a foot is shown in FIG. 3B, a body portion 358 may also be other than a foot, for example a leg portion, torso, arm, shoulder, etc.
[0054] In some embodiments, the reference orientation may be determined by the calibration module 392 about a yaw axis 312 and one or both of a pitch axis 316 or a roll axis 314, relative to a gravity vector 311 (i.e., vertical axis), as shown in FIG. 3B. For example, the calibration module 392 may determine or measure an angle of rotation of the body portion 358 about the yaw axis 312, thereby indicating a heading direction of the body portion in three-dimensional Cartesian space. The calibration module 392 may further determine an angle of rotation of the body portion 358 about the pitch axis 316, indicating, for example, a slope of the ground plane 318 about the pitch axis 316. The calibration module 392 may further determine an angle of rotation of the body portion 358 about the roll axis 314, indicating, for example, a slope of the ground plane 318 about the roll axis 314. The rotation about the yaw axis 312 and about one or both of the pitch axis 316 or roll axis 314 may be used to determine or indicate a reference orientation line, in some embodiments. Said another way, the yaw angle, pitch angle, and/or roll angle can be used individually or in a composite of the localized x (roll), y (pitch), and z (yaw) axes to generate a reference orientation value, as shown in FIG. 3B. In some embodiments, an orientation of a device on a user may be irrelevant, such that Cartesian axes may be determined using the device regardless of the orientation of the device on the user.
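By way of non-limiting illustration, a standard conversion from a unit quaternion (as many IMUs report) to roll, pitch, and yaw angles might look like the following Python sketch. The Z-Y-X (aerospace) rotation convention and the function name are assumptions made for illustration; a particular IMU may use different axis conventions.

    import math

    def quaternion_to_euler(w: float, x: float, y: float, z: float):
        """Convert a unit quaternion to (roll, pitch, yaw) in degrees,
        using the common Z-Y-X (aerospace) rotation convention."""
        roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
        # Clamp so floating-point drift cannot leave asin's domain.
        sinp = max(-1.0, min(1.0, 2 * (w * y - z * x)))
        pitch = math.asin(sinp)
        yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
        return tuple(math.degrees(a) for a in (roll, pitch, yaw))

The yaw component would then serve as the heading direction described above, while the pitch and roll components can capture the slope of the ground plane 318.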
[0055] In some embodiments, a reference orientation may be determined by the calibration module 392 using GPS. For example, a physical orientation device or a computing device may receive a location of a target destination (e.g., using an API) and a location of the user, such that a connecting line between the two locations, serving as the reference orientation, may be determined.
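A minimal sketch of that computation, assuming the two locations are provided as latitude/longitude pairs, is the standard initial great-circle bearing; the function name is illustrative.

    import math

    def initial_bearing_deg(lat1: float, lon1: float,
                            lat2: float, lon2: float) -> float:
        """Initial bearing from the user (lat1, lon1) to the target
        (lat2, lon2), in degrees clockwise from true north."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dlon = math.radians(lon2 - lon1)
        x = math.sin(dlon) * math.cos(phi2)
        y = (math.cos(phi1) * math.sin(phi2)
             - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
        return (math.degrees(math.atan2(x, y)) + 360.0) % 360.0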
[0056] FIG. 8 shows an exemplary, non-limiting GUI 800 displayed when a calibration process is being executed by calibration module 392. For example, GUI 800 may illustrate one or more target destinations 854, 856 with respect to a background 840, for example a field, fairway, green, ballpark, etc. When multiple target destinations 854, 856 are shown, a first reference orientation 852 (and first actual orientation and first angular offset) may be generated with respect to a first target destination 854; and then a second reference orientation 850 (and second actual orientation and second angular offset) may be generated with respect to a second target destination 856. The first reference orientation 852 and/or the second reference orientation 850 may be a composite reference orientation, or a reference orientation based on system parameters, user input, the number of sensors used, etc., for example. Optionally, GUI 800 may illustrate a prior or historical projectile launch or consistency representation 848. The reference orientation 852 may be with respect to a user location, for example represented by a body portion, such as a first foot 844 and a second foot 846. Although two body portions are shown, it is contemplated that one, two, three, or a plurality of body portions may be shown.
[0057] As shown in FIGs. 3A and 11, the processor 330 may execute an orientation determination process using an orientation determination module 394. The orientation determination process may include monitoring an actual orientation 460 of the user 450 relative to the reference orientation 452, as shown in FIG. 4B. In other words, as shown in block S1120 of FIG. 11, the processor 330 may receive one or more additional sensor signals, from the sensor module 320, indicative of an actual orientation 460 of the user 450 in space, relative to the reference orientation 452.
[0058] The actual orientation may be determined by the orientation determination module 394 relative to a gravitational ground plane 318 about a yaw axis 312, as shown in FIG. 3B. For example, the orientation determination module 394 may determine or measure an angle of rotation of the body portion 358 about the yaw axis 312, using for example Euler angles, thereby indicating a heading direction of the body portion 358 in three-dimensional Cartesian space. The heading direction may be used to determine or indicate, or may be substantially the same as, an actual orientation line, in some embodiments. Although a foot is shown in FIG. 3B, a body portion 358 may also be other than a foot, for example a leg portion, torso, arm, shoulder, etc.
[0059] In some embodiments, the actual orientation may be determined by the orientation determination module 394 about a yaw axis 312 and one or both of a pitch axis 316 or a roll axis 314, relative to a gravity vector 311 (i.e., vertical axis), as shown in FIG. 3B. For example, the orientation determination module 394 may determine or measure an angle of rotation of the body portion 358 about the yaw axis 312, thereby indicating a heading direction of the body portion in three-dimensional Cartesian space. The orientation determination module 394 may further determine an angle of rotation of the body portion 358 about the pitch axis 316, indicating, for example, a slope of the ground plane 318 about the pitch axis 316. The orientation determination module 394 may further determine an angle of rotation of the body portion 358 about the roll axis 314, indicating, for example, a slope of the ground plane 318 about the roll axis 314, a user lifting one or more toes off the ground plane, etc. The rotation about the yaw axis 312 and about one or both of the pitch axis 316 or roll axis 314 may be used to determine or indicate an actual orientation line, in some embodiments. Said another way, the yaw angle, pitch angle, and/or roll angle can be used individually or in a composite of the localized x (roll), y (pitch), and z (yaw) axes to generate an actual orientation value, as shown in FIG. 3B. In some embodiments, an orientation of a device on a user may be irrelevant, such that Cartesian axes may be determined using the device regardless of the orientation of the device on the user. In some multi-axis determinations of the actual orientation, Euler angles may be used. In some multi-axis determinations of the actual orientation, quaternions may be used, for example when avoidance of gimbal lock (where two axes become parallel to one another) may be advantageous.

[0060] The processor 330 may determine an angular offset 462 of the actual orientation 460 relative to the reference orientation 452. Said another way, as shown in block S1130 of FIG. 11, the processor 330 may compare the actual orientation 460 to the reference orientation 452 to identify an angular offset 462 of the actual orientation 460 from the reference orientation 452. In some embodiments, when considering single-axis versus multi-axis orientation determination, an angular offset may be decreased when it is determined that a user is on a sloped ground plane (about the roll axis), although the opposite may also be true (the angular offset may be increased). In some embodiments, an angular offset may be increased when it is determined that a user is on a sloped ground plane (about the pitch axis), although the opposite may also be true (the angular offset may be decreased).
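As an illustrative sketch of the comparison described in block S1130, the angular offset between two headings can be computed with wrap-around handling so that, for example, headings of 359 degrees and 1 degree differ by 2 degrees rather than 358 degrees. The function name is an assumption made for illustration.

    def angular_offset_deg(actual_heading: float, reference_heading: float) -> float:
        """Signed smallest-angle difference between two headings, in degrees.

        The result lies in (-180, 180]; a positive value means the actual
        orientation is rotated clockwise from the reference orientation.
        """
        diff = (actual_heading - reference_heading) % 360.0
        if diff > 180.0:
            diff -= 360.0
        return diff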
[0061] The actual orientation 460 may be a line (i.e., magnitude is not incorporated into the determination of the actual orientation) nonparallel to the reference orientation, indicating a current alignment of the user relative to the reference orientation 452. The actual orientation 460 may be a vector, such that a magnitude may be incorporated into the determination of the actual orientation. The angular relationship(s) between the actual orientation and the reference orientation may be independent of fixed points in space.
[0062] The orientation determination process may further include, as shown in block S1140 of FIG. 11, determining whether the angular offset 462 is within a predefined range. In some embodiments, the method may include substantially continuously receiving the one or more additional sensor signals until the angular offset is within the predefined range. The predefined range may be based on user preferences, a club selection of the user, a statistical model of the user, or the like. For example, as shown in FIG. 4B, the predefined range may be from about 75 degrees to about 105 degrees, about 80 degrees to about 100 degrees, about 85 degrees to about 95 degrees, about 90 degrees, etc. The angular offset 462 may be individualized for the user. Said another way, although an angular offset of 90 degrees, for example, may be an idealized angular offset, in reality an individualized angular offset may deviate from 90 degrees. For example, an individualized angular offset for a first user may be about 70 degrees to about 75 degrees, while an individualized angular offset for a second user may be about 85 degrees to about 90 degrees. To individualize the angular offset 462, the calibration module 392 and/or the orientation determination module 394 may receive (e.g., through an API, or generated based on user-reported outcomes, as shown in FIG. 10) a statistical model of the user. The statistical model of the user may be a Gaussian distribution, a repeated measures (RM) map, a statistically weighted map or model, or a scatter plot, for example, that maps outcomes (e.g., did the projectile hit the target, did the projectile fade to the left or the right of the target, did the projectile draw to the right or the left of the target, did the projectile hook to the right or the left of the target, did the projectile slice to the left or the right of the target, or did the user miss contact with the projectile, also called a shank, etc., as shown in FIG. 10) versus an actual orientation 460 or an angular offset 462 of the user. Optionally, the statistical model may further map outcomes based on striking tool selection and the angular offset for the selected striking tool. The statistical model may be generated in real-time, over time, during normal play by a user, etc. The statistical model may be generated using a simulator, for example, that the user interacts with to determine the user's individualized outcomes based on their striking tool selection (e.g., type of club, bat, racket, etc.) and/or relative to their actual orientation and/or their angular offset. The statistical model may be machine-generated based on idealized outcomes or based on another individual, for example a pro golfer, a coach, an athlete, etc.
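For illustration only, the range check of block S1140 might reduce to a comparison such as the following, where the bounds are either defaults or individualized values drawn from the user's statistical model; the function name and default bounds below are assumptions.

    def alignment_indication(angular_offset: float,
                             low: float = 85.0,
                             high: float = 95.0) -> str:
        """Positive indication when the offset lies inside the predefined
        range [low, high] (degrees), negative otherwise."""
        return "positive" if low <= angular_offset <= high else "negative"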
[0063] In some embodiments, the calibration module 392 and/or the orientation determination module 394 may receive a statistical model of the user and determine an angular offset 462, and thereby an actual orientation 460, individualized for the user, based on the received statistical model. For example, the statistical model may be represented by a look-up table, such that a striking tool type (e.g., club type) is related to an angular offset or an actual orientation within the look-up table. In some embodiments, an actual angular offset of a user may differ from the individualized angular offset that is based on the statistical model (and represented in the look-up table). Accordingly, the user may be guided (e.g., based on visual cues, haptic cues, auditory cues, or a combination thereof, in an application, using the physical orientation device, or the like) to adjust their actual orientation, and thus their angular offset, so that the actual angular offset approximates or substantially matches the individualized angular offset of the statistical model. Said another way, a selected striking tool type (e.g., selected by the user, input into an interface of an application and/or the physical orientation device, recommended by the application, etc.) and/or an actual angular offset of a user may be mapped to a range or value of individualized angular offsets identified by the statistical model (related to the selected striking tool type), such that the actual angular offset can be replaced by, overridden by, or adjusted to be the individualized angular offset, by guiding the user towards an adjusted actual orientation so that an adjusted actual angular offset approximates or substantially matches the individualized angular offset of the statistical model.
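One way to picture the look-up table described above is sketched below. The club names, offset ranges, and helper function are hypothetical values chosen purely for illustration and are not data from this disclosure.

    # Hypothetical look-up table: striking tool type -> individualized
    # angular-offset range (degrees) derived from the user's statistical model.
    INDIVIDUALIZED_OFFSETS = {
        "driver": (88.0, 92.0),
        "7-iron": (85.0, 90.0),
        "putter": (89.0, 91.0),
    }

    def adjustment_deg(striking_tool: str, actual_offset: float) -> float:
        """Degrees the user should rotate so that the actual angular offset
        approaches the midpoint of the individualized range."""
        low, high = INDIVIDUALIZED_OFFSETS[striking_tool]
        return (low + high) / 2.0 - actual_offset

The sign of the returned value could then drive the visual, haptic, or auditory guidance cues described above.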
[0064] When more than one sensor module (and optionally more than one physical orientation device) is used, each actual angular offset (i.e., from each device) or a composite angular offset may be mapped or normalized to the statistical model, and the user guided to an adjusted actual orientation so that an adjusted actual angular offset approximates or substantially matches the individualized angular offset of the statistical model.
[0065] In some embodiments, there may be more than one value, or a plurality of values, of angular offset per striking tool, such that an actual angular offset is mapped to one of those values, or to an average or median of all of the values or of a subset of the values.
[0066] In some embodiments, a striking tool selection may be based on data from a rangefinder communicatively coupled to, or integrated with, a sensor module. For example, the rangefinder may emit a laser pulse towards a target destination, measure the time it takes for the reflected pulse to return, and determine a distance based on the speed of light and the measured time interval (the time for the laser beam to travel to the target and back), thereby enabling or enhancing striking tool selection by the user. In some embodiments, for example when the projectile is on a slope, sensor data from the sensor module (integrated into the rangefinder or worn by the user, for example) may also be used in striking tool selection, because the slope may impact the distance to the target and/or the type of striking tool that may be used.
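The time-of-flight arithmetic described above is straightforward; a sketch with illustrative names follows.

    SPEED_OF_LIGHT_M_S = 299_792_458.0  # speed of light in a vacuum

    def rangefinder_distance_m(round_trip_seconds: float) -> float:
        """One-way distance to the target, computed from a laser pulse's
        measured round-trip time: the pulse covers the distance twice."""
        return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2.0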
[0067] FIG. 9 shows an exemplary, non-limiting GUI 900 displayed when an orientation determination process is being executed by orientation determination module 394. For example, GUI 900 may illustrate a target destination 954 and an actual orientation 960 with respect to the target destination 954. GUI 900 may display a composite angular offset 920 and/or actual angular offsets 922, 918 of one or more body portions. As shown in GUI 900 of FIG. 9, a composite angular offset 920 of zero may represent an actual orientation 960 of about 90 degrees relative to a reference orientation. For example, an angular offset 922 of a first body portion 944 may represent an actual orientation 960 of about 92 degrees relative to a reference orientation. Further, for example, an angular offset 918 of a second body portion 946 may represent an actual orientation 960 of about 88 degrees relative to a reference orientation. The actual orientation 960 may be with respect to a user location, for example represented by a body portion 944 including a first foot and/or a body portion 946 including a second foot. Although two body portions are shown, it is contemplated that one, two, three, or a plurality of body portions may be shown.
[0068] As shown in FIG. 3A and block S1140 of FIG. 11, the processor 330 may output a positive alignment indication 398 when the angular offset 462 is within a predefined range, as shown in FIG. 4B. The processor 330 may output a negative alignment indication 398 when the angular offset 462 is outside of the predefined range, as shown in FIGs. 4C-4D. The predefined range may be individualized for the user, as described above. For example, the angular offset 462 may be outside the predefined range when the angular offset 462 is greater than about 90 degrees, greater than about 95 degrees, greater than about 100 degrees, greater than about 110 degrees, etc., as shown in FIG. 4D. For example, the angular offset 462 may be outside the predefined range when the angular offset 462 is about 80 degrees to about 90 degrees, about 85 degrees to about 95 degrees, about 90 degrees to about 100 degrees, about 90 degrees to about 95 degrees, about 95 degrees to about 100 degrees, about 100 degrees to about 105 degrees, about 105 degrees to about 110 degrees, etc., as shown in FIG. 4D. For example, the angular offset 462 may be outside the predefined range when the angular offset 462 is less than about 90 degrees, less than about 85 degrees, less than about 80 degrees, less than about 75 degrees, etc., as shown in FIG. 4C. For example, the angular offset 462 may be outside the predefined range when the angular offset 462 is about 80 degrees to about 90 degrees, about 80 degrees to about 85 degrees, about 75 degrees to about 80 degrees, about 70 degrees to about 75 degrees, about 65 degrees to about 70 degrees, etc., as shown in FIG. 4C. The alignment indication 398 may include, but not be limited to, a visual indication, an auditory indication, a haptic indication, or a combination thereof. In some embodiments, a visual indication is a numerical indicator on a GUI. In some embodiments, a visual indication is a light on the physical orientation device. In some embodiments, a haptic indication is a vibration mechanism (e.g., a piezoelectric device, etc.) on or in the physical orientation device. In some embodiments, a haptic indication is a vibration mechanism (e.g., a piezoelectric device, etc.) in another wearable device (e.g., smartwatch, phone, etc.) communicatively coupled to the physical orientation device. The alignment indication 398 may be configurable to be provided on an individual device, multiple devices (e.g., simultaneously, asynchronously, sequentially, etc.), or a separate computing device when present.

[0069] In some embodiments, one or more additional devices 310 may be worn by the user to calculate a confidence level of the reference orientation, the actual orientation, and/or the angular offset. Alternatively, a single physical orientation device 310 may be used to determine an angular offset and to determine a confidence level of the measured reference orientation, actual orientation, and/or angular offset. The processor 330 may determine a confidence level by comparing the difference between a primary sensor orientation (e.g., determined by a physical orientation device) and one or more additional or secondary sensor orientations (e.g., determined by one or more additional devices) using a confidence function. This confidence function may use thresholding, linear or cubic scales, or a pass/fail algorithm, depending upon its configuration.
For example, when the orientation (reference or actual) and/or angular offset of the primary sensor module are within a threshold range of the orientation (reference or actual) and/or angular offset of the secondary sensor module, a confidence level may be positive or indicate that the primary sensor module values can be trusted. Alternatively, when the orientation (reference or actual) and/or angular offset of the primary sensor module are outside of a threshold range of the orientation (reference or actual) and/or angular offset of the secondary sensor module, a confidence level may be negative or indicate that the primary sensor module values cannot be trusted. The threshold range may be about 0 degrees to about 5 degrees, about 2 degrees to about 7 degrees, about 4 degrees to about 8 degrees, about 0 degrees to about 10 degrees, about 1 degree to about 2 degrees, about 2 degrees to about 4 degrees, etc.
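As a non-limiting sketch of such a confidence function, assuming headings in degrees and illustrative names, both a pass/fail check and a linear scale might look like the following.

    def heading_difference_deg(a: float, b: float) -> float:
        """Unsigned smallest difference between two headings, in degrees."""
        diff = abs(a - b) % 360.0
        return min(diff, 360.0 - diff)

    def confidence_pass_fail(primary_deg: float, secondary_deg: float,
                             threshold_deg: float = 5.0) -> bool:
        """Trust the primary sensor when it agrees with the secondary
        sensor to within the threshold range."""
        return heading_difference_deg(primary_deg, secondary_deg) <= threshold_deg

    def confidence_linear(primary_deg: float, secondary_deg: float,
                          max_disagreement_deg: float = 10.0) -> float:
        """Linear confidence in [0, 1]: 1.0 at perfect agreement, 0.0 at
        or beyond the maximum tolerated disagreement."""
        diff = heading_difference_deg(primary_deg, secondary_deg)
        return max(0.0, 1.0 - diff / max_disagreement_deg)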
[0070] For example, a confidence level may be calculated based on a combination of sensor specifications, real-time measurements, and/or error models. A confidence level may quantify the reliability of the sensor data. Factors that may affect sensor confidence levels include the noise density (i.e., random error in measurements) of sensor readings, a standard deviation or variance of outputs from the sensor, a bias (e.g., a constant or consistent offset) in sensor readings, a drift (e.g., gradual changes over time) in sensor readings, the resolution of the sensor, the sensitivity of the sensor, environmental factors (e.g., temperature variations, vibrations, magnetic disturbances, etc.), whether the sensor is compensated (e.g., temperature compensation), etc.
[0071] Confidence level of sensor measurements may be calculated using a covariance matrix (e.g., Kalman filtering). For example, a Kalman filter estimates the state of a system (e.g., position, orientation) and calculates the covariance matrix, representing the confidence level for each variable. In some embodiments, the smaller the covariance, the higher the confidence in the estimate. Confidence level of sensor measurements may be calculated using measurement residuals, i.e., the difference between the sensor's expected output and its actual measurement. In some embodiments, the smaller the residuals, the higher the confidence in the actual sensor reading. Confidence level of sensor measurements may be calculated using error models. For example, an Allan variance analysis may be used to characterize gyroscope and accelerometer errors, separating noise types such as random walk and bias instability. Errors are modeled as functions of time, allowing confidence levels to be adjusted dynamically. Confidence level of sensor measurements may be calculated using a likelihood estimation. Probabilistic models may be used to compute the likelihood of a measurement being correct, based on prior data and current sensor states. Confidence level of sensor measurements may be calculated using a signal-to-noise ratio (SNR). For example, a higher SNR may indicate less noise and more reliable measurements.
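As a sketch of the residual-based approach, a window of residuals can be collapsed to a single confidence score; the exponential mapping and the scale constant below are assumptions made for illustration.

    import numpy as np

    def residual_confidence(expected: np.ndarray, measured: np.ndarray,
                            scale_deg: float = 2.0) -> float:
        """Map the RMS residual between expected and measured headings
        (degrees) to a confidence in (0, 1]; smaller residuals yield
        higher confidence."""
        rms = float(np.sqrt(np.mean((measured - expected) ** 2)))
        return float(np.exp(-rms / scale_deg))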
[0072] In some embodiments, processor 330 may further, optionally, receive an input of a user preference. The user preference may adjust a notification type of, for example, a positive alignment indication or a negative alignment indication.
[0073] In some embodiments, processor 330 may execute optional performance analyzer 396. The optional performance analyzer 396 may analyze a user's angular offset results or history (i.e., performance history) over time. The angular offset results or history may be stored locally on the device(s) in memory 340, on a separate computing device, or on cloud-based infrastructure (i.e., a remote computing device). The angular offset results or history may be used by the processor 330 to individualize an actual orientation and/or an angular offset of a user. In some embodiments, the performance history may be output to a display of the device or a communicatively coupled computing device. For example, FIG. 7 illustrates a graphical user interface (GUI) 700 that shows a user's history for the various clubs used, as shown in column 702. For each club used, the processor has calculated an accuracy (i.e., a measurement of error), in column 704; a consistency (e.g., derived from all of the sample points to determine a range between the values, median, mean, and mode), in column 706; a shape, in column 708; and an angular offset, also called aim in this view, in column 710. The shape in column 708 may be based on user reporting, as described elsewhere herein. For example, as shown in FIG. 10, a user may report a resultant shape using GUI 1000. A representation 1010 may be displayed on the GUI 1000 that illustrates each type of possible result. "Lacing" represents a nearly straight or linear shot. "Drawing" and "hooking" to the right or left (depending on side dominance) represent the projectile being offset to the right or left at varying degrees and/or with varying shapes. "Fading" or "slicing" to the left or right (depending on side dominance) represents the projectile being offset to the left or right at varying degrees and/or with varying shapes. Projectile location representation 1020 may be displayed on GUI 1000 or another GUI for input to be received about whether the projectile was out of bounds (OB) 1022 to the left, OB 1030 to the right, in the rough 1024 to the left, in the rough 1028 to the right, laced 1026 substantially down the center, or shanked 1032 (e.g., the shot was such an outlier that it should not count in the dataset).

In some embodiments, device 310 includes a power management unit 360. The power management unit 360 may provide power to one or more components of device 310. For example, the power management unit 360 may provide power to the processor 330, memory 340, calibration module 392, orientation determination module 394, signal processing unit 390, sensor module 320, optional performance analyzer 396, optional UI (user interface) generator 370, optional data transmission unit 380, and optional display 350. The power management unit 360 may be a battery (replaceable or rechargeable), a solar power generator and associated storage module, a kinetic energy generator and associated storage module, and the like.
[0074] Device 310 may further include an optional UI generator 370 for generating a GUI for display on optional display 350 to the user, for example any of the GUIs shown in FIGs. 6-10, described elsewhere herein.
[0075] Device 310 may further include an optional data transmission unit 380 for transmitting sensor data, orientation data, reference orientation data, actual orientation data, etc. to an external device. For example, the external device may be a communicatively or operatively coupled mobile computing device, a wearable device, cloud infrastructure, a server, and the like.
[0076] In some embodiments, as shown in FIGs. 5A-5D, device 310 may include two or more devices. A first device may be worn on a first body portion and a second device may be worn on a second body portion. The first body portion may be a left extremity or a left portion of the body, and the second body portion may be a right extremity or a right portion of the body. As shown in FIGs. 5A-5D, processor 330 may execute a calibration process, using calibration module 392, including receiving a composite reference orientation 552 relative to a target destination 554, 556. The composite reference orientation 552 may be based on, or determined by, a first sensor signal from a first device 558a and a second sensor signal from a second device 558b, coupled to a user 550. The composite reference orientation may be an average of the signals received from devices 558a, 558b. The average may be weighted, normalized, a root mean square, or otherwise. In some embodiments, when multiple devices are used, a data transmission unit 380 may be included to enable data exchange between the two or more devices.
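Because headings wrap at 360 degrees, a composite of two or more sensor signals may be safer to compute as a weighted circular mean than as a plain arithmetic average. The following sketch, with illustrative names, shows one way to do so; the weights could implement the slope- or stability-based adjustments described in the following paragraph.

    import math

    def composite_heading_deg(headings, weights=None):
        """Weighted circular mean of headings (degrees).

        Headings are averaged as unit vectors so that, for example,
        359 degrees and 1 degree combine to 0 rather than 180.
        """
        if weights is None:
            weights = [1.0] * len(headings)
        x = sum(w * math.cos(math.radians(h)) for h, w in zip(headings, weights))
        y = sum(w * math.sin(math.radians(h)) for h, w in zip(headings, weights))
        return math.degrees(math.atan2(y, x)) % 360.0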
[0077] The processor 330 may execute an orientation determination process using an orientation determination module 394. The orientation determination process may include monitoring a composite actual orientation 560 of the user 550 relative to the composite reference orientation 552, as shown in FIG. 5B. In other words, the processor 330 may receive additional sensor signals, from the sensor modules of the two or more devices 558a, 558b, indicative of a composite actual orientation 560 of the user 550 in space, relative to the reference orientation 552. The composite actual orientation 560 may be an average of the signals received from devices 558a, 558b. The average may be weighted, normalized, or otherwise. In some embodiments, weighting may be based on a manual input, for example, from a coach indicating that a user should adjust their actual orientation, place more weight on a right or left foot, square to the projectile, open up relative to the projectile, etc. A composite actual orientation 560 may be weighted or adjusted based on whether a user is on a sloped ground plane, a planar ground plane, or an uneven ground plane. For example, if a user is on a steep slope, the foot or leg with more balanced or solid footing may be weighted more heavily than the bent or less stabilized foot or leg.
[0078] In some embodiments, receiving additional sensor signals may include substantially continuously receiving the additional sensor signals until the angular offset 562 is within the predefined range. The processor 330 may determine an angular offset 562 of the composite actual orientation 560 from the composite reference orientation 552. Said another way, the processor 330 may compare the composite reference orientation 552 to the composite actual orientation 560 to identify an angular offset 562 of the composite actual orientation 560 from the composite reference orientation 552. The composite actual orientation 560 may be a vector or a line nonparallel to the reference orientation, indicating a current alignment of the user 550 relative to the composite reference orientation 552. The angular relationship(s) between the composite actual orientation and the composite reference orientation may be independent of fixed points in space. The orientation determination process may further include determining whether the angular offset 562 is within a predefined range. For example, as shown in FIG. 5B, the predefined range may be from about 75 degrees to about 105 degrees, about 80 degrees to about 100 degrees, about 85 degrees to about 95 degrees, about 90 degrees, etc. The processor 330 may output a positive alignment indication 398 when the angular offset 562 is within a predefined range, as shown in FIG. 5B. The processor 330 may output a negative alignment indication 398 when the angular offset 562 is outside of the predefined range, as shown in FIGs. 5C-5D. For example, the angular offset 562 may be outside the predefined range when the angular offset 562 is greater than about 90 degrees, greater than about 95 degrees, greater than about 100 degrees, greater than about 110 degrees, etc., as shown in FIG. 5D. For example, the angular offset 562 may be outside the predefined range when the angular offset 562 is less than about 90 degrees, less than about 85 degrees, less than about 80 degrees, less than about 75 degrees, etc., as shown in FIG. 5C. The alignment indication 398 may include, but not be limited to, a visual indication, an auditory indication, or a haptic indication. The alignment indication 398 may be configurable to be provided on an individual device, multiple devices (e.g., simultaneously, asynchronously, sequentially, etc.), or a separate computing device when present.
[0079] EXAMPLES
[0080] Example 1. A physical orientation system for guiding a user for aiming a sports projectile at a destination, the system comprising: a sensor module configured to detect a physical orientation of at least one body portion of a user; a processor operatively coupled to the sensor module; and a memory operatively coupled to the processor, the memory configured to store instructions that, when executed by the processor, cause the processor to: receive a sensor signal indicative of a reference orientation of the user relative to a target destination; receive one or more additional sensor signals indicative of an actual orientation of the user in space, relative to the reference orientation; compare the actual orientation to the reference orientation to identify an angular offset of the actual orientation relative to the reference orientation; and when the angular offset is within a predefined range, output a positive alignment indication, and when the angular offset is outside of the predefined range, output a negative alignment indication.
[0081] Example 2. The physical orientation system of any one of the preceding examples, but particularly Example 1, further comprising receiving an input to set the reference orientation.
[0082] Example 3. The physical orientation system of any one of the preceding examples, but particularly Example 1, wherein the target destination is a landing location for a projectile launched by the user.

[0083] Example 4. The physical orientation system of any one of the preceding examples, but particularly Example 1, wherein the sensor module is wearable on the at least one body portion.
[0084] Example 5. The physical orientation system of any one of the preceding examples, but particularly Example 4, wherein the at least one body portion of the user is a lower extremity of the user.
[0085] Example 6. The physical orientation system of any one of the preceding examples, but particularly Example 4, wherein the at least one body portion of the user is a foot of the user.
[0086] Example 7. The physical orientation system of any one of the preceding examples, but particularly Example 4, wherein the at least one body portion of the user is a leg of the user.
[0087] Example 8. The physical orientation system of any one of the preceding examples, but particularly Example 4, wherein the at least one body portion of the user is a torso of the user.
[0088] Example 9. The physical orientation system of any one of the preceding examples, but particularly Example 1, wherein the receiving the one or more additional sensor signals comprises substantially continuously receiving the one or more additional sensor signals until the angular offset is within the predefined range.
[0089] Example 10. The physical orientation system of any one of the preceding examples, but particularly Example 1, wherein the receiving the one or more additional sensor signals comprises receiving a composite sensor signal comprising a first sensor signal and a second sensor signal from the sensor module and a second sensor module, respectively.
[0090] Example 11. The physical orientation system of any one of the preceding examples, but particularly Example 10, further comprising a first device comprising the sensor module and a second device comprising a second sensor module, wherein one or both of the first device or the second device are configured for radio frequency ranging, ultrasonic ranging, infrared ranging, or global position system (GPS) positioning.
[0091] Example 12. The physical orientation system of any one of the preceding examples, but particularly Example 10, wherein one or both of the first device or the second device comprise an inertial measurement unit (IMU).

[0092] Example 13. The physical orientation system of any one of the preceding examples, but particularly Example 10, wherein the sensor module is worn on a first body portion and the second sensor module is worn on a second body portion.
[0093] Example 14. The physical orientation system of any one of the preceding examples, but particularly Example 13, wherein the first body portion is a left extremity, and the second body portion is a right extremity.
[0094] Example 15. The physical orientation system of any one of the preceding examples, but particularly Example 1, wherein the reference orientation comprises a line aligned with the target destination.
[0095] Example 16. The physical orientation system of any one of the preceding examples, but particularly Example 15, wherein the actual orientation comprises a line nonparallel relative to the reference orientation, indicating a current alignment of the user relative to the reference orientation.
[0096] Example 17. The physical orientation system of any one of the preceding examples, but particularly Example 1, further comprising receiving an input of a user preference, wherein the user preference is configured to adjust a notification type of one or both of: the positive alignment indication or the negative alignment indication.
[0097] Example 18. The physical orientation system of any one of the preceding examples, but particularly Example 1, wherein the sensor module comprises at least one of: an accelerometer, a gyroscope, or a magnetometer.
[0098] Example 19. The physical orientation system of any one of the preceding examples, but particularly Example 1, wherein the positive alignment indication is one of: visual feedback, auditory feedback, haptic feedback, or a combination thereof.
[0099] Example 20. The physical orientation system of any one of the preceding examples, but particularly Example 1, wherein the negative alignment indication is one of: visual feedback, auditory feedback, haptic feedback, or a combination thereof.
[00100] Example 21. The physical orientation system of any one of the preceding examples, but particularly Example 1, wherein the sensor module is coupled to the at least one body portion of the user, and the processor and the memory are in a computing device operatively coupled to the sensor module.
[00101] Example 22. The physical orientation system of any one of the preceding examples, but particularly Example 1, wherein the sensor module, the processor, and the memory are in a wearable device coupled to the at least one body portion of the user.

[00102] Example 23. The physical orientation system of any one of the preceding examples, but particularly Example 1, further comprising determining the reference orientation by determining an angle of rotation of the at least one body portion relative to a gravitational ground plane about a yaw axis.
[00103] Example 24. The physical orientation system of any one of the preceding examples, but particularly Example 1, further comprising determining the actual orientation by determining an angle of rotation of the at least one body portion relative to a gravitational ground plane about a yaw axis.
[00104] Example 25. The physical orientation system of any one of the preceding examples, but particularly Example 1, further comprising determining the reference orientation by determining and compositing a rotation of the at least one body portion about a yaw axis and one or both of: a pitch axis or a roll axis, relative to a gravity vector.
[00105] Example 26. The physical orientation system of any one of the preceding examples, but particularly Example 1, further comprising determining the actual orientation by determining and compositing a rotation of the at least one body portion about a yaw axis and one or both of: a pitch axis or a roll axis, relative to a gravity vector.
[00106] Example 27. The physical orientation system of any one of the preceding examples, but particularly Example 1, further comprising individualizing the angular offset for the user.

[00107] Example 28. The physical orientation system of any one of the preceding examples, but particularly Example 27, wherein the individualization comprises receiving a statistical model of the user; and updating the angular offset, and thereby the actual orientation, based on the statistical model.
[00108] Example 29. A computer-implemented method for guiding physical orientation of a user for aiming a sports projectile at a destination, comprising: receiving a sensor signal indicative of a reference orientation of the user relative to a target destination; receiving one or more additional sensor signals indicative of an actual orientation of the user in space, relative to the reference orientation; comparing the actual orientation to the reference orientation to identify an angular offset of the actual orientation relative to the reference orientation; and when the angular offset is within a predefined range, outputting a positive alignment indication, and when the angular offset is outside of the predefined range, outputting a negative alignment indication.

[00109] Example 30. The computer-implemented method of any one of the preceding examples, but particularly Example 29, further comprising receiving an input to set the reference orientation.
[00110] Example 31. The computer-implemented method of any one of the preceding examples, but particularly Example 29, wherein the target destination is a landing location for a projectile launched by the user.
[00111] Example 32. The computer-implemented method of any one of the preceding examples, but particularly Example 29, wherein the sensor module is wearable on the at least one body portion.
[00112] Example 33. The computer-implemented method of any one of the preceding examples, but particularly Example 32, wherein the at least one body portion of the user is a lower extremity of the user.
[00113] Example 34. The computer-implemented method of any one of the preceding examples, but particularly Example 32, wherein the at least one body portion of the user is a foot of the user.
[00114] Example 35. The computer-implemented method of any one of the preceding examples, but particularly Example 32, wherein the at least one body portion of the user is a leg of the user.
[00115] Example 36. The computer-implemented method of any one of the preceding examples, but particularly Example 32, wherein the at least one body portion of the user is a torso of the user.
[00116] Example 37. The computer-implemented method of any one of the preceding examples, but particularly Example 29, wherein the receiving the one or more additional sensor signals comprises substantially continuously receiving the one or more additional sensor signals until the angular offset is within the predefined range.
[00117] Example 38. The computer-implemented method of any one of the preceding examples, but particularly Example 29, wherein the receiving the one or more additional sensor signals comprises receiving a composite sensor signal comprising a first sensor signal and a second sensor signal from the sensor module and a second sensor module, respectively.

[00118] Example 39. The computer-implemented method of any one of the preceding examples, but particularly Example 38, further comprising a first device comprising the sensor module and a second device comprising a second sensor module, wherein one or both of the first device or the second device are configured for radio frequency ranging, ultrasonic ranging, infrared ranging, or global position system (GPS) positioning.
[00119] Example 40. The computer-implemented method of any one of the preceding examples, but particularly Example 38, wherein one or both of the first device or the second device comprise an inertial measurement unit (IMU).
[00120] Example 41. The computer-implemented method of any one of the preceding examples, but particularly Example 38, wherein the sensor module is worn on a first body portion and the second sensor module is worn on a second body portion.
[00121] Example 42. The computer-implemented method of any one of the preceding examples, but particularly Example 41, wherein the first body portion is a left extremity, and the second body portion is a right extremity.
[00122] Example 43. The computer-implemented method of any one of the preceding examples, but particularly Example 29, wherein the reference orientation comprises a line aligned with the target destination.
[00123] Example 44. The computer-implemented method of any one of the preceding examples, but particularly Example 43, wherein the actual orientation comprises a line nonparallel relative to the reference orientation, indicating a current alignment of the user relative to the reference orientation.
[00124] Example 45. The computer-implemented method of any one of the preceding examples, but particularly Example 29, further comprising receiving an input of a user preference, wherein the user preference is configured to adjust a notification type of one or both of: the positive alignment indication or the negative alignment indication.
[00125] Example 46. The computer-implemented method of any one of the preceding examples, but particularly Example 29, wherein the sensor module comprises at least one of: an accelerometer, a gyroscope, or a magnetometer.
[00126] Example 47. The computer-implemented method of any one of the preceding examples, but particularly Example 29, wherein the positive alignment indication is one of: visual feedback, auditory feedback, haptic feedback, or a combination thereof.
[00127] Example 48. The computer-implemented method of any one of the preceding examples, but particularly Example 29, wherein the negative alignment indication is one of: visual feedback, auditory feedback, haptic feedback, or a combination thereof.
[00128] Example 49. The computer-implemented method of any one of the preceding examples, but particularly Example 29, wherein the sensor module is coupled to the at least one body portion of the user, and the processor and the memory are in a computing device operatively coupled to the sensor module.
[00129] Example 50. The computer-implemented method of any one of the preceding examples, but particularly Example 29, wherein the sensor module, the processor, and the memory are in a wearable device coupled to the at least one body portion of the user.
[00130] Example 51. The computer-implemented method of any one of the preceding examples, but particularly Example 29, further comprising determining the reference orientation by determining an angle of rotation of the at least one body portion relative to a gravitational ground plane about a yaw axis.
[00131] Example 52. The computer-implemented method of any one of the preceding examples, but particularly Example 29, further comprising determining the actual orientation by determining an angle of rotation of the at least one body portion relative to a gravitational ground plane about a yaw axis.
[00132] Example 53. The computer-implemented method of any one of the preceding examples, but particularly Example 29, further comprising determining the reference orientation by determining and compositing a rotation of the at least one body portion about a yaw axis and one or both of: a pitch axis or a roll axis, relative to a gravity vector.
[00133] Example 54. The computer-implemented method of any one of the preceding examples, but particularly Example 29, further comprising determining the actual orientation by determining and compositing a rotation of the at least one body portion about a yaw axis and one or both of: a pitch axis or a roll axis, relative to a gravity vector.
[00134] Example 55. The computer-implemented method of any one of the preceding examples, but particularly Example 29, further comprising individualizing the angular offset for the user.
[00135] Example 56. The computer-implemented method of any one of the preceding examples, but particularly Example 55, wherein the individualization comprises receiving a statistical model of the user; and updating the angular offset, and thereby the actual orientation, based on the statistical model.
[00136] Example 57. A non-transitory computer readable medium configured to store computer readable instructions that, when read by a processor, cause the processor to execute operations for guiding physical orientation of a user for aiming a sports projectile at a destination, the operations comprising: receiving a sensor signal indicative of a reference orientation of the user relative to a target destination; receiving one or more additional sensor signals indicative of an actual orientation of the user in space, relative to the reference orientation; comparing the actual orientation to the reference orientation to identify an angular offset of the actual orientation relative to the reference orientation; and when the angular offset is within a predefined range, outputting a positive alignment indication, and when the angular offset is outside of the predefined range, outputting a negative alignment indication.
[00137] Example 58. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, further comprising receiving an input to set the reference orientation.
[00138] Example 59. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, wherein the target destination is a landing location for a projectile launched by the user.
[00139] Example 60. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, wherein the sensor module is wearable on the at least one body portion.
[00140] Example 61. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 60, wherein the at least one body portion of the user is a lower extremity of the user.
[00141] Example 62. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 60, wherein the at least one body portion of the user is a foot of the user.
[00142] Example 63. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 60, wherein the at least one body portion of the user is a leg of the user.
[00143] Example 64. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 60, wherein the at least one body portion of the user is a torso of the user.
[00144] Example 65. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, wherein the receiving the one or more additional sensor signals comprises substantially continuously receiving the one or more additional sensor signals until the angular offset is within the predefined range.
[00145] Example 66. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, wherein the receiving the one or more additional sensor signals comprises receiving a composite sensor signal comprising a first sensor signal and a second sensor signal from the sensor module and a second sensor module, respectively.
[00146] Example 67. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 66, further comprising a first device comprising the sensor module and a second device comprising a second sensor module, wherein one or both of the first device or the second device are configured for radio frequency ranging, ultrasonic ranging, infrared ranging, or global positioning system (GPS) positioning.
[00147] Example 68. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 66, wherein one or both of the first device or the second device comprise an inertial measurement unit (IMU).
[00148] Example 69. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 66, wherein the sensor module is worn on a first body portion and the second sensor module is worn on a second body portion.
[00149] Example 70. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 69, wherein the first body portion is a left extremity, and the second body portion is a right extremity.
[00150] Example 71. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, wherein the reference orientation comprises a line aligned with the target destination.
[00151] Example 72. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 71, wherein the actual orientation comprises a line nonparallel relative to the reference orientation, indicating a current alignment of the user relative to the reference orientation.
[00152] Example 73. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, further comprising receiving an input of a user preference, wherein the user preference is configured to adjust a notification type of one or both of the positive alignment indication or the negative alignment indication.
[00153] Example 74. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, wherein the sensor module comprises at least one of an accelerometer, a gyroscope, or a magnetometer.
[00154] Example 75. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, wherein the positive alignment indication is one of: visual feedback, auditory feedback, haptic feedback, or a combination thereof.
[00155] Example 76. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, wherein the negative alignment indication is one of: visual feedback, auditory feedback, haptic feedback, or a combination thereof.
[00156] Example 77. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, wherein the sensor module is coupled to the at least one body portion of the user, and the processor and the memory are in a computing device operatively coupled to the sensor module.
[00157] Example 78. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, wherein the sensor module, the processor, and the memory are in a wearable device coupled to the at least one body portion of the user.
[00158] Example 79. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, further comprising determining the reference orientation by determining an angle of rotation of the at least one body portion relative to a gravitational ground plane about a yaw axis.
[00159] Example 80. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, further comprising determining the actual orientation by determining an angle of rotation of the at least one body portion relative to a gravitational ground plane about a yaw axis.
[00160] Example 81. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, further comprising determining the reference orientation by determining and compositing a rotation of the at least one body portion about a yaw axis and one or both of: a pitch axis or a roll axis, relative to a gravity vector.
[00161] Example 82. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, further comprising determining the actual orientation by determining and compositing a rotation of the at least one body portion about a yaw axis and one or both of: a pitch axis or a roll axis, relative to a gravity vector.
[00162] Example 83. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 57, further comprising individualizing the angular offset for the user.
[00163] Example 84. The non-transitory computer readable medium of any one of the preceding examples, but particularly Example 83, wherein the individualization comprises receiving a statistical model of the user; and updating the angular offset, and thereby the actual orientation, based on the statistical model.
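By way of illustration and not of limitation, the comparison and indication operations recited in Example 57, together with the individualization of Examples 83 and 84, may be sketched in Python as follows; the function names, the 2.0-degree tolerance, and the scalar user bias below are assumptions of the sketch rather than features of the disclosure:

    def angular_offset_deg(actual_yaw_deg, reference_yaw_deg):
        # Signed offset of the actual orientation from the reference
        # orientation, wrapped to the range (-180, 180] degrees.
        offset = (actual_yaw_deg - reference_yaw_deg) % 360.0
        if offset > 180.0:
            offset -= 360.0
        return offset

    def alignment_indication(actual_yaw_deg, reference_yaw_deg,
                             tolerance_deg=2.0, user_bias_deg=0.0):
        # user_bias_deg individualizes the angular offset, e.g. a correction
        # derived from a statistical model of the user's past alignments.
        offset = angular_offset_deg(actual_yaw_deg, reference_yaw_deg) - user_bias_deg
        return "positive" if abs(offset) <= tolerance_deg else "negative"

Under these assumptions, a call such as alignment_indication(93.5, 90.0) yields a negative indication, while alignment_indication(91.0, 90.0) yields a positive indication.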
[00164] The systems and methods of the preferred embodiment and variations thereof can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions. The instructions are preferably executed by computer-executable components preferably integrated with the system and one or more portions of the processor on a physical orientation device and/or computing device. The computer-readable medium can be stored on any suitable computer-readable media such as RAMs, ROMs, flash memory, EEPROMs, optical devices (e.g., CD or DVD), hard drives, floppy drives, or any suitable device. The computer-executable component is preferably a general or application-specific processor, but any suitable dedicated hardware or hardware/firmware combination can alternatively or additionally execute the instructions.
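By way of further illustration, the determining and compositing of a rotation about a yaw axis with rotations about a pitch axis and a roll axis relative to a gravity vector, as recited in Examples 79 through 82, may be sketched as a tilt-compensated heading computation; the aerospace (NED) axis convention and all names below are assumptions of the sketch:

    import math

    def tilt_compensated_yaw_deg(accel, mag):
        # accel and mag are (x, y, z) tuples read from an accelerometer and a
        # magnetometer sharing a common body frame on the worn body portion.
        ax, ay, az = accel
        mx, my, mz = mag
        # Roll and pitch of the body portion relative to the gravity vector.
        roll = math.atan2(ay, az)
        pitch = math.atan2(-ax, ay * math.sin(roll) + az * math.cos(roll))
        # De-rotate the magnetometer reading into the horizontal plane and
        # take the heading about the yaw axis.
        bx = (mx * math.cos(pitch)
              + my * math.sin(pitch) * math.sin(roll)
              + mz * math.sin(pitch) * math.cos(roll))
        by = my * math.cos(roll) - mz * math.sin(roll)
        return math.degrees(math.atan2(-by, bx))

The yaw angle returned by such a routine could serve as the input to the offset comparison sketched above.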
[00165] References in the specification to “one embodiment,” “an embodiment,” “an illustrative embodiment,” “some embodiments,” etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include that particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
[00166] As used in the description and claims, the singular form “a”, “an” and “the” include both singular and plural references unless the context clearly dictates otherwise. For example, the term “device” may include, and is contemplated to include, a plurality of devices, or the term “orientation” may include, and is contemplated to include, a plurality of orientations. At times, the claims and disclosure may include terms such as “a plurality,” “one or more,” or “at least one;” however, the absence of such terms is not intended to mean, and should not be interpreted to mean, that a plurality is not conceived.
[00167] The term “about” or “approximately,” when used before a numerical designation or range (e.g., to define a length or pressure), indicates approximations which may vary by (+) or (−) 5%, 1%, or 0.1%. All numerical ranges provided herein are inclusive of the stated start and end numbers. The term “substantially” indicates mostly (i.e., greater than 50%) or essentially all of a device, substance, or composition.
[00168] As used herein, the term “comprising” or “comprises” is intended to mean that the devices, systems, and methods include the recited elements, and may additionally include any other elements. “Consisting essentially of” shall mean that the devices, systems, and methods include the recited elements and exclude other elements of essential significance to the combination for the stated purpose. Thus, a system or method consisting essentially of the elements as defined herein would not exclude other materials, features, or steps that do not materially affect the basic and novel characteristic(s) of the claimed disclosure. “Consisting of” shall mean that the devices, systems, and methods include the recited elements and exclude anything more than a trivial or inconsequential element or step. Embodiments defined by each of these transitional terms are within the scope of this disclosure.
[00169] The examples and illustrations included herein show, by way of illustration and not of limitation, specific embodiments in which the subject matter may be practiced. Other embodiments may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. Such embodiments of the inventive subject matter may be referred to herein individually or collectively by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept, if more than one is in fact disclosed. Thus, although specific embodiments have been illustrated and described herein, any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.

Claims

WHAT IS CLAIMED IS:
1. A physical orientation system for guiding a user for aiming a sports projectile at a destination, the system comprising: a sensor module configured to detect a physical orientation of at least one body portion of the user; a processor operatively coupled to the sensor module; and a memory operatively coupled to the processor, the memory configured to store instructions that, when executed by the processor, cause the processor to: receive a sensor signal indicative of a reference orientation of the user relative to a target destination; receive one or more additional sensor signals indicative of an actual orientation of the user in space, relative to the reference orientation; compare the actual orientation to the reference orientation to identify an angular offset of the actual orientation relative to the reference orientation; and when the angular offset is within a predefined range, output a positive alignment indication, and when the angular offset is outside of the predefined range, output a negative alignment indication.
2. The physical orientation system of claim 1, further comprising determining the reference orientation by determining an angle of rotation of the at least one body portion relative to a gravitational ground plane about a yaw axis.
3. The physical orientation system of any one of the preceding claims, further comprising determining the actual orientation by determining an angle of rotation of the at least one body portion relative to a gravitational ground plane about a yaw axis.
4. The physical orientation system of any one of the preceding claims, further comprising determining the reference orientation by determining and compositing a rotation of the at least one body portion about a yaw axis and one or both of: a pitch axis or a roll axis, relative to a gravity vector.
5. The physical orientation system of any one of the preceding claims, further comprising determining the actual orientation by determining and compositing a rotation of the at least one body portion about a yaw axis and one or both of: a pitch axis or a roll axis, relative to a gravity vector.
6. The physical orientation system of any one of the preceding claims, further comprising individualizing the angular offset for the user.
7. The physical orientation system of claim 6, wherein the individualization comprises receiving a statistical model of the user; and updating the angular offset, and thereby the actual orientation, based on the statistical model.
8. The physical orientation system of any one of the preceding claims, further comprising receiving an input to set the reference orientation.
9. The physical orientation system of any one of the preceding claims, wherein the target destination is a landing location for a projectile launched by the user.
10. The physical orientation system of any one of the preceding claims, wherein the sensor module is wearable on the at least one body portion.
11. The physical orientation system of any one of the preceding claims, wherein the at least one body portion of the user is a lower extremity of the user.
12. The physical orientation system of any one of the preceding claims, wherein the at least one body portion of the user is a foot of the user.
13. The physical orientation system of any one of the preceding claims, wherein the at least one body portion of the user is a leg of the user.
14. The physical orientation system of any one of the preceding claims, wherein the at least one body portion of the user is a torso of the user.
15. The physical orientation system of any one of the preceding claims, wherein the receiving the one or more additional sensor signals comprises substantially continuously receiving the one or more additional sensor signals until the angular offset is within the predefined range.
16. The physical orientation system of any one of the preceding claims, wherein the receiving the one or more additional sensor signals comprises receiving a composite sensor signal comprising a first sensor signal and a second sensor signal from the sensor module and a second sensor module, respectively.
17. The physical orientation system of any one of the preceding claims, further comprising a first device comprising the sensor module and a second device comprising a second sensor module, wherein one or both of the first device or the second device are configured for radio frequency ranging, ultrasonic ranging, infrared ranging, or global positioning system (GPS) positioning.
18. The physical orientation system of any of claims 16-17, wherein one or both of the sensor module or the second sensor module comprises an inertial measurement unit (IMU).
19. The physical orientation system of any one of claims 16-18, wherein the sensor module is worn on a first body portion and the second sensor module is worn on a second body portion.
20. The physical orientation system of claim 19, wherein the first body portion is a left extremity, and the second body portion is a right extremity.
21. The physical orientation system of any one of the preceding claims, wherein the reference orientation comprises a line aligned with the target destination.
22. The physical orientation system of any one of the preceding claims, wherein the actual orientation comprises a line nonparallel relative to the reference orientation, indicating a current alignment of the user relative to the reference orientation.
23. The physical orientation system of any one of the preceding claims, further comprising receiving an input of a user preference, wherein the user preference is configured to adjust a notification type of one or both of: the positive alignment indication or the negative alignment indication.
24. The physical orientation system of any one of the preceding claims, wherein the sensor module comprises at least one of: an accelerometer, a gyroscope, or a magnetometer.
25. The physical orientation system of any one of the preceding claims, wherein the positive alignment indication is one of: visual feedback, auditory feedback, haptic feedback, or a combination thereof.
26. The physical orientation system of any one of the preceding claims, wherein the negative alignment indication is one of: visual feedback, auditory feedback, haptic feedback, or a combination thereof.
27. The physical orientation system of any one of the preceding claims, wherein the sensor module is coupled to the at least one body portion of the user, and the processor and the memory are in a computing device operatively coupled to the sensor module.
28. The physical orientation system of any one of the preceding claims, wherein the sensor module, the processor, and the memory are in a wearable device coupled to the at least one body portion of the user.