US20200335222A1 - Movement feedback for orthopedic patient - Google Patents
- Publication number
- US20200335222A1 (application US 16/851,606)
- Authority
- US
- United States
- Prior art keywords
- patient
- motion
- task
- gait
- pain
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/681—Wristwatch-type devices
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/70—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for mining of medical data, e.g. analysing previous cases of other patients
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/112—Gait analysis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1121—Determining geometric values, e.g. centre of rotation or angular range of movement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1124—Determining motor skills
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1126—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb using a particular sensing technique
- A61B5/1127—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb using a particular sensing technique using markers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1126—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb using a particular sensing technique
- A61B5/1128—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb using a particular sensing technique using image analysis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4824—Touch or pain perception evaluation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6822—Neck
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient; User input means
- A61B5/7405—Details of notification to user or communication with user or patient; User input means using sound
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient; User input means
- A61B5/742—Details of notification to user or communication with user or patient; User input means using visual displays
- A61B5/744—Displaying an avatar, e.g. an animated cartoon character
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient; User input means
- A61B5/742—Details of notification to user or communication with user or patient; User input means using visual displays
- A61B5/7445—Display arrangements, e.g. multiple display units
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient; User input means
- A61B5/7455—Details of notification to user or communication with user or patient; User input means characterised by tactile indication, e.g. vibration or electrical stimulation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient; User input means
- A61B5/746—Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
-
- G06K9/00302—
-
- G06K9/00348—
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
- G06T7/0014—Biomedical image inspection using an image reference approach
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/251—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving models
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
- G06V40/25—Recognition of walking or running movements, e.g. gait recognition
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H15/00—ICT specially adapted for medical reports, e.g. generation or transmission thereof
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/30—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/63—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/67—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2562/00—Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
- A61B2562/02—Details of sensors specially adapted for in-vivo measurements
- A61B2562/0219—Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
- A61B5/024—Measuring pulse rate or heart rate
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30221—Sports video; Sports image
Definitions
- Orthopedic patient care may require surgical intervention, such as for upper extremities (e.g., a shoulder or elbow), knee, hip, etc.
- Postoperative care may include immobility of a joint ranging from weeks to months, physical therapy, or occupational therapy. Immobilization within the upper extremity may lead to long-term issues, such as “frozen shoulder,” where the shoulder capsule thickens and becomes stiff and tight.
- Physical therapy or occupational therapy may be used to help the patient with recovering strength, everyday functioning, and healing. Current techniques involving immobility, physical therapy, or occupational therapy may not monitor or adequately assess range of motion or pain before or after surgical intervention.
- FIG. 1 illustrates an upper extremity monitoring system in accordance with at least one example of this disclosure.
- FIG. 2 illustrates a device for running a mobile application for upper extremity patient care in accordance with at least one example of this disclosure.
- FIG. 3 illustrates a user interface for use with a mobile application for upper extremity patient care in accordance with at least one example of this disclosure.
- FIG. 4 illustrates an example range of motion image or video user interface component in accordance with at least one example of this disclosure.
- FIG. 5 illustrates a flowchart showing a technique for using a mobile application for upper extremity patient care in accordance with at least one example of this disclosure.
- FIG. 6 illustrates a block diagram of an example machine upon which any one or more of the techniques discussed herein may perform in accordance with at least one example of this disclosure.
- FIG. 7 illustrates a diagram showing markers on a person for training a machine learning model in accordance with at least one example of this disclosure.
- FIG. 8 illustrates an example convolutional neural network in accordance with at least one example of this disclosure.
- FIG. 9 illustrates an example of reinforcement learning for a neural network in accordance with at least one example of this disclosure.
- FIG. 10 illustrates various example user interfaces for use with a user needing or wearing a sling in accordance with at least one example of this disclosure.
- FIG. 11 illustrates a flowchart showing a technique for generating a skeletal model in accordance with at least one example of this disclosure.
- FIG. 12 illustrates a flowchart showing a technique for determining a gait type of a patient in accordance with at least one example of this disclosure.
- FIG. 13 illustrates a flowchart showing a technique for analyzing movement of an orthopedic patient in accordance with at least one example of this disclosure.
- images may be captured of a patient in motion attempting to perform a task, for example after completion of an orthopedic surgery on the patient.
- the images may be analyzed to generate a movement metric of the patient corresponding to the task.
- the movement metric may be compared to a baseline metric (e.g., an average metric or a previous patient metric) for the task.
- An indication of the comparison may be presented, for example including a qualitative result of the comparison.
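- As a concrete illustration of the comparison just described, the following sketch (not from the specification; the task name, thresholds, and qualitative labels are assumptions) compares a post-operative movement metric against a baseline metric for the task and returns a qualitative result that could be presented to the patient.

```python
# Hypothetical sketch: compare a patient's movement metric for a task against a
# baseline metric and produce a qualitative indication. The thresholds and
# labels are illustrative assumptions, not values from the specification.
from dataclasses import dataclass


@dataclass
class MovementMetric:
    task: str             # e.g., "shoulder abduction"
    value_degrees: float  # measured range of motion for the task


def compare_to_baseline(metric: MovementMetric, baseline_degrees: float) -> str:
    """Return a qualitative result of comparing a movement metric to a baseline."""
    ratio = metric.value_degrees / baseline_degrees if baseline_degrees else 0.0
    if ratio >= 0.9:
        return f"{metric.task}: on track ({ratio:.0%} of baseline)"
    if ratio >= 0.6:
        return f"{metric.task}: needs work ({ratio:.0%} of baseline)"
    return f"{metric.task}: well below baseline ({ratio:.0%}); consider clinician follow-up"


if __name__ == "__main__":
    post_op = MovementMetric(task="shoulder abduction", value_degrees=95.0)
    print(compare_to_baseline(post_op, baseline_degrees=150.0))
```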
- Systems and methods described herein may be used to provide, assess, or augment orthopedic patient care (e.g., upper extremity, hip, knee, etc.). These systems and methods may include pain or range of motion assessment of an upper extremity, providing feedback or information to a patient regarding an upper extremity, or augmenting patient care with physical therapy, occupational therapy, warnings, or the like for an upper extremity.
- Upper extremity procedures may include measuring motion in more than one plane (e.g., adduction), unlike large-joint procedures such as hip and knee arthroplasty.
- upper extremities may include elbow or shoulder.
- Elbow procedures may be more trauma-related and relatively rare compared to shoulder procedures.
- there may be an opportunity to monitor the patient and provide feedback prior to the decision on surgical intervention.
- There also may be a range of surgical interventions of varying invasiveness/significance leading up to total shoulder replacement.
- Hip and knee replacements or orthopedic procedures may result in changes to gait, range of motion, or pain.
- Postoperative care may include physical or occupational therapy.
- the systems and methods described herein provide analysis of postoperative care of an orthopedic patient.
- FIG. 1 illustrates an upper extremity monitoring system 100 in accordance with at least one example of this disclosure.
- the system 100 includes a first wrist-worn device 102 (e.g., a smart watch), and optionally includes a second wrist-worn device 104 or a neck-worn device 106 .
- One or more of the devices 102 - 106 may be used as a standalone device, or may work in communication with a mobile phone or with one or more of the other devices 102 - 106 .
- One or more of the devices 102 - 106 may be used to gather data on steps, floors, gait, or the like. One or more of the devices 102 - 106 may be used to deliver notifications to a user.
- one or more of the devices 102 - 106 may be used to capture range of motion data or movement data, such as shoulder movement or range of motion data (e.g., adduction/abduction, flexion/extension, internal/external rotation), elbow movement or range of motion data (e.g., flexion/extension), wrist movement or range of motion data (e.g., pronation/supination), or the like.
- one or more of the devices 102 - 106 may be used to extrapolate elbow information, for example, when a second sensor or device is not used near the elbow.
- one or more of the devices 102 - 106 may be used near the elbow to detect elbow movement, pain, or range of motion data.
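- As an illustration of how range-of-motion data might be derived from a wrist-worn device, the sketch below (an assumption, not the patent's method) estimates the tilt of the forearm relative to gravity from a single accelerometer sample taken while the arm is momentarily still; mapping the tilt to shoulder or elbow angles would depend on device orientation and a calibration step.

```python
# Illustrative sketch (not from the specification): estimate forearm tilt from a
# wrist-worn accelerometer's gravity reading. Axis convention (x along the
# forearm, readings in g) and the mapping to joint angles are assumptions.
import math


def forearm_tilt_deg(ax: float, ay: float, az: float) -> float:
    """Angle between the device x axis and the measured gravity vector, in degrees."""
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    if magnitude == 0:
        raise ValueError("zero accelerometer magnitude")
    cos_theta = max(-1.0, min(1.0, ax / magnitude))
    return math.degrees(math.acos(cos_theta))


# With x pointing toward the hand: arm hanging down reads ~180 deg,
# arm raised to horizontal reads ~90 deg, so elevation ~= 180 - tilt.
print(forearm_tilt_deg(-0.98, 0.05, 0.10))
```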
- A picture or animation of the range of motion may be presented on one or more of the devices 102 - 106 or on a mobile phone.
- an animation may be sent as a small file format to illustrate the motion on one or more of the devices 102 - 106 .
- the user may be presented with the path of motion on one or more of the devices 102 - 106 .
- one or more of the devices 102 - 106 may include a bigger, more comfortable watch band than typical for wrist-worn watches.
- FIG. 2 illustrates a device 202 for running a mobile application for upper extremity patient care in accordance with at least one example of this disclosure.
- the device 202 may be in communication with a phone or other mobile device 204 .
- the device 202 may be a smart device, such as a smart watch, or other device with a user interface.
- the device 202 includes a processor, memory, a display (e.g., for displaying a user interface), a transceiver (e.g., for communicating with the mobile device 204 ), and optionally an inertial measurement unit (IMU), a gyroscope, or an accelerometer.
- the device 202 may be used for advanced data collection.
- the device 202 may be used to measure a stress response (e.g., heart rate), for example as a proxy for pain during arm motions. These heart-rate spikes may be tied to an animation to visualize the pain on the model.
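- A minimal sketch of the heart-rate-spike idea follows (the sample layout, window size, and threshold are illustrative assumptions): spikes well above the recent average are flagged with their timestamps so they can later be tied to frames of the motion animation.

```python
# Hypothetical sketch: flag heart-rate spikes during arm motion as a proxy for
# pain, so they can later be tied to an animation frame.
from statistics import mean, pstdev


def pain_proxy_events(heart_rate, window=10, z_threshold=2.5):
    """Return timestamps where heart rate spikes well above its recent average.

    heart_rate: list of (timestamp_s, bpm) samples in chronological order.
    """
    events = []
    for i in range(window, len(heart_rate)):
        recent = [bpm for _, bpm in heart_rate[i - window:i]]
        mu, sigma = mean(recent), pstdev(recent) or 1.0
        t, bpm = heart_rate[i]
        if (bpm - mu) / sigma > z_threshold:
            events.append(t)  # timestamp to correlate with the motion animation
    return events


samples = [(t, 72 + (25 if t == 14 else 0)) for t in range(20)]
print(pain_proxy_events(samples))  # -> [14]
```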
- the device 202 may be used for “Sling management”—for example, by taking a picture of a sling worn by a patient.
- the device 202 may be used to analyze the angle of selfies taken using a phone camera to obtain data regarding shoulder flexibility.
- the device 202 may be used to measure movement in sleep.
- the device 202 may be used for determining a maximum range of motion. For example, the device 202 may be used to capture maximum range of motion from day-to-day activities (e.g., rather than in a hospital or clinical evaluation setting).
- the device 202 may be used to look for max movement of the shoulder.
- the device 202 may be used to recognize rotation of the joint.
- prompts on the device 202 may be used to obtain a user's context—for example, whether the user is using a 1 lb weight, a 2 lb weight, etc.
- a reliable mobile device 204 position may be used as another sensor input, for example to track movements.
- the device 202 may be used to derive data from combining the accelerometer with an altimeter—certain motions may only occur from certain positions.
- the device 202 may be used to compare movement information to a range of expectations—for example, the gyroscope on the device 202 may be used to compare the current data to a range of expectations.
- the device 202 may be used to obtain a range of expectations from a non-operative arm—for example, baseline assessments on a good shoulder of a patient.
- the device 202 may be used to develop a baseline from data coming from healthy users.
- the device 202 may be used to collect data to improve implant products and techniques.
- Advanced feedback may be generated, for example, by identifying potentially harmful movements—being too active, unclean movement, deviating from the plane while exercising, exceeding the recommended ROM, etc.
- the device 202 may provide haptic feedback or audio to alert the patient.
- the alert or feedback may include a warning, such as an indication that the user is sleeping on the wrong side.
- the device 202 may be used to provide an alarm and wake the patient up.
- the alert or feedback may include a warning for repetitive stress (e.g., a pitch count for an arm).
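- The following sketch illustrates one way such an alert could be triggered (the ROM limit, data layout, and alert callback are placeholders, not values from the disclosure): streaming joint-angle estimates are checked against a recommended range of motion, and an alert function (standing in for haptic or audio feedback) is called when the limit is exceeded.

```python
# Illustrative sketch: check streaming joint-angle estimates against a
# recommended range of motion and trigger an alert when exceeded.
def monitor_rom(angle_stream, rom_limit_deg, alert):
    """Call `alert` whenever an angle estimate exceeds the recommended limit."""
    for t, angle in angle_stream:
        if angle > rom_limit_deg:
            alert(f"t={t}s: {angle:.0f} deg exceeds recommended {rom_limit_deg} deg")


def haptic_alert(message):
    # Placeholder for device haptics or audio output; here we just print.
    print("ALERT:", message)


monitor_rom([(0, 40.0), (1, 85.0), (2, 120.0)], rom_limit_deg=90.0, alert=haptic_alert)
```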
- the device 202 and the mobile device 204 may be used in combination to provide a target in augmented reality.
- the device 202 and the mobile device 204 may be used in combination, for example using a camera of the mobile device 204 to record at least one repetition of a movement (e.g., automatically captured when the user starts based on sensor data recorded by the device 202 ).
- the recording may be sent to a physical therapist or doctor to review.
- FIG. 3 illustrates a user interface for use with a mobile application for upper extremity patient care in accordance with at least one example of this disclosure.
- the user interface may be displayed on a device 302 , such as a smart watch.
- the device 302 may show an animation, an image, a view of the user, or the like.
- the device 302 may display a picture or animation of a range of motion.
- a user may select an image of a patient and the path of motion may be displayed on the device 302 in response.
- the range of motion and fluidity of movement may be detected using the device 302 .
- a model of dynamics of a user's body may be generated.
- a skeletal model may be generated.
- the model dynamics may be improved by prompting the user with questions, such as “what activity did you just do”, for example on the device 202 or a mobile phone.
- a model may be trained on this data.
- a machine learning model may be trained.
- the model may be used to automatically recognize patient activities (e.g., movements, pain, exercises, etc.), such as without requiring the patient to manually log the activity. In an example, daily living activities may be identified automatically.
- the dynamic model may include an avatar, an emoji (e.g., an Animoji—an animal emoji), or the like.
- activities may include Passive Range of Motion (PROM) actions: e.g., put on a coat, pull a coffee mug from the cupboard, etc.
- the device 302 may be used for monitoring and alerting a user to awareness of the user's situation (e.g., movement, pain, range of motion, fluidity of movement, etc.), compared to other similarly situated users (e.g., by age, condition, etc.) or to an average user.
- the device 302 may be used to inform a user that the user is worse off than an average person with the user's demographics.
- the device 302 may monitor a patient that does not elect to have a procedure (e.g., surgery).
- the device 302 may monitor sleep, because users may wait until they cannot sleep to decide to get a procedure. Lack of sleep may be used as a notification trigger, and a notification may be displayed on the device 302 .
- the device 302 may track motion of a user, and inform the user of results, such as “you are at 50% ability compared to an average person in your demographic.”
- the device 302 may provide a list of activities the patient either cannot do or has not done.
- the device 302 may provide activity specific guidance, for example “we have data collected on tennis activity—[rehab or surgery] will allow you to play tennis better.”
- the device 302 may provide a high-level assessment, for example “you would be a candidate for X surgery,” “people of your profile got a surgical intervention,” or the like.
- the device 302 may provide objective data, for example that the patient is in excessive pain or has poor ROM, which may motivate the patient to seek a surgery election.
- the device 302 may provide a list of doctors.
- the device 302 may provide a notification (e.g., related to lack of sleep, motion of a user, regarding objective data, or the like) to a user or a member of a care team that has the user as a patient (e.g., to a surgeon, a physical therapist, a nurse, etc.).
- the notification may be used by the member of the care team to prompt discussion with the patient regarding surgical intervention (e.g., a notification may include information indicating a surgical intervention is required or suggested.).
- the notification to the member of the care team may include an assessment of the patient automatically or may be stored for future access by a member of the care team with proper access credentials.
- the notification may include results collected by the device 302 .
- the device 302 may provide telemedicine solutions. For example, a nurse practitioner may give virtual appointments through the device 302 . The device 302 may link patients to a surgeon in their area. The device 302 may reduce surgeon communication burden, such as by providing proactive positive reinforcement that rehab is going well if that is what the data indicates. For example, “This is Dr. Wilson's office—everything looks to be going great.”
- the device 302 may link to a personal assistance device (e.g., an Amazon Echo device from Amazon of Seattle, Wash. or a Google Home device from Google of Mountain View, Calif.) to provide an audible “good job” at home, perhaps from a recorded familiar voice (e.g., a nurse at the doctor's office).
- the device 302 may connect with another display device to provide efficient engagement, for example a hologram or recording of the doctor.
- the device 302 may route initial calls to a call center (e.g., instead of the doctor's office).
- the device 302 may play video messages to patient from surgeon that are keyed to customized thresholds for range of motion, for example “Range of motion looks good,” “Range of motion needs work,” or the like.
- the device 302 may provide early infection detection.
- the device 302 may communicate with a small patch that can detect temperature via embedded temperature sensor.
- a temperature change may be sign of infection, and may be sent to the device 302 , which may alert the user or a doctor.
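- A hedged sketch of the temperature-based alert follows (the baseline window, rise threshold, and sustained-sample count are assumptions): readings from the patch are compared against an earlier baseline, and a sustained rise produces an alert that could be forwarded to the user or a doctor.

```python
# Hypothetical sketch: watch temperature readings from a wearable patch and
# flag a sustained rise that may suggest infection.
def infection_alert(temps_c, baseline_n=24, rise_c=1.0, sustained_n=3):
    """Return True if the last `sustained_n` readings exceed baseline by `rise_c`."""
    if len(temps_c) < baseline_n + sustained_n:
        return False
    baseline = sum(temps_c[:baseline_n]) / baseline_n
    recent = temps_c[-sustained_n:]
    return all(t >= baseline + rise_c for t in recent)


readings = [36.6] * 24 + [37.8, 37.9, 38.1]
print(infection_alert(readings))  # -> True
```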
- the device 302 may be used to monitor compliance with a prescribed prehab, rehab, or therapy program.
- a reward may be provided for compliance, for example a monetary reward from insurance company for compliance with rehab program.
- FIG. 4 illustrates an example 400 range of motion image or video user interface component in accordance with at least one example of this disclosure.
- the range of motion example 400 may be shown all at once or as a series of images.
- the range of motion example 400 may include augmented information (e.g., an indication of pain corresponding to a heart-rate spike).
- the heart-rate spike may be detected by a device worn by the user (e.g., a sensor patch, a wrist-worn device), by a video monitoring system (e.g., via a camera of a mobile phone), or the like.
- a video monitoring system may detect manifestations of pain, such as grimaces.
- audio monitoring may occur through a device or phone that may detect audible manifestations of pain.
- Video or audio monitoring may be performed by a home assistant device or service (e.g. Amazon Echo or Google Home), which may be integrated with the devices or systems described herein.
- the range of motion example 400 illustrates a video captured of a user.
- the range of motion example 400 illustrates an animation of a motion shown for a user to emulate.
- the range of motion example 400 illustrates a video capture of an activity, performed by a professional (e.g., an actor, a doctor, a physical therapist, etc.).
- FIG. 5 illustrates a flowchart showing a technique 500 for using a mobile application for upper extremity patient care in accordance with at least one example of this disclosure.
- the technique 500 includes an operation 502 to capture range of motion or fluidity information using a sensor of a wrist-worn device.
- Operation 502 may include extrapolating elbow pain, fluidity, or range of motion.
- Operation 502 may include capturing a maximum range of motion from day to day activities.
- Operation 502 may further include tracking steps taken by the patient, which may also include monitoring or receiving data regarding a maximum movement of the shoulder.
- Operation 502 may include using automatic identification technology to recognize rotation of the joint.
- Operation 502 may include detecting potentially harmful movements, for example too active, movement is unclean, deviating from plane while exercising, exceeding recommended ROM, or the like.
- Operation 502 may include determining that the patient is sleeping on the wrong side and alerting the patient (e.g., waking the patient up), or determining that repetitive stress is occurring to the patient (e.g., similar to a pitch count for an arm).
- the technique 500 includes an operation 504 to provide feedback, for example, including at least one of: providing a list of activities the patient either cannot do or has not done; providing activity-specific guidance—“we have data collected on tennis activity—[rehab or surgery] will allow you to play tennis better”; providing a high-level assessment—“you would be a candidate for X surgery”; providing proactive positive reinforcement that rehab is going well if that is what the data indicates; providing a video message to the patient from a surgeon that is keyed to customized thresholds for range of motion (e.g., range of motion looks good or range of motion needs work); or using haptic feedback on the wrist-worn device or audio output by a speaker of the wrist-worn device to alert the patient to the potentially harmful movements.
- Operation 504 may further include recognizing patient activities without requiring the patient to manually log the activity (e.g., using the model), for example determining what action was performed by the patient, such as PROM actions: putting on a coat, pulling a coffee mug from the cupboard, or the like.
- the data from an action is logged, stored, and optionally communicated to a care team when the user is a patient.
- the technique 500 includes an optional operation 506 to recommend surgery if needed.
- the technique 500 includes an optional operation 508 to use pre-operative information captured by the wrist-worn device for evaluating post-operative range of motion, fluidity, or pain of the patient (e.g., as captured by the wrist-worn device post-operatively).
- the technique 500 includes an operation 510 to measure a stress response (e.g., heart rate).
- the technique 500 includes an operation 512 to identify a pain location based on a position of the wrist-worn device at a time of the stress response.
- the technique 500 includes an optional operation 514 to tie these heart-rate spikes to the PDF/animation to visualize the pain on the model.
- the technique 500 may include an operation to analyze the angle of selfies taken using a phone camera to obtain data regarding shoulder flexibility.
- the technique 500 may include an operation to provide a picture or animation of the range of motion—exporting the animation into a small file format illustrating the motion, such that selecting the patient shows the path of motion (e.g., on the wrist-worn device).
- the technique 500 may include an operation to measure movement during sleep.
- the technique 500 may include an operation to build a skeletal model.
- this operation may include improving the model dynamics by prompting the user with questions (e.g., generating a model based on training data); for example by using a gyroscope on the wrist-worn device to compare the current data to a range of expectations; obtaining a range of expectations from an opposite arm (e.g., baseline assessments on the good shoulder); developing a baseline from data coming from a healthy population; or using data collected from user input, wrist-worn device sensors, or the like to improve implant products and techniques.
- FIG. 6 illustrates a block diagram of an example machine 600 upon which any one or more of the techniques discussed herein may perform in accordance with some embodiments.
- the machine 600 may operate as a standalone device or may be connected (e.g., networked) to other machines.
- the machine 600 may operate in the capacity of a server machine, a client machine, or both in server-client network environments.
- the machine 600 may act as a peer machine in peer-to-peer (P2P) (or other distributed) network environment.
- the machine 600 may be a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a mobile telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine.
- The term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein, such as cloud computing, software as a service (SaaS), or other computer cluster configurations.
- Machine 600 may include a hardware processor 602 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a hardware processor core, or any combination thereof), a main memory 604 and a static memory 606 , some or all of which may communicate with each other via an interlink (e.g., bus) 608 .
- the machine 600 may further include a display unit 610 , an alphanumeric input device 612 (e.g., a keyboard), and a user interface (UI) navigation device 614 (e.g., a mouse).
- the display unit 610 , input device 612 and UI navigation device 614 may be a touch screen display.
- the machine 600 may additionally include a storage device (e.g., drive unit) 616 , a signal generation device 618 (e.g., a speaker), a network interface device 620 , and one or more sensors 621 , such as a global positioning system (GPS) sensor, compass, accelerometer, or other sensor.
- the machine 600 may include an output controller 628 , such as a serial (e.g., Universal Serial Bus (USB), parallel, or other wired or wireless (e.g., infrared (IR), near field communication (NFC), etc.) connection to communicate or control one or more peripheral devices (e.g., a printer, card reader, etc.).
- the storage device 616 may include a machine readable medium 622 on which is stored one or more sets of data structures or instructions 624 (e.g., software) embodying or utilized by any one or more of the techniques or functions described herein.
- the instructions 624 may also reside, completely or at least partially, within the main memory 604 , within static memory 606 , or within the hardware processor 602 during execution thereof by the machine 600 .
- one or any combination of the hardware processor 602 , the main memory 604 , the static memory 606 , or the storage device 616 may constitute machine readable media.
- While the machine readable medium 622 is illustrated as a single medium, the term “machine readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) configured to store the one or more instructions 624 .
- the term “machine readable medium” may include any medium that is capable of storing, encoding, or carrying instructions for execution by the machine 600 and that cause the machine 600 to perform any one or more of the techniques of the present disclosure, or that is capable of storing, encoding or carrying data structures used by or associated with such instructions.
- Non-limiting machine-readable medium examples may include solid-state memories, and optical and magnetic media.
- the instructions 624 may further be transmitted or received over a communications network 626 using a transmission medium via the network interface device 620 utilizing any one of a number of transfer protocols (e.g., frame relay, internet protocol (IP), transmission control protocol (TCP), user datagram protocol (UDP), hypertext transfer protocol (HTTP), etc.).
- Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone (POTS) networks, and wireless data networks (e.g., Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards known as Wi-Fi®, IEEE 802.16 family of standards known as WiMax®), IEEE 802.15.4 family of standards, peer-to-peer (P2P) networks, among others.
- the network interface device 620 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the communications network 626 .
- the network interface device 620 may include a plurality of antennas to wirelessly communicate using at least one of single-input multiple-output (SIMO), multiple-input multiple-output (MIMO), or multiple-input single-output (MISO) techniques.
- The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine 600 , and includes digital or analog communications signals or other intangible media to facilitate communication of such software.
- Example systems and methods are described below for creating a skeletal model of a user.
- the skeletal model may be generated, in an example, without use of a depth camera or depth sensors or a gait lab.
- the systems and methods may be used to detect movements or a rep count from a video using deep learning techniques by creating a skeletal model using markers. For example, when doing physical therapy or a musculoskeletal movement analysis there may not be an easy or automated way to analyze the movements around joints to check range of motion or rep count. In some examples, it is infeasible to have a user use a depth camera or a gait lab, which may require too much time for each individual.
- the systems and methods described below may be used to train a skeletal model.
- the skeletal model is useful for tracking movement of a user, tracking reps (e.g., for a physical therapy activity or training regimen), tracking time a position is held, monitoring for correct performance of an exercise or movement, or the like.
- the automatic tracking may allow a user to focus on the movement or technique performed without worrying about needing to keep a count, time, or movement in their head.
- the tracked movement, reps, or time may be used for a physical therapy session, such as for rehab or strength training.
- the tracked information is used for rehab after an orthopedic surgery, for example to monitor user progress, provide feedback to the user or a clinician, or the like.
- a trained skeletal model may be used to process images of a user captured by a camera.
- the camera may be a camera of a mobile device (e.g., a cell phone), in an example.
- the camera does not need to be a depth camera or include any additional sensors beyond those of a digital camera.
- FIG. 7 illustrates a diagram showing markers (e.g., 702 - 712 or 714 ) on a person (e.g., 701 or 703 ) for training a machine learning model in accordance with at least one example of this disclosure.
- the markers may be color coded (e.g., each marker or pair of markers, such as left-right pairs, may have different colors).
- the markers on the person 701 may be black or have arbitrary colors.
- the markers on the person 701 may include items affixed to a person.
- the markers on the person 701 may be arranged in particular locations on the person 701 , such as at joints (e.g., shoulder 704 , neck 702 , elbow 706 , wrist 708 , or the like).
- the markers on the person 703 may include a fabric or suit with a pattern printed or designed on it.
- the marker 714 is an example of an item printed or designed on an item worn by the person 703 .
- the markers on the person 703 may be arranged randomly or arbitrarily, or may be evenly spaced or arranged, but do not necessarily have a particular arrangement. In an example, the more markers used, the greater the accuracy for training or for repeated use.
- the markers on the person 703 may cover the person's entire body (e.g., in the example shown in FIG. 7 ) or may only cover a portion of the body. For example, a leg sheath with markers may be worn by the person 703 , giving coverage of only the leg wearing the leg sheath.
- a leg sheath may be attached via Velcro, a zipper, or affixed onto the person's leg.
- the markers may be worn by a patient or a model user, such as for evaluation or training purposes respectively. Training may be conducted with a user making various movements using the markers, after which a trained model may be generated.
- the trained model may be used without markers, in an example.
- a training set for joint movements is created by using color coded markers at various joints (e.g., markers 702 - 712 on person 701 ).
- the image color markers may be placed on the person 701 to track the movements with a camera (e.g., a camera of a mobile device, such as a mobile phone).
- the color markers may be used as landmarks for training a model as the person 701 moves.
- a video may be captured, which may then be analyzed to determine how the markers are moved for each prescribed movement.
- a skeletal model may be generated from the determined movement of the person 701 via the markers (e.g., 702 - 712 ).
- a skeletal model may be generated via the markers (e.g., 714 ) on the person 703 using similar techniques.
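- As one building block for such a skeletal model, the sketch below (illustrative only; the marker names and coordinates are assumed) computes a joint angle, for example elbow flexion, from three tracked marker positions in image coordinates.

```python
# Illustrative sketch: derive a joint angle from three tracked marker positions
# (shoulder, elbow, wrist) in image coordinates.
import math


def joint_angle_deg(proximal, joint, distal):
    """Angle at `joint` formed by the segments joint->proximal and joint->distal."""
    v1 = (proximal[0] - joint[0], proximal[1] - joint[1])
    v2 = (distal[0] - joint[0], distal[1] - joint[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    cos_a = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_a))


shoulder, elbow, wrist = (100, 120), (140, 200), (220, 210)
print(round(joint_angle_deg(shoulder, elbow, wrist), 1))  # included angle at the elbow
```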
- the training may be conducted with a plurality of people wearing the markers (e.g., 10-15 people) to generate an accurate model.
- the plurality of people may have diverging body movement, types, sizes, or positioning.
- the skeletal model that is trained using this technique may be applicable to a broad range of people when used for inferencing due to the different people used in training.
- the skeletal model may be updated recursively using later data as testing data (or the later data may be labeled for use as training data, in an example).
- a plurality of skeletal models may be generated, such as one for a particular type of patient (e.g., based on body size, ailment type, such as ankle, knee, shoulder, etc., or the like, or deformity, such as varus or valgus malalignment of the knee).
- the skeletal models described herein may be generated based on joints and change in joint angles.
- a skeletal model may be scaled to match a patient.
- the side of the person 701 may be determined by using the top-left corner of the image as the origin (0,0), then, for example, scanning left to right and top to bottom, with the first marker identified assigned to the right side of the person 701 .
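- The following sketch illustrates that side-assignment idea under stated assumptions (the scan is taken as primarily left-to-right, and the marker nearer the image's left edge is labeled as the person's right side, as for a subject facing the camera).

```python
# Hypothetical sketch of side assignment for left/right marker pairs using scan
# order from a top-left image origin.
def assign_sides(marker_pairs):
    """marker_pairs: dict of joint name -> [(x, y), (x, y)] image coordinates."""
    labeled = {}
    for joint, pts in marker_pairs.items():
        # Scan primarily left-to-right (smaller x first); the first marker found
        # is labeled as the person's right side (subject facing the camera).
        first, second = sorted(pts, key=lambda p: (p[0], p[1]))
        labeled[f"right_{joint}"] = first
        labeled[f"left_{joint}"] = second
    return labeled


print(assign_sides({"shoulder": [(320, 140), (180, 142)]}))
# -> {'right_shoulder': (180, 142), 'left_shoulder': (320, 140)}
```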
- the markers may be used to later evaluate a patient (e.g., not in a training technique).
- the markers may allow for increased accuracy compared to not using markers.
- the skeletal model generated in a training phase may also be used in conjunction with a patient later wearing markers for increased accuracy.
- a skeletal model may be trained using the markers (e.g., 702 - 712 ) on person 701 , and person 703 may later be a patient wearing markers (e.g., 714 ) on their entire body or a portion of their body. Movement of the person 703 may be tracked using the markers or the trained skeletal model. Results of the movement tracking may include marker identification or modeled movement based on the skeletal model and captured video. The marker identification and modeled movement may be combined to increase accuracy.
- the person 703 may be tracked while moving, for example by capturing video of the person 703 .
- the video may be analyzed to determine a style of walking, in an example.
- the style may be categorized, for example based on the person's body size, habitus, gait or weight management, demographics, familial influences, or the like.
- the style may include factors, such as heel strike, walking pace, where the person 703 holds their weight, sway, gait type, gait style, foot separation, stride length, or the like.
- the person 703 may be a patient, and their style may be captured before a surgical procedure, after, and optionally continue over time.
- the style before the surgical procedure may inform the surgical procedure, and style after the surgical procedure may be compared to the pre-style, along with pain or discomfort information for example, to determine patient satisfaction and how it relates to the walking style.
- the walking style may be used post-operatively to validate the patient's experience (e.g., informing the patient that the style has changed, which may contribute to a feeling of discomfort, but which may improve patient satisfaction by providing the patient understanding why this change has occurred).
- FIG. 8 illustrates an example convolutional neural network (CNN) 800 in accordance with at least one example of this disclosure.
- the CNN 800 uses an image, for example captured by a camera, to identify a body outline and markers affixed to the body of a user. The locations of the markers relative to the body outline are determined using the CNN 800 .
- the CNN 800 may include a plurality of hidden layers used to make the identifications. In an example, other deep learning or machine learning techniques may be used to perform the identifications.
- the input image 802 which may be output from a camera is fed into the layers of the CNN 800 .
- Each convolutional layer (e.g., layers 804 , 806 , 808 , 810 , and 812 ) may include one or more convolution and pooling layers.
- layer 810 may be a separate max pooling layer.
- Example neural network dimensions are shown for each stage; however, one of ordinary skill with the benefit of the present disclosure will appreciate that other dimensions may be utilized.
- the CNN 800 may be trained using the person 701 of FIG. 7 , by tracking captured images or video including identifying locations of markers affixed to the person 701 .
- the CNN 800 may identify the markers at various joints on the person for obtaining a skeletal model. Once the location of the markers is identified, the CNN 800 may be used to determine a side of the body the markers are placed on, such as by using a distance function. Then the CNN 800 may use the color of the markers to determine where each marker fits in the skeletal model.
- a model for different exercises may be created, which may be used with the skeletal model.
- the CNN 800 may be used after training to analyze movements and output inferences related to the movements, such as a rep count based on an exercise, for example.
- the skeletal model may be used to identify the exercises based on an exercise library, for example, which may be created using test subjects. In an example, when the model does not identify the exercise or does not correctly identify the exercise, reinforcement learning may be used to identify the correct exercise for the CNN 800 .
- the components of the CNN 800 may include receiving an input image (e.g., captured by a mobile device) 802 , and using the input image 802 to identify features at different layers. For example, layer 804 may identify a body outline, layer 806 may identify markers on the body, and layer 808 may identify locations of the markers with respect to the body.
- the max pooling layer 810 may be used to downsample the results from layer 808 . Then layer 812 may output coordinates, locations, or dimensions of markers identified on a body of a person in the input image 802 .
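- A minimal sketch of a CNN along these lines is shown below, assuming PyTorch; the layer sizes, marker count, and regression head are illustrative stand-ins rather than the dimensions shown in FIG. 8.

```python
# Sketch (assuming PyTorch): stacked convolution + pooling layers that take a
# camera image and regress (x, y) coordinates for a fixed number of body markers.
import torch
from torch import nn


class MarkerNet(nn.Module):
    def __init__(self, num_markers: int = 6):
        super().__init__()
        self.num_markers = num_markers
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(64, num_markers * 2),  # (x, y) per marker
        )

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        out = self.head(self.features(image))
        return out.view(-1, self.num_markers, 2)


model = MarkerNet()
coords = model(torch.randn(1, 3, 224, 224))
print(coords.shape)  # torch.Size([1, 6, 2])
```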
- FIG. 9 illustrates an example of reinforcement learning for a neural network in accordance with at least one example of this disclosure.
- a system 900 of FIG. 9 may be used to perform the reinforcement learning. For example, a state and reward are sent from an environment to an agent, which outputs an action. The action is received by the environment, and iteratively the reinforcement learning is performed by updating state and reward values and sending those to the agent.
- the reinforcement learning system 900 may be used for post processing, for example on already recorded video or images or on real-time video or images, such as to measure an exercise as part of physical therapy or another activity.
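- The sketch below illustrates the generic state/reward/action loop of FIG. 9 (the environment, action space, and update rule are toy assumptions, not the patent's algorithm); in the exercise-identification setting, the reward could reflect whether the agent's label matched the correct exercise.

```python
# Toy sketch of an agent/environment reinforcement loop: the environment emits a
# state and reward, the agent picks an action, and values are updated iteratively.
import random


def reinforcement_loop(env_step, choose_action, update, episodes=200):
    """Generic agent/environment interaction: state and reward in, action out."""
    for _ in range(episodes):
        state, _, done = env_step(None)              # reset the environment
        while not done:
            action = choose_action(state)
            next_state, reward, done = env_step(action)
            update(state, action, reward)
            state = next_state


# Toy environment: the "correct exercise label" is 2; picking it earns reward 1.
q = {}

def env_step(action):
    if action is None:
        return 0, 0.0, False                          # single-state episode
    return 0, (1.0 if action == 2 else 0.0), True

def choose_action(state, eps=0.2):
    if random.random() < eps:
        return random.randrange(3)                    # explore
    return max(range(3), key=lambda a: q.get((state, a), 0.0))

def update(state, action, reward, lr=0.1):
    key = (state, action)
    q[key] = q.get(key, 0.0) + lr * (reward - q.get(key, 0.0))


reinforcement_loop(env_step, choose_action, update)
print(max(range(3), key=lambda a: q.get((0, a), 0.0)))  # most likely prints 2
```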
- the skeletal model may be generated for example using the CNN 800 of FIG. 8 , based on the markers of FIG. 7 .
- the skeletal model may then be used with images or video captured on a camera, such as a camera of a mobile device (e.g., a phone), without needing or using markers during capture of the exercise.
- the markers may be used for training the skeletal model but do not need to be used for generating inferences using the skeletal model.
- markers may be used for inferences, such as to improve accuracy.
- a portion of a patient's body may have markers while performing the exercise (e.g., a leg sheath).
- a patient's movements may be captured, an exercise identified, and a portion of user anatomy tracked, (optionally in real-time) via a camera of a mobile device (e.g., a phone) without markers on the patient (or with markers to improve accuracy) using the skeletal model that was trained using markers.
- Further image or video processing may be used (e.g., via the skeletal model) to identify information related to an exercise or movement performed by a patient. For example, a walking style may be identified for the patient. In another example, pain based on facial expressions may be identified during the exercise or movement. The pain may be correlated to a particular movement or moment during the exercise to automatically determine what caused the pain. Other techniques (e.g., via a smart watch or other wearable device) as described herein may be used to track movement or pain in addition to or instead of these techniques.
- FIG. 10 illustrates various example user interfaces (e.g., 1002 - 1008 ) for use with a user needing or wearing a sling in accordance with at least one example of this disclosure.
- the user interfaces of FIG. 10 may be displayed on a wrist-worn device, such as a smart-watch.
- the wrist-worn device may include a sensor, such as an accelerometer, to detect movement of an arm of a person wearing the wrist-worn device.
- An example user may include a patient who has had a surgical procedure on their arm, shoulder, chest, etc., or a patient who injured their arm, shoulder, chest, etc.
- the patient may wear a sling for stability or to prevent injury. Slings are effective, when properly used, at stabilizing patient anatomy, but compliance is often difficult, both for the patient to achieve, and for a clinician to measure or detect.
- a patient may have an orthopedic shoulder procedure and be instructed to wear a sling for a period of time (e.g., a few weeks).
- the user interfaces shown in FIG. 10 illustrate various actions that may be detected using a sensor.
- User interface 1002 illustrates motion detection of the wrist-worn device suggestive of shoulder movement.
- a patient may be instructed to keep their shoulder from moving, and the user interface 1002 may be used to provide feedback to the patient that movement has occurred and a reminder to keep movement to a minimum. This information may be educational or serve as a reminder to the patient.
- the user interface 1004 illustrates an example where range of motion of the shoulder of the patient has exceeded a threshold. In this example, the user interface 1004 warns the patient of the unhelpful movement.
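- For illustration, a wrist-worn device might flag such movement with logic similar to the following sketch; the 30-degree limit and the accelerometer axis convention are assumptions, not values from the disclosure.

```python
import math

ROM_LIMIT_DEG = 30.0  # assumed post-operative limit set by the care team

def elevation_angle_deg(ax, ay, az):
    """Estimate arm elevation from the gravity direction seen by a wrist-worn
    accelerometer (units of g), assuming the device z-axis points along the forearm.
    0 degrees = arm hanging down, 90 = arm horizontal. A rough proxy that ignores
    dynamic acceleration."""
    g = math.sqrt(ax * ax + ay * ay + az * az) or 1.0
    return math.degrees(math.acos(max(-1.0, min(1.0, -az / g))))

def check_samples(samples):
    """samples: list of (ax, ay, az) tuples. Returns indices that exceed the limit."""
    return [i for i, s in enumerate(samples) if elevation_angle_deg(*s) > ROM_LIMIT_DEG]

# Arm hanging (about 0 degrees), then raised toward horizontal (about 60 degrees)
samples = [(0.0, 0.0, -1.0), (0.0, 0.0, -1.0), (0.86, 0.0, -0.5)]
violations = check_samples(samples)
if violations:
    print(f"Range of motion exceeded at samples {violations}; show warning (user interface 1004)")
```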
- a patient may be asked to complete assessments of specific movements assigned at set intervals through their clinician's protocol.
- a video may instruct the patient how to complete the assessment.
- Metrics of the patient's performance may be measured during the assessment.
- metrics may include speed (e.g., slow down or speed up movement to demonstrate muscular control and improve accuracy of measurement), plane (e.g., correction of movement or hand positioning to decrease impingement with specific movements and improve accuracy of measurement), smoothness (e.g., provide a metric indicating smoothness through the arc of motion), or compensation (e.g., detection of elbow flexion or scapular compensation with movements).
- Other metrics may include, when the patient has a sling, post-operative sling management, frequency of reaching above a certain height, max ROM during day, arm swing velocity while walking, heart rate during motion (e.g., for pain response), sleep duration or quality, or the like.
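- As one possible reading of the smoothness metric mentioned above, a jerk-based smoothness proxy could be computed as in the following sketch; the specific formula is a common stand-in and is not prescribed by the disclosure.

```python
def smoothness_normalized_jerk(positions, dt):
    """Jerk integrated over the movement and normalized by duration and amplitude
    (larger magnitude = less smooth). positions: 1-D list of joint angles or
    coordinates sampled every dt seconds. This particular formula is an assumption."""
    def diff(x):
        return [(b - a) / dt for a, b in zip(x, x[1:])]
    vel = diff(positions)
    acc = diff(vel)
    jerk = diff(acc)
    duration = dt * (len(positions) - 1)
    amplitude = (max(positions) - min(positions)) or 1.0
    integral = sum(j * j for j in jerk) * dt
    return integral * duration ** 5 / amplitude ** 2

# Smooth ramp vs. a hesitating movement sampled at 50 Hz
smooth = [i * 0.02 for i in range(50)]
shaky = [v + (0.05 if i % 7 == 0 else 0.0) for i, v in enumerate(smooth)]
print(smoothness_normalized_jerk(smooth, 0.02), smoothness_normalized_jerk(shaky, 0.02))
```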
- Example user interfaces 1006 and 1008 illustrate a movement exercise interface and a progress update interface, respectively.
- the movement exercise shown in user interface 1006 facilitates a range of motion test for a patient.
- the patient may select to start, snooze, or dismiss the range of motion test.
- the patient may perform the range of motion test and be provided feedback using a wrist-worn device displaying the user interface 1006 , which may include a sensor for detecting the movement.
- the range of motion may be measured at various locations of the patient's arm, such as flexion or extension, horizontal adduction, abduction, scaption, internal rotation and external rotation, arm at side, arm at 90-degree abduction, extension or internal rotation, or the like.
- the progress update interface as shown in user interface 1008 illustrates a daily living task (e.g., putting on a t-shirt) that the patient has performed (e.g., determined at the suggestion of a user interface, or upon information supplied by the patient, or based on sensor data indicating a particular movement corresponding to putting on a t-shirt).
- the progress update interface may include a question about pain, which may be used by a clinician to monitor patient pain over time or with particular tasks.
- An example task may include reaching above a specified height, such as for washing hair, putting on a t-shirt, brushing hair, reaching above the head to a shelf, etc.
- Another example task may include detecting extension and internal rotation, such as putting a jacket on, tucking in a shirt, putting a bra on, etc.
- a patient may be tracked before, during, and after a surgical procedure.
- an initial state may be captured for the patient, optionally before a disease state occurs.
- the patient state may be captured during a disease state before the surgical procedure.
- the patient's walking style may be captured when healthy or diseased pre-operatively.
- post-operatively, the patient state may be captured (e.g., the walking style).
- the post-operative state may be compared to the pre-operative state (diseased or healthy or both), and results may indicate a change or maintenance of a walking style.
- the patient may be shown, pre-operatively, a video simulating how a walking style or other patient state will be post-operatively.
- the transformation may be shown to the patient to encourage the patient to undertake the surgical procedure to improve their state.
- the patient may be shown, post-operatively, a healthy state or improved state to encourage the patient to put in effort on exercises or physical therapy to improve their state to the healthy or improved state.
- the simulations may be shown based on a skeletal model and historical data of patients similar to the current patient (e.g., having similar or matching disease states, comorbidities, demographics, etc.).
- FIG. 11 illustrates a flowchart showing a technique 1100 for generating a skeletal model in accordance with at least one example of this disclosure.
- the technique 1100 includes an operation 1102 to capture a series of images of a user including a plurality of markers.
- the technique 1100 includes an operation 1104 to generate a training set using identified locations of the plurality of markers throughout the series of images, the identified locations of the plurality of markers corresponding to locations of joints of the user.
- the technique 1100 includes an operation 1106 to train a skeletal model, using the training set, to recognize performed exercises from an exercise library.
- the technique 1100 includes an operation 1108 to output the trained skeletal model configured to be used on a mobile device with images captured by a camera of the mobile device to recognize a performed exercise from the exercise library.
- the images captured by the camera of the mobile device may be 2D images (e.g., the camera is not a depth camera).
- the recognition of the performed exercise (e.g., performed by a user) may be performed automatically (e.g., without a need for additional processing of the captured images).
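- A simplified sketch of the training flow of the technique 1100 is shown below; it substitutes a generic scikit-learn classifier for the CNN-based skeletal model and uses synthetic marker data, so the feature layout, labels, and classifier choice are assumptions for illustration only.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def frames_to_features(marker_frames):
    """marker_frames: array of shape (num_frames, num_markers, 2) holding identified
    marker locations (operation 1104). Each frame is flattened into a feature vector;
    a real pipeline would likely normalize and add temporal context, omitted here."""
    f = np.asarray(marker_frames, dtype=float)
    return f.reshape(len(f), -1)

# Toy training set: two labeled exercises from an exercise library (operation 1106)
rng = np.random.default_rng(0)
squat_frames = rng.normal(0.3, 0.05, size=(40, 12, 2))
raise_frames = rng.normal(0.7, 0.05, size=(40, 12, 2))
X = np.vstack([frames_to_features(squat_frames), frames_to_features(raise_frames)])
y = ["squat"] * 40 + ["arm_raise"] * 40

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Operation 1108: the trained model would be serialized for use on a mobile device
# against 2D camera frames; here we simply classify a held-out synthetic frame.
print(model.predict(frames_to_features(rng.normal(0.7, 0.05, size=(1, 12, 2)))))
```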
- FIG. 12 illustrates a flowchart showing a technique 1200 for determining a gait type of a patient in accordance with at least one example of this disclosure.
- a gait type may be a classification, for example based on training data (e.g., labeled by a gait expert, surgeon, or the like).
- the classification may be generated by a machine learning model, generated using a comparison, etc.
- the gait type may be used in various surgical or post-operative techniques.
- a surgical procedure for a total or partial knee replacement may consider a kinematic alignment of the knee, rather than a mechanical alignment.
- in a mechanical alignment, a knee is aligned to have a minimal or zero varus/valgus angle and is aligned to support the body mechanically.
- this type of idealized solution may not be as comfortable to the patient, causing dissatisfaction with the procedure.
- a kinematic alignment allows a surgeon to align the knee according to how the patient moves, including optionally leaving the knee a few degrees (e.g., 1-4) varus, or a few degrees valgus.
- the kinematic alignment often leaves patients with a more natural feeling knee, improving outcomes by reducing discomfort.
- the kinematic alignment may improve ligament function post-operatively as well.
- One technique for determining kinematic alignment information for use in a surgical procedure includes determining the pre-operative gait of the patient.
- the pre-operative gait may be used to determine an alignment target for surgery, or may be used post-operatively for physical therapy to help the patient return to the pre-operative gait.
- the pre-operative gait may be assessed in a healthy knee to return a patient to a pre-ailment state.
- the gait determined for the patient may be specific to the patient, and one or more gait types may be identified for the patient's gait, such as by a classifier.
- a set of gait types may be generated using labeled training data, and a machine learning model may be trained on the labeled set of gait types.
- the patient's gait may be run through the machine learning model to output a gait type classification.
- the gait type may then be applied to surgical or post-operative uses.
- the gait type may be used to determine instant loading on the knee at various stages of gait (or rising from a seated position).
- the loading on the knee may be used with kinematic alignment to identify pre-operative wear to the native knee.
- Aspects of a surgical procedure including an implanted knee may be determined from the kinematic alignment, pre-operative wear, or loading, such as a bearing surface, bone to prosthesis interface, or soft tissue restraints.
- the gait type may be determined based on pre-operative gait analysis, such as using foot plate heel toe-strike analysis or catwalk or treadmill pressure mapping, which may be compared to dynamic image derived information.
- two factors may be used, including a force of where the foot hits the ground, and a sway of how a patient walks (e.g., sway left or right, or more rigidly straight ahead).
- the distance covered by swaying, the speed of sway, the force on heel or toe, the transfer of weight time from heel to toe, range of motion, speed of walking, stiffness of walking, or other gait factors may be used as weights for a classifier to identify a gait type.
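- For illustration, a simple stand-in for such a classifier could weight and compare these gait factors against stored gait profiles, as in the following sketch; the feature names, weights, and stored profiles are assumed values, not data from the disclosure.

```python
# Hypothetical stored gait profiles keyed by gait type; weights are illustrative only.
STORED_GAITS = {
    "sway_dominant": {"sway_cm": 9.0, "heel_force_n": 520.0, "transfer_s": 0.62, "speed_mps": 1.0},
    "rigid_straight": {"sway_cm": 2.0, "heel_force_n": 610.0, "transfer_s": 0.48, "speed_mps": 1.3},
}
FEATURE_WEIGHTS = {"sway_cm": 1.0, "heel_force_n": 0.01, "transfer_s": 5.0, "speed_mps": 2.0}

def classify_gait(patient_features):
    """Return the stored gait type whose profile is closest to the patient's
    weighted gait features (a simple stand-in for a trained classifier)."""
    def distance(profile):
        return sum(FEATURE_WEIGHTS[k] * abs(patient_features[k] - profile[k])
                   for k in FEATURE_WEIGHTS)
    return min(STORED_GAITS, key=lambda name: distance(STORED_GAITS[name]))

patient = {"sway_cm": 8.1, "heel_force_n": 540.0, "transfer_s": 0.60, "speed_mps": 1.05}
print(classify_gait(patient))  # -> "sway_dominant"
```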
- a patient's gait may be input to the classifier to output a knee loading pattern (e.g., a kinematic alignment), according to an example.
- a recommendation for each of the gait types may be output (e.g., an implant plan, angles, etc.).
- the gait types may be stored in a database including patient gait profiles (e.g., by demographics).
- the patient may be given information based on the gait type. For example, how others with this gait type have fared with various techniques (e.g., mechanical vs. kinematic alignment), what pain or discomfort may be expected with this gait type, a recovery time frame for this gait type, or the like.
- a gait type may be used to determine an optimal functional orientation for the femoral or acetabular components during a hip replacement.
- individual variation in pelvic obliquity on standing, lying, and rising from a seated to a standing position may be determined from the gait type.
- the gait type may be used for custom targeting of component positioning.
- the technique 1200 includes an operation 1202 to capture pre-operative video of a patient.
- the technique 1200 includes an operation 1204 to identify a pre-operative patient gait based on the video, the patient gait identified from walking movement performed by the patient in the pre-operative video.
- the technique 1200 includes an operation 1206 to determine a gait type by comparing the pre-operative patient gait to stored gaits.
- the gait type includes a walking speed.
- the gait type is determined by comparing a force of a foot hitting the ground and a sway of hips of the patient from the pre-operative patient gait to corresponding information of the stored gaits.
- the gait type includes a walking stiffness.
- the gait type includes a pain value.
- the stored gaits are stored in a database of patient gait profiles.
- the gait type is determined using a machine learning model.
- the machine learning model is trained using recognizable gait patterns from a plurality of patients having a plurality of co-morbidities and a plurality of gait types.
- the machine learning model is trained to correlate gait characteristics with knee loading patterns.
- the machine learning model is configured to develop a recommended intervention plan for each of a plurality of gait types.
- the technique 1200 includes an operation 1208 to generate an intervention plan based on the gait type.
- the intervention plan includes a kinematic alignment of a knee of the patient, the kinematic alignment including 1-4 degrees of varus.
- the intervention plan includes a range of motion of the patient.
- the intervention plan includes a relative patient score based on the gait type.
- the intervention plan includes a surgical procedure.
- the technique 1200 includes an operation 1210 to output information indicative of the intervention plan.
- FIG. 13 illustrates a flowchart showing a technique for analyzing movement of an orthopedic patient in accordance with at least one example of this disclosure.
- the technique 1300 includes an operation 1302 to capture images (e.g., using a camera of a user device) of a patient in motion attempting to perform a task.
- the images may be captured after completion of an orthopedic surgery on the patient, such as a knee surgery, a hip surgery, etc.
- the technique 1300 includes an operation 1304 to analyze (e.g., using a processor) the motion to generate a movement metric of the patient corresponding to the task.
- a movement metric may include a range of motion, a pain level, a gait type, a task completion amount, a degree varus or valgus, a limp amount, or the like.
- Operation 1304 may include using data captured by a neck-worn device related to the motion.
- Operation 1304 may include determining a gait characteristic.
- a gait model may be used to determine the gait characteristic.
- the gait model may be generated from a skeletal model of the patient, for example based on pre-operative images of the patient wearing a plurality of colored markers captured by a camera.
- the gait model may be generated from pre-operative images of the patient wearing a suit having a pattern of markers.
- the suit may be a whole body suit or may cover only a part of the patient's body.
- the technique 1300 includes an operation 1306 to compare the movement metric of the patient to a baseline metric for the task.
- the baseline metric may be based on or represent an average performance of the tasks among a population (e.g., of patients with similar comorbidities, age, gender, size, weight, etc.) or the baseline metric may be based on or represent a pre-operative attempt by the patient attempting to perform the task.
- a task may include walking, moving a body part, performing an occupational therapy movement (e.g., everyday activity or movement, such as placing a cup on a shelf, putting on clothes, etc.), performing a physical therapy movement, or the like.
- the technique 1300 includes an operation 1308 to output information indicative of the comparison.
- the information indicative of the comparison may include quantitative information or qualitative information.
- the quantitative information may include a score (e.g., a range of motion score or a pain score).
- the qualitative information may include feedback, such as positive reinforcement (e.g., ‘good job’), instructions (e.g., ‘try walking for 5 minutes each hour’), or adherence information related to the task, for example based on a milestone (e.g., completing a specified range of motion without pain).
- the qualitative information may be provided (e.g., via a speaker or a display, such as a display of a wearable device communicatively coupled to a user device) to the patient or a member of a care team for the patient (e.g., a family member, a therapist, a surgeon, or the like).
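- A minimal sketch of the comparison and qualitative feedback of operations 1306 and 1308 might look like the following; the 10% tolerance and the feedback strings are illustrative assumptions.

```python
def compare_to_baseline(metric, baseline, tolerance=0.10):
    """Compare a movement metric (e.g., knee flexion in degrees) to a baseline
    metric for the task and return quantitative and qualitative pieces.
    The thresholds and messages are assumptions, not values from the disclosure."""
    ratio = metric / baseline if baseline else 0.0
    score = round(100 * ratio)
    if ratio >= 1.0 - tolerance:
        feedback = "Good job - you are at or near your baseline for this task."
    elif ratio >= 0.5:
        feedback = "Keep going - try walking for 5 minutes each hour."
    else:
        feedback = "Below expected progress - consider contacting your care team."
    return {"score": score, "feedback": feedback}

print(compare_to_baseline(metric=95.0, baseline=120.0))   # knee flexion, degrees
print(compare_to_baseline(metric=118.0, baseline=120.0))
```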
- the technique 1300 may include determining whether performing the task caused the patient pain based on facial expressions of the patient in the captured images while the task was performed.
- the technique 1300 may include determining whether performing the task caused the patient pain based on a detected increase in heart rate of the patient, captured by a wearable device communicatively coupled to the user device, while the task was performed.
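- A sketch of how such a heart-rate-based pain check might be tied to a moment and wrist position is shown below; the 15 bpm spike threshold and the sample data are assumptions used only for illustration.

```python
def detect_pain_events(heart_rate, wrist_positions, timestamps, baseline_bpm, spike_bpm=15):
    """Flag moments where heart rate rises well above baseline while a task is
    performed, and report the wrist position at that moment as the likely pain
    location (the 15 bpm spike threshold is an assumption)."""
    events = []
    for bpm, pos, t in zip(heart_rate, wrist_positions, timestamps):
        if bpm - baseline_bpm >= spike_bpm:
            events.append({"time_s": t, "heart_rate": bpm, "wrist_position": pos})
    return events

heart_rate = [72, 75, 93, 74]                       # bpm samples during the task
wrist_positions = ["at side", "raising", "overhead", "lowering"]
timestamps = [0.0, 1.0, 2.0, 3.0]
print(detect_pain_events(heart_rate, wrist_positions, timestamps, baseline_bpm=70))
```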
- Example 1 is a method comprising: capturing range of motion or fluidity information using a sensor of a wrist-worn device; measuring a stress response (e.g., heart rate) as a proxy for pain during arm motions; identifying a pain location based on a position of the wrist-worn device at a time of the stress response; and tying these heart rate spikes to the PDF/animation to visualize the pain on the model.
- Example 2 is a method comprising: analyzing the angle of selfies taken using a phone camera to get data regarding shoulder flexibility; and outputting information to a wrist-worn device.
- Example 3 is a method comprising: capturing range of motion or fluidity information using a sensor of a wrist-worn device; optionally performing one or more actions described below from examples 14-18; recommending surgery if needed; using pre-operative information captured by the wrist-worn device for evaluating post-operative range of motion, fluidity, or pain of patient (e.g., as captured by the wrist-worn device post-operatively).
- Example 4 the subject matter of Examples 1-3 includes, analyzing the angle of selfies taken using phone camera to get data regarding shoulder flexibility.
- Example 5 the subject matter of Examples 1-4 includes, extrapolating elbow pain, fluidity, or range of motion.
- Example 6 the subject matter of Examples 1-5 includes, providing a picture or animation of the range of motion, exporting the animation into a small file format illustrating the motion, and displaying the path of motion when the patient is selected (e.g., on a wrist-worn device).
- Example 7 the subject matter of Examples 1-6 includes, building a skeletal model.
- Example 8 the subject matter of Example 7 includes, improving the model dynamics by prompting the user with questions (e.g., generate a model based on training data); for example by using a gyroscope on the wrist-worn device to compare the current data to a range of expectations; or obtaining a range of expectations from an opposite arm (e.g., baseline assessments on a good shoulder); or developing a baseline from data coming from a healthy population; or using data collected from user input, wrist-worn device sensors, or the like to improve implant products and techniques.
- Example 9 the subject matter of Example 8 includes, recognizing patient activities without requiring the patient to manually log what the activity was (e.g., use the model), for example determine what action was performed by the patient, such as PROMS: e.g., put on a coat, pull a coffee mug from the cupboard, or the like.
- Example 10 the subject matter of Examples 1-9 includes, measuring movement during sleep.
- Example 11 the subject matter of Examples 1-10 includes, capturing a maximum range of motion from day to day activities.
- Example 12 the subject matter of Example 11 includes, tracking steps taken by the patient, which may include also monitoring or receiving data regarding a maximum movement of the shoulder.
- Example 13 the subject matter of Examples 1-12 includes, using automatic identification technology to recognize rotation of the joint.
- Example 14 the subject matter of Examples 1-13 includes, detecting potentially harmful movements, for example too active, movement is unclean, deviating from plane while exercising, exceeding recommended ROM, or the like.
- Example 15 the subject matter of Example 14 includes, using haptic feedback on wrist-worn device or audio output by a speaker of the wrist worn device to alert patient to the potentially harmful movements.
- Example 16 the subject matter of Examples 14-15 includes, determining that the patient is sleeping on a wrong side, and alerting the patient (e.g., waking the patient up).
- Example 17 the subject matter of Examples 14-16 includes, determining that a repetitive stress is occurring to the patient, for example, similar to a pitch count.
- Example 18 the subject matter of Examples 1-17 includes, providing feedback, including at least one of providing a list of activities the patient either can't do or hasn't done; providing activity specific guidance—“we have data collected on tennis activity—[rehab or surgery] will allow you to play tennis better”; providing a high level assessment—“you would be a candidate for X surgery”; providing proactive positive reinforcement that rehab is going well if that is what the data indicates; providing a video message to patient from surgeon that is keyed to customized thresholds for range of motion: e.g., range of motion looks good or range of motion needs work, or the like.
- Example 19 is a method comprising: capturing images, using a camera of a user device, of a patient in motion attempting to perform a task, the images captured after completion of an orthopedic surgery on the patient; analyzing, using a processor, the motion to generate a movement metric of the patient corresponding to the task; comparing, using the processor, the movement metric of the patient to a baseline metric for the task; and presenting, on a display, an indication of the comparison including a qualitative result of the comparison.
- Example 20 the subject matter of Example 19 includes, wherein the display is a display of a wearable device communicatively coupled to the user device.
- Example 21 the subject matter of Examples 19-20 includes, wherein analyzing the motion includes using data captured by a neck-worn device related to the motion.
- Example 22 the subject matter of Examples 19-21 includes, wherein the baseline metric represents an average performance of the task among a population.
- Example 23 the subject matter of Examples 19-22 includes, wherein the baseline metric represents a pre-operative attempt by the patient attempting to perform the task.
- Example 24 the subject matter of Examples 19-23 includes, wherein the task includes walking, and wherein analyzing the motion includes determining a gait characteristic.
- Example 25 the subject matter of Example 24 includes, wherein analyzing the motion includes using a gait model generated from a skeletal model of the patient based on pre-operative images of the patient wearing a plurality of colored markers captured by the camera.
- Example 26 the subject matter of Examples 24-25 includes, wherein analyzing the motion includes using a gait model generated from pre-operative images of the patient wearing a suit having a pattern of markers.
- Example 27 the subject matter of Examples 19-26 includes, determining whether performing the task caused the patient pain based on facial expressions of the patient in the captured images while the task was performed, and wherein presenting the indication further includes presenting information related to the pain.
- Example 28 the subject matter of Examples 19-27 includes, determining whether performing the task caused the patient pain based on a detected increase in heart rate of the patient, captured by a wearable device communicatively coupled to the user device, while the task was performed, and wherein presenting the indication further includes presenting information related to the pain.
- Example 29 the subject matter of Examples 19-28 includes, wherein the qualitative result includes adherence information related to the task and wherein the qualitative result is based on a milestone.
- Example 30 the subject matter of Examples 19-29 includes, wherein the qualitative result is sent to a member of a care team for the patient.
- Example 31 is a method comprising: capturing range of motion information using a sensor of a wrist-worn device of a patient attempting to perform a task, the range of motion information captured after completion of an orthopedic surgery on the patient; determining a stress response, based on a heart rate measured by the sensor, as a proxy for pain during arm motion of a patient; identifying a pain location based on a position of the wrist-worn device at a moment of the stress response; presenting, on a display of the wrist-worn device, an indication related to the pain location including a qualitative result.
- Example 32 the subject matter of Examples 19-31 includes, analyzing, using a processor, the pain location at the moment to generate a movement metric; and comparing, using the processor, the movement metric of the patient to a baseline metric for the task; and wherein the qualitative result corresponds to the comparison.
- Example 33 the subject matter of Examples 31-32 includes, wherein presenting the indication related to the pain location includes providing a picture or animation of the range of motion, the picture or animation indicating the pain location.
- Example 34 the subject matter of Examples 31-33 includes, automatically identifying the task without patient input and initiating the range of motion capture in response to automatically identifying the task without patient input.
- Example 35 the subject matter of Examples 31-34 includes, wherein the qualitative result indicates that the range of motion includes a potentially harmful movement.
- Example 36 the subject matter of Example 35 includes, alerting the patient to the potentially harmful movement using haptic feedback on the wrist-worn device or audio output by a speaker of the wrist worn device.
- Example 37 the subject matter of Examples 31-36 includes, determining that the patient is sleeping on a wrong side, and alerting the patient using the wrist-worn device.
- Example 38 is a method comprising: capturing pre-operative video of a patient; identifying a pre-operative gait of the patient based on walking movement performed by the patient in the pre-operative video; determining a gait type by comparing the pre-operative gait to a plurality of stored gaits; generating an orthopedic intervention plan for the patient based on the gait type; and outputting information indicative of the intervention plan for display.
- Example 39 is at least one machine-readable medium including instructions that, when executed by processing circuitry, cause the processing circuitry to perform operations to implement any of Examples 1-38.
- Example 40 is an apparatus comprising means to implement any of Examples 1-38.
- Example 41 is a system to implement any of Examples 1-38.
- Example 42 is a method to implement any of Examples 1-38.
- Method examples described herein may be machine or computer-implemented at least in part. Some examples may include a computer-readable medium or machine-readable medium encoded with instructions operable to configure an electronic device to perform methods as described in the above examples.
- An implementation of such methods may include code, such as microcode, assembly language code, a higher-level language code, or the like. Such code may include computer readable instructions for performing various methods. The code may form portions of computer program products. Further, in an example, the code may be tangibly stored on one or more volatile, non-transitory, or non-volatile tangible computer-readable media, such as during execution or at other times.
- Examples of these tangible computer-readable media may include, but are not limited to, hard disks, removable magnetic disks, removable optical disks (e.g., compact disks and digital video disks), magnetic cassettes, memory cards or sticks, random access memories (RAMs), read only memories (ROMs), and the like.
Abstract
Description
- This application claims the benefit of priority to U.S. Provisional Applications No. 62/836,338, filed Apr. 19, 2019, titled “Mobile Application For Upper Extremity Patient Care”; 62/853,425, filed May 28, 2019, titled “Mobile Application For Upper Extremity Patient Care”; and 62/966,438, filed Jan. 27, 2020, titled “Mobile Application For Upper Extremity Patient Care”, each of which is hereby incorporated herein by reference in its entirety.
- Orthopedic patient care may require surgical intervention, such as for upper extremities (e.g., a shoulder or elbow), knee, hip, etc. For example when pain becomes unbearable for a patient, surgery may be recommended. Postoperative care may include immobility of a joint ranging from weeks to months, physical therapy, or occupational therapy. Immobilization within the upper extremity may lead to long term issues, such as “Frozen shoulder” where a shoulder capsule thickens and becomes stiff and tight. Physical therapy or occupational therapy may be used to help the patient with recovering strength, everyday functioning, and healing. Current techniques involving immobility, physical therapy, or occupational therapy may not monitor or adequately assess range of motion or for pain before or after surgical intervention.
- In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. Like numerals having different letter suffixes may represent different instances of similar components. The drawings illustrate generally, by way of example, but not by way of limitation, various embodiments discussed in the present document.
-
FIG. 1 illustrates an upper extremity monitoring system in accordance with at least one example of this disclosure. -
FIG. 2 illustrates a device for running a mobile application for upper extremity patient care in accordance with at least one example of this disclosure. -
FIG. 3 illustrates a user interface for use with a mobile application for upper extremity patient care in accordance with at least one example of this disclosure. -
FIG. 4 illustrates an example range of motion image or video user interface component in accordance with at least one example of this disclosure. -
FIG. 5 illustrates a flowchart showing a technique for using a mobile application for upper extremity patient care in accordance with at least one example of this disclosure. -
FIG. 6 illustrates a block diagram of an example machine upon which any one or more of the techniques discussed herein may perform in accordance with at least one example of this disclosure. -
FIG. 7 illustrates a diagram showing markers on a person for training a machine learning model in accordance with at least one example of this disclosure. -
FIG. 8 illustrates an example convolutional neural network in accordance with at least one example of this disclosure. -
FIG. 9 illustrates an example of reinforcement learning for a neural network in accordance with at least one example of this disclosure. -
FIG. 10 illustrates various example user interfaces for use with a user needing or wearing a sling in accordance with at least one example of this disclosure. -
FIG. 11 illustrates a flowchart showing a technique for generating a skeletal model in accordance with at least one example of this disclosure. -
FIG. 12 illustrates a flowchart showing a technique for determining a gait type of a patient in accordance with at least one example of this disclosure. -
FIG. 13 illustrates a flowchart showing a technique for analyzing movement of an orthopedic patient in accordance with at least one example of this disclosure. - Systems and methods described herein may be used for presenting motion feedback for an orthopedic patient. In an example, images may be captured of a patient in motion attempting to perform a task, for example after completion of an orthopedic surgery on the patient. The images may be analyzed to generate a movement metric of the patient corresponding to the task. The movement metric may be compared to a baseline metric (e.g., an average metric or a previous patient metric) for the task. An indication of the comparison may be presented, for example including a qualitative result of the comparison.
- Systems and methods described herein may be used to provide, assess, or augment orthopedic patient care (e.g., upper extremity, hip, knee, etc.). These systems and methods may include pain or range of motion assessment of an upper extremity, providing feedback or information to a patient regarding an upper extremity, or augmenting patient care with physical therapy, occupational therapy, warnings, or the like for an upper extremity.
- Upper extremity procedures may include measuring motion in more than just one plane (e.g., adduction), which may be unlike large joint procedures (hip & knee arthroplasty). In an example, upper extremities may include elbow or shoulder. Elbow may be more of a trauma procedure and relatively rare compared to shoulder. Unlike large joint procedures, there may be an opportunity to monitor the patient and provide feedback prior to the decision on surgical intervention. There also may be a range of surgical interventions of varying invasiveness/significance leading up to total shoulder replacement.
- Hip and knee replacements or orthopedic procedures may result in changes to gait, range of motion, or pain. Postoperative care may include physical or occupational therapy. The systems and methods described herein provide analysis of postoperative care of an orthopedic patient.
-
FIG. 1 illustrates an upperextremity monitoring system 100 in accordance with at least one example of this disclosure. - The
system 100 includes a first wrist-worn device 102 (e.g., a smart watch), and optionally includes a second wrist-worn device 104 or a neck-worn device 106. - One or more of the devices 102-106 may be used as a standalone device, or may work in communication with a mobile phone or with one or more of the other devices 102-106.
- One or more of the devices 102-106 may be used to gathers data on steps, floors, gait, or the like. One or more of the devices 102-106 may be used to delivers notification to a user.
- In an example, one or more of the devices 102-106 may be used to capture range of motion data or movement data, such as shoulder movement or range of motion data (e.g., adduction/abduction, flexion/extension, internal/external rotation), elbow movement or range of motion data (e.g., flexion/extension), wrist movement or range of motion data (e.g., pronation/supination), or the like.
- Qualitative or quantitative data collection may be obtained. In an example for shoulder pain or a shoulder recommendation or procedure, raw range of motion (ROM) data may not be as valuable as describing the type of movement the patient is capable of or the pain in a patient. In an example, one or more of the devices 102-106 may be used to extrapolate elbow information, for example, when a second sensor or device is not used near the elbow. In another example, one or more of the devices 102-106 may be used near the elbow to detect elbow movement, pain, or range of motion data.
- Having a picture or animation of the range of motion may be presented on one or more of the devices 102-106 or on a mobile phone. For example, an animation may be sent as a small file format to illustrate the motion on one or more of the devices 102-106. When a user clicks on a patient image, the user may be presented with the path of motion on one or more of the devices 102-106.
- In an example, one or more of the devices 102-106 may include a bigger, more comfortable watch band than typical for wrist-worn watches.
-
FIG. 2 illustrates a device 202 for running a mobile application for upper extremity patient care in accordance with at least one example of this disclosure. The device 202 may be in communication with a phone or other mobile device 204. The device 202 may be a smart device, such as a smart watch, or other device with a user interface. - In an example, the
device 202 includes a processor, memory, a display (e.g., for displaying a user interface), a transceiver (e.g., for communicating with the mobile device 204), and optionally an inertial measurement unit (IMU), a gyroscope, or an accelerometer. - The
device 202 may be used for advanced data collection. For example, the device 202 may be used to measure a stress response (e.g., heart rate) as a proxy for pain during arm motions. These heart rate spikes may be tied to an animation to visualize the pain on the model. - The
device 202 may be used for “Sling management”—for example, by taking a picture of a sling worn by a patient. Thedevice 202 may be used to analyze an angle of selfies taken using phone camera to obtain data regarding shoulder flexibility. Thedevice 202 may be used to measure movement in sleep. Thedevice 202 may be used for determining a maximum range of motion. For example, thedevice 202 may be used to capture max range of motion from day to day activities (e.g., rather than evaluation setting in a hospital/clinical setting). - When tracking steps, the
device 202 may be used to look for max movement of the shoulder. Thedevice 202 may be used to recognize rotation of the joint. The device prompts on thedevice 202 may be used to obtain a user's context—for example, is the user using a 1 lb weight, a 2 lb weight, etc. For a user carrying themobile device 204 in a pocket, reliablemobile device 204 position may be used as another sensor, for example to track movements. - The
device 202 may be used to derive data from combining the accelerometer with an altimeter—certain motions may only occur from certain positions. Thedevice 202 may be used to compare movement information to a range of expectations—for example, the gyroscope on thedevice 202 may be used to compare the current data to a range of expectations. - The
device 202 may be used to obtain a range of expectations from a non-operative arm—for example, baseline assessments on a good shoulder of a patient. Thedevice 202 may be used to develop a baseline from data coming from healthy users. Thedevice 202 may be used to collect data to improve implant products and techniques. - Advanced feedback may be generated, for example, by identifying potentially harmful movements—too active, movement is unclean, deviating from plane while exercising, exceeding recommended ROM, etc. The
device 202 may provide haptic feedback or audio to alert the patient. The alert or feedback may include a warning, such as an indication that the user is sleeping on the wrong side. Thedevice 202 may be used to provide an alarm and wake the patient up. The alert or feedback may include a warning for repetitive stress (e.g., a pitch count for an arm). Thedevice 202 and themobile device 204 may be used in combination to provide a target in augmented reality. Thedevice 202 and themobile device 204 may be used in combination, for example using a camera of themobile device 204 to record at least one repetition of a movement (e.g., automatically captured when the user starts based on sensor data recorded by the device 202). The recording may be sent to a physical therapist or doctor to review. -
FIG. 3 illustrates a user interface for use with a mobile application for upper extremity patient care in accordance with at least one example of this disclosure. The user interface may be displayed on a device 302, such as a smart watch. - The
device 302 may show an animation, an image, a view of the user, or the like. For example, thedevice 302 may display a picture or animation of a range of motion. In an example, a user may select an image of a patient and the path of motion may be displayed in thedevice 302 in response. - The range of motion and fluidity of movement may be detected using the
device 302. In an example, a model of dynamics of a user's body may be generated. For example, a skeletal model may be generated. The model dynamics may be improved by prompting the user with questions, such as “what activity did you just do”, for example on thedevice 202 or a mobile phone. A model may be trained on this data. In an example, a machine learning model may be trained. The model may be used to automatically recognize patient activities (ear, movements, pain, exercises, etc.), such as without requiring the patient to manually log the activity. In an example, daily living activities may be identified automatically. In an example, the dynamic model may include an avatar, an emoji (e.g., an Animoji—an animal emoji), or the like. In an example, activities may include Passive Range of Motion (PROM) actions: e.g., put on a coat, pull a coffee mug from the cupboard, etc. - The
device 302 may be used for monitoring and alerting a user to awareness of the user's situation (ear, movement, pain, range of motion, fluidity of movement, etc.), compared to other similarly situated users (e.g., by age, condition, etc.) or to an average user. Thedevice 302 may be used to inform a user that the user is worse off than an average person with the user's demographics. Thedevice 302 may monitor a patient that does not elect to have a procedure (e.g., surgery). - The
device 302 may monitor sleep, because users may wait until they cannot sleep to decide to get a procedure. Lack of sleep may be used as a notification trigger, and a notification may be displayed on thedevice 302. - The
device 302 may track motion of a user, and inform the user of results, such as “you are at 50% ability compared to an average person in your demographic.” Thedevice 302 may provide a list of activities the patient either cannot do or has not done. Thedevice 302 may provide activity specific guidance, for example “we have data collected on tennis activity—[rehab or surgery] will allow you to play tennis better.” Thedevice 302 may provide a high-level assessment, for example “you would be a candidate for X surgery,” “people of your profile got a surgical intervention,” or the like. - The
device 302 may provide objective data, for example that the patient is in excessive pain or has poor ROM, which may motivate the patient to seek a surgery election. Thedevice 302 may provide a list of doctors. - In an example, the
device 302 may provide a notification (e.g., related to lack of sleep, motion of a user, regarding objective data, or the like) to a user or a member of a care team that has the user as a patient (e.g., to a surgeon, a physical therapist, a nurse, etc.). The notification may be used by the member of the care team to prompt discussion with the patient regarding surgical intervention (e.g., a notification may include information indicating a surgical intervention is required or suggested.). In an example, the notification to the member of the care team may include an assessment of the patient automatically or may be stored for future access by a member of the care team with proper access credentials. The notification may include results collected by thedevice 302. - In an example, the
device 302 may provide telemedicine solutions. For example, a nurse practitioner may give virtual appointments through thedevice 302. Thedevice 302 may link patients to a surgeon in their area. Thedevice 302 may reduce surgeon communication burden, such as by providing proactive positive reinforcement that rehab is going well if that is what the data indicates. For example, “This is Dr. Wilson's office—everything looks to be going great.” - Spouse/family members drive calls to the surgeon, so they may also be informed. For example, the
device 302 may link to a personal assistance device (e.g., an Amazon Echo device from Amazon of Seattle, Wash. or a Google Home device from Google of Mountain View, Calif.) to provide an audible “good job” at home, perhaps from a recorded familiar voice (e.g., a nurse at the doctor's office). Thedevice 302 may connect with another display device to provide efficient engagement, for example a hologram or recording of the doctor. In an example, thedevice 302 may route initial calls to a call center (e.g., instead of the doctor's office). Thedevice 302 may play video messages to patient from surgeon that are keyed to customized thresholds for range of motion, for example “Range of motion looks good,” “Range of motion needs work,” or the like. - The
device 302 may provide early infection detection. For example, thedevice 302 may communicate with a small patch that can detect temperature via embedded temperature sensor. A temperature change may be sign of infection, and may be sent to thedevice 302, which may alert the user or a doctor. - In an example, the
device 302 may be used to monitor compliance with a prescribed prehab, rehab, or therapy program. A reward may be provided for compliance, for example a monetary reward from insurance company for compliance with rehab program. -
FIG. 4 illustrates an example 400 range of motion image or video user interface component in accordance with at least one example of this disclosure. The range of motion example 400 may be shown all at once or as a series of images. The range of motion example 400 may include augmented information (e.g., an indication of pain corresponding to a heart-rate spike). The heart-rate spike may be detected by a device worn by the user (e.g., a sensor patch, a wrist-worn device), by a video monitoring system (e.g., via a camera of a mobile phone), or the like. In an example, a video monitoring system may detect manifestations of pain, such as grimaces. In an example, audio monitoring may occur through a device or phone that may detect audible manifestations of pain. Video or audio monitoring may be performed by a home assistant device or service (e.g. Amazon Echo or Google Home), which may be integrated with the devices or systems described herein. - In an example, the range of motion example 400 illustrates a video captured of a user. In another example, the range of motion example 400 illustrates an animation of a motion shown for a user to emulate. In yet another example, the range of motion example 400 illustrates a video capture of an activity, performed by a professional (e.g., an actor, a doctor, a physical therapist, etc.).
-
FIG. 5 illustrates a flowchart showing a technique 500 for using a mobile application for upper extremity patient care in accordance with at least one example of this disclosure. - The
technique 500 includes an operation 502 to capture range of motion or fluidity information using a sensor of a wrist-worn device. Operation 502 may include extrapolating elbow pain, fluidity, or range of motion. Operation 502 may include capturing a maximum range of motion from day to day activities. Operation 502 may further include tracking steps taken by the patient, which may include also monitoring or receiving data regarding a maximum movement of the shoulder. Operation 502 may include using automatic identification technology to recognize rotation of the joint. Operation 502 may include detecting potentially harmful movements, for example too active, movement is unclean, deviating from plane while exercising, exceeding recommended ROM, or the like. Operation 502 may include determining that the patient is sleeping on a wrong side, and alerting the patient (e.g., waking the patient up), further comprising determining that a repetitive stress is occurring to the patient, for example, similar to a pitch count. - The
technique 500 includes an operation 504 to provide feedback, for example, including at least one of providing a list of activities the patient either can't do or hasn't done; providing activity specific guidance—"we have data collected on tennis activity—[rehab or surgery] will allow you to play tennis better"; providing a high level assessment—"you would be a candidate for X surgery"; providing proactive positive reinforcement that rehab is going well if that is what the data indicates; providing a video message to patient from surgeon that is keyed to customized thresholds for range of motion: e.g., range of motion looks good or range of motion needs work, using haptic feedback on the wrist-worn device or audio output by a speaker of the wrist-worn device to alert the patient to the potentially harmful movements, or the like. -
Operation 504 may further include recognizing patient activities without requiring the patient to manually log what the activity was (e.g., use the model), for example determine what action was performed by the patient, such as PROMS: e.g., put on a coat, pull a coffee mug from the cupboard, or the like. In an example, after recognizing patient activities, the data from an action is logged, stored, and optionally communicated to a care team when the user is a patient. - The
technique 500 includes an optional operation 506 to recommend surgery if needed. The technique 500 includes an optional operation 508 to use pre-operative information captured by the wrist-worn device for evaluating post-operative range of motion, fluidity, or pain of the patient (e.g., as captured by the wrist-worn device post-operatively). - The
technique 500 includes an operation 510 to measure a stress response (e.g., heart rate). The technique 500 includes an operation 512 to identify a pain location based on a position of the wrist-worn device at a time of the stress response. The technique 500 includes an optional operation 514 to tie these heart rate spikes to the PDF/animation to visualize the pain on the model. - The
technique 500 may include an operation to analyze the angle of selfies taken using a phone camera to get data regarding shoulder flexibility. The technique 500 may include an operation to provide a picture or animation of the range of motion, export the animation into a small file format illustrating the motion, and display the path of motion when the patient is selected (e.g., on a wrist-worn device). The technique 500 may include an operation to measure movement during sleep. - The
technique 500 may include an operation to build a skeletal model. For example, this operation may include improving the model dynamics by prompting user with questions (e.g., generate a model based on training data); for example by using a gyroscope on the wrist-worn device to compare the current data to a range of expectations; or obtaining a range of expectations from an opposite arm (e.g., baseline assessments on good shoulder); or developing a baseline from data coming from healthy population; or using data collected from user input, wrist-worn device sensors, or the like to improve implant products and techniques. -
FIG. 6 illustrates a block diagram of anexample machine 600 upon which any one or more of the techniques discussed herein may perform in accordance with some embodiments. In alternative embodiments, themachine 600 may operate as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, themachine 600 may operate in the capacity of a server machine, a client machine, or both in server-client network environments. In an example, themachine 600 may act as a peer machine in peer-to-peer (P2P) (or other distributed) network environment. Themachine 600 may be a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a mobile telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein, such as cloud computing, software as a service (SaaS), other computer cluster configurations. - Machine (e.g., computer system) 600 may include a hardware processor 602 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a hardware processor core, or any combination thereof), a
main memory 604 and astatic memory 606, some or all of which may communicate with each other via an interlink (e.g., bus) 608. Themachine 600 may further include adisplay unit 610, an alphanumeric input device 612 (e.g., a keyboard), and a user interface (UI) navigation device 614 (e.g., a mouse). In an example, thedisplay unit 610,input device 612 andUI navigation device 614 may be a touch screen display. Themachine 600 may additionally include a storage device (e.g., drive unit) 616, a signal generation device 618 (e.g., a speaker), anetwork interface device 620, and one ormore sensors 621, such as a global positioning system (GPS) sensor, compass, accelerometer, or other sensor. Themachine 600 may include anoutput controller 628, such as a serial (e.g., Universal Serial Bus (USB), parallel, or other wired or wireless (e.g., infrared (IR), near field communication (NFC), etc.) connection to communicate or control one or more peripheral devices (e.g., a printer, card reader, etc.). - The
storage device 616 may include a machinereadable medium 622 on which is stored one or more sets of data structures or instructions 624 (e.g., software) embodying or utilized by any one or more of the techniques or functions described herein. Theinstructions 624 may also reside, completely or at least partially, within themain memory 604, withinstatic memory 606, or within thehardware processor 602 during execution thereof by themachine 600. In an example, one or any combination of thehardware processor 602, themain memory 604, thestatic memory 606, or thestorage device 616 may constitute machine readable media. - While the machine
readable medium 622 is illustrated as a single medium, the term “machine readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) configured to store the one ormore instructions 624. The term “machine readable medium” may include any medium that is capable of storing, encoding, or carrying instructions for execution by themachine 600 and that cause themachine 600 to perform any one or more of the techniques of the present disclosure, or that is capable of storing, encoding or carrying data structures used by or associated with such instructions. Non-limiting machine-readable medium examples may include solid-state memories, and optical and magnetic media. - The
instructions 624 may further be transmitted or received over a communications network 626 using a transmission medium via the network interface device 620 utilizing any one of a number of transfer protocols (e.g., frame relay, internet protocol (IP), transmission control protocol (TCP), user datagram protocol (UDP), hypertext transfer protocol (HTTP), etc.). Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone (POTS) networks, and wireless data networks (e.g., the Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards known as Wi-Fi®, the IEEE 802.16 family of standards known as WiMax®), the IEEE 802.15.4 family of standards, peer-to-peer (P2P) networks, among others. In an example, the network interface device 620 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the communications network 626. In an example, the network interface device 620 may include a plurality of antennas to wirelessly communicate using at least one of single-input multiple-output (SIMO), multiple-input multiple-output (MIMO), or multiple-input single-output (MISO) techniques. The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine 600, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software. - Example systems and methods are described below for creating a skeletal model of a user. The skeletal model may be generated, in an example, without use of a depth camera, depth sensors, or a gait lab. The systems and methods may be used to detect movements or a rep count from a video using deep learning techniques by creating a skeletal model using markers. For example, when doing physical therapy or a musculoskeletal movement analysis, there may not be an easy or automated way to analyze the movements around joints to check range of motion or rep count. In some examples, it is infeasible to have a user use a depth camera or a gait lab, which may require too much time for each individual.
- The systems and methods described below may be used to train a skeletal model. The skeletal model is useful for tracking movement of a user, tracking reps (e.g., for a physical therapy activity or training regimen), tracking time a position is held, monitoring for correct performance of an exercise or movement, or the like. The automatic tracking may allow a user to focus on the movement or technique performed without worrying about needing to keep a count, time, or movement in their head. The tracked movement, reps, or time may be used for a physical therapy session, such as for rehab or strength training. In an example, the tracked information is used for rehab after an orthopedic surgery, for example to monitor user progress, provide feedback to the user or a clinician, or the like.
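- As an illustration of the automatic rep and hold-time tracking described above, the following is a minimal, hypothetical Python sketch. It assumes a per-frame joint-angle series has already been extracted by the skeletal model; the angle thresholds and the synthetic signal are illustrative only and are not taken from this disclosure.

```python
# Hypothetical sketch (not from the disclosure): counting reps and hold time from a
# per-frame joint-angle series produced by a skeletal model. Thresholds are illustrative.
import numpy as np

def count_reps(angles_deg, flexed_below=60.0, extended_above=150.0):
    """Count one rep each time the joint goes extended -> flexed -> extended."""
    reps, state = 0, "extended"
    for a in angles_deg:
        if state == "extended" and a < flexed_below:
            state = "flexed"
        elif state == "flexed" and a > extended_above:
            state = "extended"
            reps += 1
    return reps

def hold_time_seconds(angles_deg, fps, held_above=150.0):
    """Total time the joint is held beyond a target angle."""
    return float(np.sum(np.asarray(angles_deg) > held_above)) / fps

angles = 90 + 70 * np.sin(np.linspace(0, 6 * np.pi, 300))  # synthetic flexion/extension trace
print(count_reps(angles), hold_time_seconds(angles, fps=30))
```

A simple state machine over flexion and extension thresholds is used so that small oscillations near a threshold are not counted as extra reps.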
- A trained skeletal model may be used to process images of a user captured by a camera. The camera may be a camera of a mobile device (e.g., a cell phone), in an example. The camera does not need to be a depth camera or include any additional sensors beyond those of a digital camera.
-
FIG. 7 illustrates a diagram showing markers (e.g., 702-712 or 714) on a person (e.g., 701 or 703) for training a machine learning model in accordance with at least one example of this disclosure. In an example, the markers (e.g., 702-712) may be color coded (e.g., each marker or pair of markers, such as left-right pairs, may have a different color). In another example, the markers on the person 701 may be black or have arbitrary colors. - The markers on the
person 701 may include items affixed to a person. The markers on the person 701 may be arranged in particular locations on the person 701, such as at joints (e.g., shoulder 704, neck 702, elbow 706, wrist 708, or the like). - The markers on the
person 703 may include a fabric or suit with a pattern printed or designed on it. The marker 714 is an example of an item printed or designed on an item worn by the person 703. The markers on the person 703 may be arranged randomly or arbitrarily, or may be evenly spaced or arranged, but do not necessarily have a particular arrangement. In an example, the more markers used, the more accurate the training and subsequent use. The markers on the person 703 may cover the person's entire body (e.g., in the example shown in FIG. 7) or may only cover a portion of the body. For example, a leg sheath with markers may be worn by the person 703, giving coverage of only the leg wearing the leg sheath. Where clinical interest is in certain movements (e.g., gait or movement of a leg), only covering a portion of the person 703 corresponding to that interest may be sufficient for capturing the movements. In an example, a leg sheath may be attached via Velcro or a zipper, or otherwise affixed onto the person's leg. - The markers may be worn by a patient or a model user, such as for evaluation or training purposes, respectively. Training may be conducted with a user making various movements using the markers, after which a trained model may be generated. The trained model may be used without markers, in an example.
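- The following is a hedged, illustrative sketch of how color-coded markers might be located in a captured frame, assuming OpenCV is used for the image processing (the disclosure does not name a particular library). The HSV color ranges and marker names are hypothetical and would be calibrated for a real marker set.

```python
# Hypothetical sketch of locating color-coded markers in a frame, assuming OpenCV.
# The HSV ranges and marker names are illustrative and would be calibrated in practice.
import cv2
import numpy as np

MARKER_COLORS = {                                   # name -> (lower HSV, upper HSV)
    "right_shoulder": ((0, 120, 80), (10, 255, 255)),
    "left_shoulder": ((100, 120, 80), (130, 255, 255)),
}

def find_markers(frame_bgr):
    """Return {marker_name: (x, y)} centroids for markers visible in the frame."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    found = {}
    for name, (lo, hi) in MARKER_COLORS.items():
        mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
        m = cv2.moments(mask)
        if m["m00"] > 0:                            # at least one matching pixel
            found[name] = (m["m10"] / m["m00"], m["m01"] / m["m00"])
    return found
```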
- In an example, a training set for joint movements is created by using color-coded markers at various joints (e.g., markers 702-712 on person 701). The color markers may be placed on the
person 701 to track the movements with a camera (e.g., a camera of a mobile device, such as a mobile phone). The color markers may be used as landmarks for training a model as the person 701 moves. A video may be captured, which may then be analyzed to determine how the markers move for each prescribed movement. A skeletal model may be generated from the determined movement of the person 701 via the markers (e.g., 702-712). In another example, a skeletal model may be generated via the markers (e.g., 714) on the person 703 using similar techniques. - The training may be conducted with a plurality of people wearing the markers (e.g., 10-15 people) to generate an accurate model. The plurality of people may have diverging body movements, types, sizes, or positioning. The skeletal model that is trained using this technique may be applicable to a broad range of people when used for inferencing due to the different people used in training.
- In an example, the skeletal model may be updated recursively using later data as testing data (or the later data may be labeled for use as training data, in an example). In another example, a plurality of skeletal models may be generated, such as one for a particular type of patient (e.g., based on body size, ailment type, such as ankle, knee, shoulder, etc., or the like, or deformity, such as varus or valgus malalignment of the knee).
- The skeletal models described herein may be generated based on joints and change in joint angles. In an example, a skeletal model may be scaled to match a patient. In an example, the side of the
person 701 may be determined by using the top left corner of the image as the origin (0,0) and then, for example, scanning left to right and top to bottom, with the first marker identified as the right side of the person 701.
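- A minimal sketch of this side-assignment rule, under the stated assumptions that the image origin is the top left corner and that the subject faces the camera (so the subject's right side appears toward the image's left), is shown below; the marker names and coordinates are hypothetical.

```python
# Minimal sketch of the side-assignment rule above: markers are scanned in image order
# (top to bottom, then left to right) from the (0, 0) top-left origin, and the first
# shoulder-level marker encountered is treated as the subject's right side.
def scan_order(markers):
    """markers: {name: (x, y)} pixel coordinates. Returns names in top-to-bottom,
    left-to-right scan order."""
    return sorted(markers, key=lambda n: (markers[n][1], markers[n][0]))

detected = {"shoulder_a": (220.0, 140.0), "shoulder_b": (420.0, 150.0), "neck": (320.0, 120.0)}
order = scan_order(detected)
right_shoulder = next(n for n in order if n.startswith("shoulder"))
print(order, right_shoulder)     # shoulder_a is encountered first and taken as the right side
```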
- In an example, the markers may be used to later evaluate a patient (e.g., not in a training technique). The markers may allow for increased accuracy compared to not using markers. The skeletal model generated in a training phase may also be used in conjunction with a patient later wearing markers for increased accuracy. For example, a skeletal model may be trained using the markers (e.g., 702-712) on person 701, and person 703 may later be a patient wearing markers (e.g., 714) on their entire body or a portion of their body. Movement of the person 703 may be tracked using the markers or the trained skeletal model. Results of the movement tracking may include marker identification or modeled movement based on the skeletal model and captured video. The marker identification and modeled movement may be combined to increase accuracy. - The
person 703 may be tracked while moving, for example by capturing video of the person 703. The video may be analyzed to determine a style of walking, in an example. The style may be categorized, for example based on the person's body size, habitus, gait or weight management, demographics, familial influences, or the like. The style may include factors such as heel strike, walking pace, where the person 703 holds their weight, sway, gait type, gait style, foot separation, stride length, or the like. The person 703 may be a patient, and their style may be captured before a surgical procedure, after it, and optionally over time. The style before the surgical procedure may inform the surgical procedure, and the style after the surgical procedure may be compared to the pre-operative style, along with pain or discomfort information, for example, to determine patient satisfaction and how it relates to the walking style. In an example, the walking style may be used post-operatively to validate the patient's experience (e.g., informing the patient that the style has changed, which may contribute to a feeling of discomfort, but which may improve patient satisfaction by helping the patient understand why this change has occurred).
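- As a hypothetical sketch, a few of the walking-style factors listed above (pace, sway, stride length) might be computed from tracked keypoints as follows; the input format (per-frame pelvis positions plus detected heel-strike times) and the synthetic values are assumptions for illustration.

```python
# Illustrative sketch (assumed input format): turning tracked keypoints into a few of
# the walking-style factors mentioned above, such as pace, sway, and stride length.
import numpy as np

def walking_style_features(pelvis_xy, heel_strike_times_s, duration_s):
    """pelvis_xy: per-frame (lateral, forward) pelvis positions; heel_strike_times_s:
    detected heel-strike times in seconds; duration_s: length of the recording."""
    pelvis_xy = np.asarray(pelvis_xy, dtype=float)
    strikes = np.asarray(heel_strike_times_s, dtype=float)
    stride_times = np.diff(strikes)
    distance = np.linalg.norm(np.diff(pelvis_xy, axis=0), axis=1).sum()
    return {
        "cadence_spm": 60.0 * len(strikes) / duration_s,          # steps per minute
        "mean_stride_time_s": float(stride_times.mean()) if len(stride_times) else None,
        "lateral_sway": float(pelvis_xy[:, 0].std()),             # spread of side-to-side motion
        "mean_stride_length": float(distance / max(len(strikes) - 1, 1)),
    }

pelvis = [(0.30 + 0.03 * ((-1) ** i), 0.02 * i) for i in range(120)]   # synthetic pelvis path
print(walking_style_features(pelvis, heel_strike_times_s=[0.5, 1.1, 1.7, 2.3], duration_s=4.0))
```
-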
FIG. 8 illustrates an example convolutional neural network (CNN) 800 in accordance with at least one example of this disclosure. The CNN 800 uses an image, for example captured by a camera, to identify a body outline and markers affixed to the body of a user. The locations of the markers relative to the body outline are determined using the CNN 800. The CNN 800 may include a plurality of hidden layers used to make the identifications. In an example, other deep learning or machine learning techniques may be used to perform the identifications. - The
input image 802, which may be output from a camera, is fed into the layers of the CNN 800. Each convolutional layer (e.g., layers 804, 806, 808, 810, and 812) may include one or more convolution and pooling layers. In an example, layer 810 may be a separate max pooling layer. Example neural network dimensions are shown for each stage; however, one of ordinary skill with the benefit of the present disclosure will appreciate that other dimensions may be utilized. - The
CNN 800 may be trained using the person 701 of FIG. 7, by tracking captured images or video, including identifying locations of markers affixed to the person 701. The CNN 800 may identify the markers at various joints on the person for obtaining a skeletal model. Once the location of the markers is identified, the CNN 800 may be used to determine a side of the body the markers are placed on, such as by using a distance function. Then the CNN 800 may use the color of the markers to determine where each marker fits in the skeletal model. Once the skeletal model is created using the CNN 800, a model for different exercises may be created, which may be used with the skeletal model. The CNN 800 may be used after training to analyze movements and output inferences related to the movements, such as a rep count based on an exercise, for example. The skeletal model may be used to identify the exercises based on an exercise library, for example, which may be created using test subjects. In an example, when the model does not identify the exercise or does not correctly identify the exercise, reinforcement learning may be used to identify the correct exercise for the CNN 800. - The components of the
CNN 800 may include receiving an input image 802 (e.g., captured by a mobile device), and using the input image 802 to identify features at different layers. For example, layer 804 may identify a body outline, layer 806 may identify markers on the body, and layer 808 may identify locations of the markers with respect to the body. The max pooling layer 810 may be used to downsample the results from layer 808. Then layer 812 may output coordinates, locations, or dimensions of markers identified on a body of a person in the input image 802.
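- The following is a hedged PyTorch sketch in the spirit of the staged CNN described above: stacked convolution and pooling layers followed by a head that outputs marker coordinates. The layer sizes, the number of markers, and the pooling arrangement are illustrative and are not the specific dimensions of FIG. 8.

```python
# Hedged PyTorch sketch in the spirit of FIG. 8: stacked convolution/pooling stages and a
# head regressing marker coordinates. Layer sizes and marker count are illustrative only.
import torch
import torch.nn as nn

class MarkerNet(nn.Module):
    def __init__(self, num_markers=12):
        super().__init__()
        self.num_markers = num_markers
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # coarse body outline
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # marker blobs
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # marker locations
        )
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, num_markers * 2),                               # (x, y) per marker
        )

    def forward(self, image):                                             # image: (N, 3, H, W)
        return self.head(self.features(image)).view(-1, self.num_markers, 2)

coords = MarkerNet()(torch.randn(1, 3, 128, 128))
print(coords.shape)                                                       # torch.Size([1, 12, 2])
```
-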
FIG. 9 illustrates an example of reinforcement learning for a neural network in accordance with at least one example of this disclosure. A system 900 of FIG. 9 may be used to perform the reinforcement learning. For example, a state and reward are sent from an environment to an agent, which outputs an action. The action is received by the environment, and iteratively the reinforcement learning is performed by updating state and reward values and sending those to the agent. - The
reinforcement learning system 900 may be used for post processing, for example on already recorded video or images or on real-time video or images, such as to measure an exercise as part of physical therapy or another activity. The skeletal model may be generated, for example, using the CNN 800 of FIG. 8, based on the markers of FIG. 7. The skeletal model may then be used with images or video captured by a camera, such as a camera of a mobile device (e.g., a phone), without needing or using markers during capture of the exercise. For example, the markers may be used for training the skeletal model but do not need to be used for generating inferences using the skeletal model. In an example, markers may be used for inferences, such as to improve accuracy. In an example, a portion of a patient's body may have markers while performing the exercise (e.g., a leg sheath). - In an example, a patient's movements may be captured, an exercise identified, and a portion of user anatomy tracked (optionally in real time) via a camera of a mobile device (e.g., a phone) without markers on the patient (or with markers to improve accuracy) using the skeletal model that was trained using markers.
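- A minimal, self-contained sketch of the agent-environment loop described for FIG. 9 is shown below, using a toy two-action problem and a tabular update. In this context the reward could, as an assumption, reflect whether an exercise was identified correctly; the toy environment, learning rate, and exploration rate are illustrative only.

```python
# Minimal, self-contained sketch of the agent-environment loop: a toy two-action problem
# with a tabular update. The reward here is assumed to reflect whether an exercise was
# identified correctly; the environment and parameters are illustrative.
import random

class ToyEnv:
    """Single-state environment in which action 1 ('correct identification') earns reward 1."""
    def reset(self):
        return 0
    def step(self, action):
        return 0, (1.0 if action == 1 else 0.0), True        # next_state, reward, done

class QAgent:
    def __init__(self, n_actions=2, lr=0.5, eps=0.2):
        self.q, self.lr, self.eps = [0.0] * n_actions, lr, eps
    def act(self, state):
        if random.random() < self.eps:
            return random.randrange(len(self.q))             # explore
        return max(range(len(self.q)), key=self.q.__getitem__)
    def learn(self, state, action, reward, next_state):
        self.q[action] += self.lr * (reward - self.q[action])

env, agent = ToyEnv(), QAgent()
for _ in range(50):                                          # iterate state/reward/action updates
    state, done = env.reset(), False
    while not done:
        action = agent.act(state)
        next_state, reward, done = env.step(action)
        agent.learn(state, action, reward, next_state)
        state = next_state
print(agent.q)                                               # action 1 should dominate
```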
- Further image or video processing may be used (e.g., via the skeletal model) to identify information related to an exercise or movement performed by a patient. For example, a walking style may be identified for the patient. In another example, pain based on facial expressions may be identified during the exercise or movement. The pain may be correlated to a particular movement or moment during the exercise to automatically determine what caused the pain. Other techniques (e.g., via a smart watch or other wearable device) as described herein may be used to track movement or pain in addition to or instead of these techniques.
-
FIG. 10 illustrates various example user interfaces (e.g., 1002-1008) for use with a user needing or wearing a sling in accordance with at least one example of this disclosure. The user interfaces of FIG. 10 may be displayed on a wrist-worn device, such as a smart watch. The wrist-worn device may include a sensor, such as an accelerometer, to detect movement of an arm of a person wearing the wrist-worn device. - An example user may include a patient who has had a surgical procedure on their arm, shoulder, chest, etc., or a patient who injured their arm, shoulder, chest, etc. For example, the patient may wear a sling for stability or to prevent injury. Slings are effective, when properly used, at stabilizing patient anatomy, but compliance is often difficult, both for the patient to achieve and for a clinician to measure or detect. In an example, a patient may have an orthopedic shoulder procedure and be instructed to wear a sling for a period of time (e.g., a few weeks).
- The user interfaces shown in
FIG. 10 illustrate various actions that may be detected using a sensor. User interface 1002 illustrates motion detection of the wrist-worn device suggestive of shoulder movement. In the example of user interface 1002, a patient may be instructed to keep their shoulder from moving, and the user interface 1002 may be used to provide feedback to the patient that movement has occurred and a reminder to keep movement to a minimum. This information may be educational or serve as a reminder to the patient. The user interface 1004 illustrates an example where the range of motion of the shoulder of the patient has exceeded a threshold. In this example, the user interface 1004 warns the patient of the unhelpful movement. A patient may be asked to complete assessments of specific movements assigned at set intervals through their clinician's protocol. A video may instruct the patient how to complete the assessment. Metrics of the patient's performance may be measured during the assessment. For example, metrics may include speed (e.g., slowing down or speeding up movement to demonstrate muscular control and improve accuracy of measurement), plane (e.g., correction of movement or hand positioning to decrease impingement with specific movements and improve accuracy of measurement), smoothness (e.g., a metric indicating smoothness through the arc of motion), or compensation (e.g., detection of elbow flexion or scapular compensation with movements). Other metrics may include, when the patient has a sling, post-operative sling management, frequency of reaching above a certain height, maximum ROM during the day, arm swing velocity while walking, heart rate during motion (e.g., for pain response), sleep duration or quality, or the like.
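- As one hedged example of the smoothness metric mentioned above, a dimensionless jerk-based score can be computed from the joint-angle trace of an assessment movement; the specific formula below is an assumption for illustration and is not prescribed by this disclosure.

```python
# Hedged sketch of one possible smoothness metric: a dimensionless jerk-based score over
# the joint-angle trace of an assessment movement (lower is smoother).
import numpy as np

def smoothness_score(angle_deg, fps):
    theta = np.asarray(angle_deg, dtype=float)
    dt = 1.0 / fps
    vel = np.gradient(theta, dt)
    jerk = np.gradient(np.gradient(vel, dt), dt)
    duration = len(theta) * dt
    amplitude = theta.max() - theta.min()
    if amplitude == 0:
        return 0.0
    # Normalized integrated squared jerk (duration- and amplitude-normalized).
    return float(np.sqrt(np.trapz(jerk ** 2, dx=dt) * duration ** 5 / amplitude ** 2))

t = np.linspace(0.0, 2.0, 60)
print(round(smoothness_score(90 * np.sin(np.pi * t / 2.0), fps=30), 1))   # a smooth arm raise
```
-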
Example user interfaces 1006 and 1008 illustrate a movement exercise interface and a progress update interface, respectively. The movement exercise shown in user interface 1006 facilitates a range of motion test for a patient. The patient may select to start, snooze, or dismiss the range of motion test. The patient may perform the range of motion test and be provided feedback using a wrist-worn device displaying the user interface 1006, which may include a sensor for detecting the movement. The range of motion may be measured at various locations of the patient's arm, such as flexion or extension, horizontal adduction, abduction, scaption, internal rotation and external rotation, arm at side, arm at 90-degree abduction, extension or internal rotation, or the like. - The progress update interface as shown in
user interface 1008 illustrates a daily living task (e.g., putting on a t-shirt) that the patient has performed (e.g., determined at the suggestion of a user interface, or upon information supplied by the patient, or based on sensor data indicating a particular movement corresponding to putting on a t-shirt). The progress update interface may include a question about pain, which may be used by a clinician to monitor patient pain over time or with particular tasks. An example task may include reaching above a specified height, such as for washing hair, putting on a t-shirt, brushing hair, reaching above the head to a shelf, etc. Another example task may include detecting extension and internal rotation, such as putting a jacket on, tucking in a shirt, putting a bra on, etc. - In an example, a patient may be tracked before, during, and after a surgical procedure. For example, an initial state may be captured for the patient, optionally before a disease state occurs. The patient state may be captured during a disease state before the surgical procedure. For example, the patient's walking style may be captured when healthy or diseased pre-operatively. Then after the surgical procedure, the patient state may be captured (e.g., the walking style). The post-operative state may be compared to the pre-operative state (diseased or healthy or both), and results may indicate a change or maintenance of a walking style. In another example, the patient may be shown, pre-operatively, a video simulating how a walking style or other patient state will be post-operatively. The transformation may be shown to the patient to encourage the patient to undertake the surgical procedure to improve their state. In yet another example, the patient may be shown, post-operatively, a healthy state or improved state to encourage the patient to put in effort on exercises or physical therapy to improve their state to the healthy or improved state. The simulations may be shown based on a skeletal model and historical data of patients similar to the current patient (e.g., having similar or matching disease states, comorbidities, demographics, etc.).
-
FIG. 11 illustrates a flowchart showing a technique 1100 for generating a skeletal model in accordance with at least one example of this disclosure.
- The technique 1100 includes an operation 1102 to capture a series of images of a user including a plurality of markers.
- The technique 1100 includes an operation 1104 to generate a training set using identified locations of the plurality of markers throughout the series of images, the identified locations of the plurality of markers corresponding to locations of joints of the user.
- The technique 1100 includes an operation 1106 to train a skeletal model, using the training set, to recognize performed exercises from an exercise library.
- The technique 1100 includes an operation 1108 to output the trained skeletal model configured to be used on a mobile device with images captured by a camera of the mobile device to recognize a performed exercise from the exercise library. The images captured by the camera of the mobile device may be 2D images (e.g., the camera is not a depth camera). The recognition of the performed exercise (e.g., performed by a user) may be performed automatically (e.g., without a need for additional processing of the captured images). -
FIG. 12 illustrates a flowchart showing a technique 1200 for determining a gait type of a patient in accordance with at least one example of this disclosure. A gait type may be a classification, for example based on training data (e.g., labeled by a gait expert, surgeon, or the like). The classification may be generated by a machine learning model, generated using a comparison, etc. - The gait type may be used in various surgical or post-operative techniques. For example, a surgical procedure for a total or partial knee replacement may consider a kinematic alignment of the knee, rather than a mechanical alignment. In a mechanical alignment, a knee is aligned to have a minimal or zero varus/valgus angle, and aligned to support the body mechanically. However, this type of idealized solution may not be as comfortable to the patient, causing dissatisfaction with the procedure. A kinematic alignment, on the other hand, allows a surgeon to align the knee according to how the patient moves, including optionally leaving the knee a few degrees (e.g., 1-4) varus, or a few degrees valgus. The kinematic alignment often leaves patients with a more natural feeling knee, improving outcomes by reducing discomfort. The kinematic alignment may improve ligament function post-operatively as well.
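- As an illustrative, hypothetical sketch, a frontal-plane knee angle (a rough varus/valgus proxy for tracking purposes, not a radiographic alignment measurement) could be estimated from hip, knee, and ankle keypoints as follows; the coordinates shown are placeholder values.

```python
# Hypothetical sketch: estimating a frontal-plane knee angle (a rough varus/valgus proxy)
# from hip, knee, and ankle keypoints. This is an image-based tracking approximation,
# not a radiographic alignment measurement.
import math

def knee_frontal_angle_deg(hip, knee, ankle):
    """Deviation from a straight (180 degree) limb, using femur and tibia vectors at the knee."""
    fx, fy = hip[0] - knee[0], hip[1] - knee[1]
    tx, ty = ankle[0] - knee[0], ankle[1] - knee[1]
    cos_angle = (fx * tx + fy * ty) / (math.hypot(fx, fy) * math.hypot(tx, ty))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    return 180.0 - angle            # 0 = perfectly straight; a few degrees may be varus or valgus

print(round(knee_frontal_angle_deg((100, 0), (102, 50), (100, 100)), 1))   # small deviation
```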
- One technique for determining kinematic alignment information for use in a surgical procedure includes determining the pre-operative gait of the patient. The pre-operative gait may be used to determine an alignment target for surgery, or may be used post-operatively for physical therapy to help the patient return to the pre-operative gait. In an example, the pre-operative gait may be assessed in a healthy knee to return a patient to a pre-ailment state.
- The gait determined for the patient may be specific to the patient, and one or more gait types may be identified for the patient's gait, such as by a classifier. A set of gait types may be generated using labeled training data, and a machine learning model may be trained on the labeled set of gait types. The patient's gait may be run through the machine learning model to output a gait type classification. The gait type may then be applied to surgical or post-operative uses.
- The gait type may be used to determine instant loading on the knee at various stages of gait (or rising from a seated position). The loading on the knee may be used with kinematic alignment to identify pre-operative wear to the native knee. Aspects of a surgical procedure including an implanted knee may be determined from the kinematic alignment, pre-operative wear, or loading, such as a bearing surface, bone to prosthesis interface, or soft tissue restraints.
- The gait type may be determined based on pre-operative gait analysis, such as using foot plate heel and toe-strike analysis or catwalk or treadmill pressure mapping, which may be compared to dynamic image-derived information. In performing a gait analysis, two factors may be used, including a force of where the foot hits the ground and a sway of how a patient walks (e.g., swaying left or right, or walking more rigidly straight ahead). The distance covered by swaying, the speed of sway, the force on the heel or toe, the weight transfer time from heel to toe, range of motion, speed of walking, stiffness of walking, or other gait factors may be used as weights for a classifier to identify a gait type.
- A classifier (e.g., a machine learning model) may be trained to determine how gait characteristics correlate with knee loading patterns. After training, a patient's gait may be input to the classifier to output a knee loading pattern (e.g., a kinematic alignment), according to an example. A recommendation for each of the gait types may be output (e.g., an implant plan, angles, etc.). The gait types may be stored in a database including patient gait profiles (e.g., by demographics).
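- The following is a hedged sketch of training such a classifier from the gait factors listed above. The feature values, labels, and choice of a random forest are placeholders; a real model would be trained on expert-labeled gait recordings as described herein.

```python
# Hedged sketch of a gait-type classifier trained on the gait factors listed above.
# Feature values and labels are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Columns: heel force, toe force, heel-to-toe transfer time (s), sway (cm), speed (m/s), stiffness
X_train = np.array([
    [620, 540, 0.42, 3.1, 1.2, 0.3],
    [580, 610, 0.35, 5.6, 0.9, 0.6],
    [700, 500, 0.50, 2.2, 1.4, 0.2],
    [560, 620, 0.33, 6.0, 0.8, 0.7],
])
y_train = ["neutral", "antalgic", "neutral", "antalgic"]     # labels from a gait expert

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

new_patient = np.array([[600, 580, 0.38, 4.8, 1.0, 0.5]])
print(clf.predict(new_patient))                              # predicted gait type
```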
- In an example, once a gait type is determined, the patient may be given information based on the gait type, for example, how others with this gait type have fared with various techniques (e.g., mechanical vs. kinematic alignment), what pain or discomfort may be expected with this gait type, a recovery time frame for this gait type, or the like.
- Similar to the knee techniques described in the examples above, hip replacement, ankle procedures, foot procedures, or other lower extremity surgeries may use the gait type to improve patient outcomes. For example, a gait type may be used to determine an optimal functional orientation for the femoral or acetabular components during a hip replacement. In an example, individual variation in pelvic obliquity on standing, lying, and rising from a seated to a standing position may be determined from the gait type. The gait type may be used for custom targeting of component positioning.
- The technique 1200 includes an
operation 1202 to capture pre-operative video of a patient. The technique 1200 includes an operation 1204 to identify a pre-operative patient gait based on the video, the patient gait identified from walking movement performed by the patient in the pre-operative video.
- The technique 1200 includes an operation 1206 to determine a gait type by comparing the pre-operative patient gait to stored gaits. In an example, the gait type includes a walking speed. In an example, the gait type is determined by comparing a force of a foot hitting the ground and a sway of hips of the patient from the pre-operative patient gait to corresponding information of the stored gaits. In an example, the gait type includes a walking stiffness. In an example, the gait type includes a pain value. In an example, the stored gaits are stored in a database of patient gait profiles.
- In an example, the gait type is determined using a machine learning model. In an example, the machine learning model is trained using recognizable gait patterns from a plurality of patients having a plurality of co-morbidities and a plurality of gait types. In an example, the machine learning model is trained to correlate gait characteristics with knee loading patterns.
- In an example, the machine learning model is configured to develop a recommended intervention plan for each of a plurality of gait types.
- The technique 1200 includes an operation 1208 to generate an intervention plan based on the gait type. In an example, the intervention plan includes a kinematic alignment of a knee of the patient, the kinematic alignment including 1-4 degrees of varus. In an example, the intervention plan includes a range of motion of the patient. In an example, the intervention plan includes a relative patient score based on the gait type. In an example, the intervention plan includes a surgical procedure. The technique 1200 includes an operation 1210 to output information indicative of the intervention plan. -
FIG. 13 illustrates a flowchart showing a technique for analyzing movement of an orthopedic patient in accordance with at least one example of this disclosure. - The
technique 1300 includes an operation 1302 to capture images (e.g., using a camera of a user device) of a patient in motion attempting to perform a task. In an example, the images are captured after completion of an orthopedic surgery on the patient, such as a knee surgery, a hip surgery, etc. - The
technique 1300 includes an operation 1304 to analyze (e.g., using a processor) the motion to generate a movement metric of the patient corresponding to the task. A movement metric may include a range of motion, a pain level, a gait type, a task completion amount, a degree of varus or valgus, a limp amount, or the like. Operation 1304 may include using data captured by a neck-worn device related to the motion. Operation 1304 may include determining a gait characteristic. A gait model may be used to determine the gait characteristic. The gait model may be generated from a skeletal model of the patient, for example based on pre-operative images of the patient wearing a plurality of colored markers captured by a camera. In another example, the gait model may be generated from pre-operative images of the patient wearing a suit having a pattern of markers. The suit may be a whole body suit or may cover only a part of the patient's body.
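- As a hypothetical sketch of one such movement metric, a limp indicator could be computed as stance-time asymmetry between the two legs, with stance times assumed to come from the gait model's heel-strike and toe-off detections; the values shown are placeholders.

```python
# Hypothetical sketch of a limp indicator: stance-time asymmetry between the legs, with
# stance times assumed to come from heel-strike/toe-off detections of the gait model.
import numpy as np

def limp_index(left_stance_s, right_stance_s):
    """0 = symmetric gait; larger values indicate greater asymmetry (a possible limp)."""
    left, right = float(np.mean(left_stance_s)), float(np.mean(right_stance_s))
    return abs(left - right) / max(left, right)

left = [0.62, 0.60, 0.63, 0.61]      # stance times (s), operated leg
right = [0.71, 0.73, 0.70, 0.72]     # stance times (s), contralateral leg
print(round(limp_index(left, right), 2))
```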
- The technique 1300 includes an operation 1306 to compare the movement metric of the patient to a baseline metric for the task. The baseline metric may be based on or represent an average performance of the task among a population (e.g., of patients with similar comorbidities, age, gender, size, weight, etc.), or the baseline metric may be based on or represent a pre-operative attempt by the patient to perform the task. A task may include walking, moving a body part, performing an occupational therapy movement (e.g., an everyday activity or movement, such as placing a cup on a shelf or putting on clothes), performing a physical therapy movement, or the like. - The
technique 1300 includes an operation 1308 to output information indicative of the comparison. The information indicative of the comparison may include quantitative information or qualitative information. The quantitative information may include a score (e.g., a range of motion score or a pain score). The qualitative information may include feedback, such as positive reinforcement (e.g., 'good job'), instructions (e.g., 'try walking for 5 minutes each hour'), or adherence information related to the task, for example based on a milestone (e.g., completing a specified range of motion without pain). The qualitative information may be provided (e.g., via a speaker or a display, such as a display of a wearable device communicatively coupled to a user device) to the patient or a member of a care team for the patient (e.g., a family member, a therapist, a surgeon, or the like).
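- A minimal sketch of the comparison and qualitative output of operations 1306 and 1308 is shown below; the thresholds, score scaling, and feedback messages are illustrative assumptions rather than prescribed values.

```python
# Illustrative sketch of operations 1306-1308: compare a movement metric to a baseline and
# produce a quantitative score plus qualitative feedback. Thresholds and messages are
# placeholder assumptions.
def compare_to_baseline(metric, baseline, milestone_ratio=0.9):
    ratio = metric / baseline if baseline else 0.0
    score = round(100 * min(ratio, 1.0))
    if ratio >= milestone_ratio:
        feedback = "Good job: you have reached your range of motion milestone."
    elif ratio >= 0.5:
        feedback = "Keep going: try short sessions several times a day."
    else:
        feedback = "Progress is below target; consider checking in with your care team."
    return {"score": score, "feedback": feedback}

# Example: post-operative knee flexion compared to the patient's pre-operative baseline.
print(compare_to_baseline(metric=105.0, baseline=120.0))
```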
- The technique 1300 may include determining whether performing the task caused the patient pain based on facial expressions of the patient in the captured images while the task was performed. The technique 1300 may include determining whether performing the task caused the patient pain based on a detected increase in heart rate of the patient, captured by a wearable device communicatively coupled to the user device, while the task was performed. - Each of the following non-limiting examples may stand on its own, or may be combined in various permutations or combinations with one or more of the other examples.
- Example 1 is a method comprising: capturing range of motion or fluidity information using a sensor of a wrist-worn device; measuring a stress response (e.g., heart rate) as a proxy for pain during arm motions; identifying a pain location based on a position of the wrist-worn device at a time of the stress response; and tying these heart rate spikes to the PDF/animation to visualize the pain on the model.
- Example 2 is a method comprising: analyzing the angle of selfies taken using a phone camera to get data regarding shoulder flexibility; and outputting information to a wrist-worn device.
- Example 3 is a method comprising: capturing range of motion or fluidity information using a sensor of a wrist-worn device; optionally performing one or more actions described below from examples 14-18; recommending surgery if needed; using pre-operative information captured by the wrist-worn device for evaluating post-operative range of motion, fluidity, or pain of patient (e.g., as captured by the wrist-worn device post-operatively).
- In Example 4, the subject matter of Examples 1-3 includes, analyzing the angle of selfies taken using phone camera to get data regarding shoulder flexibility.
- In Example 5, the subject matter of Examples 1-4 includes, extrapolating elbow pain, fluidity, or range of motion.
- In Example 6, the subject matter of Examples 1-5 includes, providing a picture or animation of the range of motion, exporting the animation into a small file format illustrating the motion, and allowing a user to click on the patient to see the path of motion (e.g., on a wrist-worn device).
- In Example 7, the subject matter of Examples 1-6 includes, building a skeletal model.
- In Example 8, the subject matter of Example 7 includes, improving the model dynamics by prompting user with questions (e.g., generate a model based on training data); for example by using a gyroscope on the wrist-worn device to compare the current data to a range of expectations; or obtaining a range of expectations from an opposite arm (e.g., baseline assessments on good shoulder); or developing a baseline from data coming from healthy population; or using data collected from user input, wrist-worn device sensors, or the like to improve implant products and techniques.
- In Example 9, the subject matter of Example 8 includes, recognizing patient activities without requiring the patient to manually log what the activity was (e.g., use the model), for example determine what action was performed by the patient, such as PROMS: e.g., put on a coat, pull a coffee mug from the cupboard, or the like.
- In Example 10, the subject matter of Examples 1-9 includes, measuring movement during sleep.
- In Example 11, the subject matter of Examples 1-10 includes, capturing a maximum range of motion from day to day activities.
- In Example 12, the subject matter of Example 11 includes, tracking steps taken by the patient, which may include also monitoring or receiving data regarding a maximum movement of the shoulder.
- In Example 13, the subject matter of Examples 1-12 includes, using automatic identification technology to recognize rotation of the joint.
- In Example 14, the subject matter of Examples 1-13 includes, detecting potentially harmful movements, for example too active, movement is unclean, deviating from plane while exercising, exceeding recommended ROM, or the like.
- In Example 15, the subject matter of Example 14 includes, using haptic feedback on the wrist-worn device or audio output by a speaker of the wrist-worn device to alert the patient to the potentially harmful movements.
- In Example 16, the subject matter of Examples 14-15 includes, determining that the patient is sleeping on a wrong side, and alerting the patient (e.g., waking the patient up).
- In Example 17, the subject matter of Examples 14-16 includes, determining that a repetitive stress is occurring to the patient, for example like a pitch count.
- In Example 18, the subject matter of Examples 1-17 includes, providing feedback, including at least one of providing a list of activities the patient either can't do or hasn't done; providing activity specific guidance—“we have data collected on tennis activity—[rehab or surgery] will allow you to play tennis better”; providing a high level assessment—“you would be a candidate for X surgery”; providing proactive positive reinforcement that rehab is going well if that is what the data indicates; providing a video message to patient from surgeon that is keyed to customized thresholds for range of motion: e.g., range of motion looks good or range of motion needs work, or the like.
- Example 19 is a method comprising: capturing images, using a camera of a user device, of a patient in motion attempting to perform a task, the images captured after completion of an orthopedic surgery on the patient; analyzing, using a processor, the motion to generate a movement metric of the patient corresponding to the task; comparing, using the processor, the movement metric of the patient to a baseline metric for the task; and presenting, on a display, an indication of the comparison including a qualitative result of the comparison.
- In Example 20, the subject matter of Example 19 includes, wherein the display is a display of a wearable device communicatively coupled to the user device.
- In Example 21, the subject matter of Examples 19-20 includes, wherein analyzing the motion includes using data captured by a neck-worn device related to the motion.
- In Example 22, the subject matter of Examples 19-21 includes, wherein the baseline metric represents an average performance of the task among a population.
- In Example 23, the subject matter of Examples 19-22 includes, wherein the baseline metric represents a pre-operative attempt by the patient attempting to perform the task.
- In Example 24, the subject matter of Examples 19-23 includes, wherein the task includes walking, and wherein analyzing the motion includes determining a gait characteristic.
- In Example 25, the subject matter of Example 24 includes, wherein analyzing the motion includes using a gait model generated from a skeletal model of the patient based on pre-operative images of the patient wearing a plurality of colored markers captured by the camera.
- In Example 26, the subject matter of Examples 24-25 includes, wherein analyzing the motion includes using a gait model generated from pre-operative images of the patient wearing a suit having a pattern of markers.
- In Example 27, the subject matter of Examples 19-26 includes, determining whether performing the task caused the patient pain based on facial expressions of the patient in the captured images while the task was performed, and wherein presenting the indication further includes presenting information related to the pain.
- In Example 28, the subject matter of Examples 19-27 includes, determining whether performing the task caused the patient pain based on a detected increase in heart rate of the patient, captured by a wearable device communicatively coupled to the user device, while the task was performed, and wherein presenting the indication further includes presenting information related to the pain.
- In Example 29, the subject matter of Examples 19-28 includes, wherein the qualitative result includes adherence information related to the task and wherein the qualitative result is based on a milestone.
- In Example 30, the subject matter of Examples 19-29 includes, wherein the qualitative result is sent to a member of a care team for the patient.
- Example 31 is a method comprising: capturing range of motion information using a sensor of a wrist-worn device of a patient attempting to perform a task, the range of motion information captured after completion of an orthopedic surgery on the patient; determining a stress response, based on a heart rate measured by the sensor, as a proxy for pain during arm motion of a patient; identifying a pain location based on a position of the wrist-worn device at a moment of the stress response; presenting, on a display of the wrist-worn device, an indication related to the pain location including a qualitative result.
- In Example 32, the subject matter of Examples 19-31 includes, analyzing, using a processor, the pain location at the moment to generate a movement metric; and comparing, using the processor, the movement metric of the patient to a baseline metric for the task; and wherein the qualitative result corresponds to the comparison.
- In Example 33, the subject matter of Examples 31-32 includes, wherein presenting the indication related to the pain location includes providing a picture or animation of the range of motion, the picture or animation indicating the pain location.
- In Example 34, the subject matter of Examples 31-33 includes, automatically identifying the task without patient input and initiating the range of motion capture in response to automatically identifying the task without patient input.
- In Example 35, the subject matter of Examples 31-34 includes, wherein the qualitative result indicates that the range of motion includes a potentially harmful movement.
- In Example 36, the subject matter of Example 35 includes, alerting the patient to the potentially harmful movement using haptic feedback on the wrist-worn device or audio output by a speaker of the wrist worn device.
- In Example 37, the subject matter of Examples 31-36 includes, determining that the patient is sleeping on a wrong side, and alerting the patient using the wrist-worn device.
- Example 38 is a method comprising: capturing pre-operative video of a patient; identifying a pre-operative gait of the patient based on walking movement performed by the patient in the pre-operative video; determining a gait type by comparing the pre-operative gait to a plurality of stored gaits; generating an orthopedic intervention plan for the patient based on the gait type; and outputting information indicative of the intervention plan for display.
- Example 39 is at least one machine-readable medium including instructions that, when executed by processing circuitry, cause the processing circuitry to perform operations to implement any of Examples 1-38.
- Example 40 is an apparatus comprising means to implement any of Examples 1-38.
- Example 41 is a system to implement any of Examples 1-38.
- Example 42 is a method to implement any of Examples 1-38.
- Method examples described herein may be machine or computer-implemented at least in part. Some examples may include a computer-readable medium or machine-readable medium encoded with instructions operable to configure an electronic device to perform methods as described in the above examples. An implementation of such methods may include code, such as microcode, assembly language code, a higher-level language code, or the like. Such code may include computer readable instructions for performing various methods. The code may form portions of computer program products. Further, in an example, the code may be tangibly stored on one or more volatile, non-transitory, or non-volatile tangible computer-readable media, such as during execution or at other times. Examples of these tangible computer-readable media may include, but are not limited to, hard disks, removable magnetic disks, removable optical disks (e.g., compact disks and digital video disks), magnetic cassettes, memory cards or sticks, random access memories (RAMs), read only memories (ROMs), and the like.
Claims (20)
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US16/851,606 US20200335222A1 (en) | 2019-04-19 | 2020-04-17 | Movement feedback for orthopedic patient |
| US18/603,842 US12451258B2 (en) | 2019-04-19 | 2024-03-13 | Movement feedback for orthopedic patient |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201962836338P | 2019-04-19 | 2019-04-19 | |
| US201962853425P | 2019-05-28 | 2019-05-28 | |
| US202062966438P | 2020-01-27 | 2020-01-27 | |
| US16/851,606 US20200335222A1 (en) | 2019-04-19 | 2020-04-17 | Movement feedback for orthopedic patient |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/603,842 Division US12451258B2 (en) | 2019-04-19 | 2024-03-13 | Movement feedback for orthopedic patient |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20200335222A1 true US20200335222A1 (en) | 2020-10-22 |
Family
ID=72832857
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/851,606 Abandoned US20200335222A1 (en) | 2019-04-19 | 2020-04-17 | Movement feedback for orthopedic patient |
| US18/603,842 Active US12451258B2 (en) | 2019-04-19 | 2024-03-13 | Movement feedback for orthopedic patient |
Family Applications After (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/603,842 Active US12451258B2 (en) | 2019-04-19 | 2024-03-13 | Movement feedback for orthopedic patient |
Country Status (1)
| Country | Link |
|---|---|
| US (2) | US20200335222A1 (en) |
Cited By (19)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20220028546A1 (en) * | 2020-07-24 | 2022-01-27 | International Business Machines Corporation | Assessing the gait of parkinson's patients |
| US11337649B2 (en) | 2016-10-31 | 2022-05-24 | Zipline Medical, Inc. | Systems and methods for monitoring physical therapy of the knee and other joints |
| US20220223255A1 (en) * | 2021-01-13 | 2022-07-14 | Medtech S.A. | Orthopedic intelligence system |
| US20220392082A1 (en) * | 2021-06-02 | 2022-12-08 | Zimmer Us, Inc. | Movement tracking |
| US20230033093A1 (en) * | 2021-07-27 | 2023-02-02 | Orthofix Us Llc | Systems and methods for remote measurement of cervical range of motion |
| US20230067403A1 (en) * | 2021-08-31 | 2023-03-02 | International Business Machines Corporation | Patient monitoring and treatment using electronic textiles |
| EP4177903A1 (en) | 2021-10-13 | 2023-05-10 | Zimmer, Inc. | Body area network having sensing capability |
| US20230170069A1 (en) * | 2019-06-26 | 2023-06-01 | My Medical Hub Corporation | Integrated, ai-enabled value-based care measurement and objective risk assessment clinical and financial management system |
| EP4215105A1 (en) * | 2022-01-24 | 2023-07-26 | Koninklijke Philips N.V. | Automatic pain sensing conditioned on a pose of a patient |
| US11849415B2 (en) | 2018-07-27 | 2023-12-19 | Mclaren Applied Technologies Limited | Time synchronisation |
| US11898874B2 (en) | 2019-10-18 | 2024-02-13 | Mclaren Applied Technologies Limited | Gyroscope bias estimation |
| US20240148536A1 (en) * | 2022-11-07 | 2024-05-09 | VRChat Inc. | Shoulder impingement in virtual reality multiuser application |
| USD1053901S1 (en) | 2021-11-05 | 2024-12-10 | Howmedica Osteonics Corp. | Display screen or portion thereof with graphical user interface |
| USD1067239S1 (en) | 2021-11-05 | 2025-03-18 | Howmedica Osteonics Corp. | Display screen or portion thereof with animated graphical user interface |
| US12315637B2 (en) | 2019-06-26 | 2025-05-27 | My Medical Hub Corporation | Integrated, AI-enabled value-based care measurement and objective risk assessment clinical and financial management system |
| USD1084008S1 (en) | 2021-11-05 | 2025-07-15 | Howmedica Osteonics Corp. | Display screen or portion thereof with graphical user interface |
| US12451258B2 (en) | 2019-04-19 | 2025-10-21 | Zimmer Us, Inc. | Movement feedback for orthopedic patient |
| US12484855B2 (en) | 2020-12-31 | 2025-12-02 | Martin Roche | Pre-operative, intra-operative, and post-operative patient management |
| US12505929B2 (en) | 2025-05-23 | 2025-12-23 | My Medical Hub Corporation | Integrated, AI-enabled value-based care measurement and objective risk assessment clinical and financial management system |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20070103471A1 (en) * | 2005-10-28 | 2007-05-10 | Ming-Hsuan Yang | Discriminative motion modeling for human motion tracking |
| US20150320343A1 (en) * | 2013-01-18 | 2015-11-12 | Kabushiki Kaisha Toshiba | Motion information processing apparatus and method |
| US20150325270A1 (en) * | 2013-01-21 | 2015-11-12 | Kabushiki Kaisha Toshiba | Motion information display apparatus and method |
| US20160151013A1 (en) * | 2014-11-27 | 2016-06-02 | Koninklijke Philips N.V. | Wearable pain monitor using accelerometry |
| US9782122B1 (en) * | 2014-06-23 | 2017-10-10 | Great Lakes Neurotechnologies Inc | Pain quantification and management system and device, and method of using |
| US20170344919A1 (en) * | 2016-05-24 | 2017-11-30 | Lumo BodyTech, Inc | System and method for ergonomic monitoring in an industrial environment |
| US20190283247A1 (en) * | 2018-03-15 | 2019-09-19 | Seismic Holdings, Inc. | Management of biomechanical achievements |
Family Cites Families (27)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8013852B2 (en) | 2002-08-02 | 2011-09-06 | Honda Giken Kogyo Kabushiki Kaisha | Anthropometry-based skeleton fitting |
| US7981057B2 (en) | 2002-10-11 | 2011-07-19 | Northrop Grumman Guidance And Electronics Company, Inc. | Joint motion sensing to make a determination of a positional change of an individual |
| US8684922B2 (en) | 2006-05-12 | 2014-04-01 | Bao Tran | Health monitoring system |
| US8140154B2 (en) | 2007-06-13 | 2012-03-20 | Zoll Medical Corporation | Wearable medical treatment device |
| WO2009111886A1 (en) * | 2008-03-14 | 2009-09-17 | Stresscam Operations & Systems Ltd. | Assessment of medical conditions by determining mobility |
| EP2227057B1 (en) | 2009-03-04 | 2012-12-26 | Fujitsu Limited | Improvements to short-range wireless networks |
| JP2014519944A (en) | 2011-06-20 | 2014-08-21 | ヘルスウォッチ エルティーディー. | Incoherent independent clothing health monitoring and alarm system |
| WO2013096954A1 (en) | 2011-12-23 | 2013-06-27 | The Trustees Of Dartmouth College | Wearable computing device for secure control of physiological sensors and medical devices, with secure storage of medical records, and bioimpedance biometric |
| US10912480B2 (en) | 2013-06-21 | 2021-02-09 | Northeastern University | Sensor system and process for measuring electric activity of the brain, including electric field encephalography |
| GB2533247A (en) | 2013-09-18 | 2016-06-15 | Toshiba Res Europe Ltd | Wireless device and method |
| US20150094559A1 (en) | 2013-09-27 | 2015-04-02 | Covidien Lp | Modular physiological sensing patch |
| US10216904B2 (en) | 2014-04-16 | 2019-02-26 | Carkmh, Llc | Cloud-assisted rehabilitation methods and systems for musculoskeletal conditions |
| US10076286B1 (en) | 2014-10-21 | 2018-09-18 | Verily Life Sciences Llc | Methods and devices for circadian rhythm monitoring |
| WO2016190948A1 (en) | 2015-03-23 | 2016-12-01 | Consensus Orthopedics, Inc. | Joint sensor system and method of operation thereof |
| WO2016154554A1 (en) | 2015-03-26 | 2016-09-29 | Biomet Manufacturing, Llc | Method and system for planning and performing arthroplasty procedures using motion-capture data |
| US10561360B2 (en) | 2016-06-15 | 2020-02-18 | Biomet Manufacturing, Llc | Implants, systems and methods for surgical planning and assessment |
| US10535244B2 (en) | 2017-03-13 | 2020-01-14 | General Electric Company | Patient monitoring system and method for activity tracking |
| US11967422B2 (en) | 2018-03-05 | 2024-04-23 | Medtech S.A. | Robotically-assisted surgical procedure feedback techniques |
| US10743312B2 (en) | 2018-05-09 | 2020-08-11 | General Electric Company | Systems and methods for medical body area network frequency band switching |
| JP2022512254A (en) | 2018-12-13 | 2022-02-02 | リミナル サイエンシズ インコーポレイテッド | Systems and methods for wearable devices for virtually non-destructive acoustic stimuli |
| US20200335222A1 (en) | 2019-04-19 | 2020-10-22 | Zimmer Us, Inc. | Movement feedback for orthopedic patient |
| US20200352441A1 (en) | 2019-05-08 | 2020-11-12 | Orhan Soykan | Efficient Monitoring, Recording, and Analyzing of Physiological Signals |
| MX2021014760A (en) | 2019-06-06 | 2022-01-18 | Canary Medical Inc | Intelligent joint prosthesis. |
| US20210065870A1 (en) | 2019-09-04 | 2021-03-04 | Medtech S.A. | Robotically-assisted surgical procedure feedback techniques based on care management data |
| US12354279B2 (en) | 2021-06-02 | 2025-07-08 | Zimmer Us, Inc. | Movement tracking |
| CN117916812A (en) | 2021-07-16 | 2024-04-19 | 捷迈美国有限公司 | Dynamic sensing and intervention systems |
| EP4177903A1 (en) | 2021-10-13 | 2023-05-10 | Zimmer, Inc. | Body area network having sensing capability |
-
2020
- 2020-04-17 US US16/851,606 patent/US20200335222A1/en not_active Abandoned
-
2024
- 2024-03-13 US US18/603,842 patent/US12451258B2/en active Active
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20070103471A1 (en) * | 2005-10-28 | 2007-05-10 | Ming-Hsuan Yang | Discriminative motion modeling for human motion tracking |
| US20150320343A1 (en) * | 2013-01-18 | 2015-11-12 | Kabushiki Kaisha Toshiba | Motion information processing apparatus and method |
| US20150325270A1 (en) * | 2013-01-21 | 2015-11-12 | Kabushiki Kaisha Toshiba | Motion information display apparatus and method |
| US9782122B1 (en) * | 2014-06-23 | 2017-10-10 | Great Lakes Neurotechnologies Inc | Pain quantification and management system and device, and method of using |
| US20160151013A1 (en) * | 2014-11-27 | 2016-06-02 | Koninklijke Philips N.V. | Wearable pain monitor using accelerometry |
| US20170344919A1 (en) * | 2016-05-24 | 2017-11-30 | Lumo BodyTech, Inc | System and method for ergonomic monitoring in an industrial environment |
| US20190283247A1 (en) * | 2018-03-15 | 2019-09-19 | Seismic Holdings, Inc. | Management of biomechanical achievements |
Cited By (27)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11337649B2 (en) | 2016-10-31 | 2022-05-24 | Zipline Medical, Inc. | Systems and methods for monitoring physical therapy of the knee and other joints |
| US11992334B2 (en) | 2016-10-31 | 2024-05-28 | Zipline Medical, Inc. | Systems and methods for monitoring physical therapy of the knee and other joints |
| US11849415B2 (en) | 2018-07-27 | 2023-12-19 | Mclaren Applied Technologies Limited | Time synchronisation |
| US12451258B2 (en) | 2019-04-19 | 2025-10-21 | Zimmer Us, Inc. | Movement feedback for orthopedic patient |
| US12057212B2 (en) * | 2019-06-26 | 2024-08-06 | My Medical Hub Corporation | Integrated, AI-enabled value-based care measurement and objective risk assessment clinical and financial management system |
| US20250285760A1 (en) * | 2019-06-26 | 2025-09-11 | My Medical Hub Corporation | Integrated, ai-enabled value-based care measurement and objective risk assessment clinical and financial management system |
| US12315637B2 (en) | 2019-06-26 | 2025-05-27 | My Medical Hub Corporation | Integrated, AI-enabled value-based care measurement and objective risk assessment clinical and financial management system |
| US20230170069A1 (en) * | 2019-06-26 | 2023-06-01 | My Medical Hub Corporation | Integrated, ai-enabled value-based care measurement and objective risk assessment clinical and financial management system |
| US12476008B2 (en) * | 2019-06-26 | 2025-11-18 | My Medical Hub Corporation | Integrated, AI-enabled value-based care measurement and objective risk assessment clinical and financial management system |
| US11898874B2 (en) | 2019-10-18 | 2024-02-13 | Mclaren Applied Technologies Limited | Gyroscope bias estimation |
| US20220028546A1 (en) * | 2020-07-24 | 2022-01-27 | International Business Machines Corporation | Assessing the gait of parkinson's patients |
| US12484855B2 (en) | 2020-12-31 | 2025-12-02 | Martin Roche | Pre-operative, intra-operative, and post-operative patient management |
| US20220223255A1 (en) * | 2021-01-13 | 2022-07-14 | Medtech S.A. | Orthopedic intelligence system |
| US12354279B2 (en) * | 2021-06-02 | 2025-07-08 | Zimmer Us, Inc. | Movement tracking |
| US20220392082A1 (en) * | 2021-06-02 | 2022-12-08 | Zimmer Us, Inc. | Movement tracking |
| US20230033093A1 (en) * | 2021-07-27 | 2023-02-02 | Orthofix Us Llc | Systems and methods for remote measurement of cervical range of motion |
| US12465281B2 (en) * | 2021-07-27 | 2025-11-11 | Orthofix Us Llc | Systems and methods for remote measurement of cervical range of motion |
| US12191020B2 (en) * | 2021-08-31 | 2025-01-07 | International Business Machines Corporation | Patient monitoring and treatment using electronic textiles |
| US20230067403A1 (en) * | 2021-08-31 | 2023-03-02 | International Business Machines Corporation | Patient monitoring and treatment using electronic textiles |
| EP4177903A1 (en) | 2021-10-13 | 2023-05-10 | Zimmer, Inc. | Body area network having sensing capability |
| USD1084008S1 (en) | 2021-11-05 | 2025-07-15 | Howmedica Osteonics Corp. | Display screen or portion thereof with graphical user interface |
| USD1067239S1 (en) | 2021-11-05 | 2025-03-18 | Howmedica Osteonics Corp. | Display screen or portion thereof with animated graphical user interface |
| USD1053901S1 (en) | 2021-11-05 | 2024-12-10 | Howmedica Osteonics Corp. | Display screen or portion thereof with graphical user interface |
| WO2023138939A1 (en) | 2022-01-24 | 2023-07-27 | Koninklijke Philips N.V. | Automatic pain sensing conditioned on a pose of a patient |
| EP4215105A1 (en) * | 2022-01-24 | 2023-07-26 | Koninklijke Philips N.V. | Automatic pain sensing conditioned on a pose of a patient |
| US20240148536A1 (en) * | 2022-11-07 | 2024-05-09 | VRChat Inc. | Shoulder impingement in virtual reality multiuser application |
| US12505929B2 (en) | 2025-05-23 | 2025-12-23 | My Medical Hub Corporation | Integrated, AI-enabled value-based care measurement and objective risk assessment clinical and financial management system |
Also Published As
| Publication number | Publication date |
|---|---|
| US20240212866A1 (en) | 2024-06-27 |
| US12451258B2 (en) | 2025-10-21 |
Similar Documents
| Publication | Title |
|---|---|
| US12451258B2 (en) | Movement feedback for orthopedic patient |
| US10973439B2 (en) | Systems and methods for real-time data quantification, acquisition, analysis, and feedback |
| US11679300B2 (en) | Systems and methods for real-time data quantification, acquisition, analysis, and feedback |
| US12329516B2 (en) | Systems and methods for assessing gait, stability, and/or balance of a user |
| US10352962B2 (en) | Systems and methods for real-time data quantification, acquisition, analysis and feedback |
| US11037369B2 (en) | Virtual or augmented reality rehabilitation |
| US20200261023A1 (en) | Ascertaining, Reporting, and Influencing Physical Attributes And Performance Factors of Athletes |
| KR102290504B1 (en) | Health Care System Using Tracker |
| US20220409098A1 (en) | A wearable device for determining motion and/or a physiological state of a wearer |
| AU2024213318A1 (en) | Systems and methods for sensor-based, digital patient assessments |
| JP6871708B2 (en) | Methods, systems, programs, and computer devices for identifying the causative site of compensatory movements, and methods and systems for eliminating compensatory movements |
| KR102321445B1 (en) | Health Care System For Medical Devices Using Knee Tracker |
| US11179065B2 (en) | Systems, devices, and methods for determining an overall motion and flexibility envelope |
| Bisio et al. | Towards IoT-based eHealth services: A smart prototype system for home rehabilitation |
| US20190117129A1 (en) | Systems, devices, and methods for determining an overall strength envelope |
| KR20230094847A (en) | System for smart insole-based human motion inference and method thereof |
| Ali et al. | Gait analysis in neurologic disorders: methodology, applications, and clinical considerations |
| CN113397530B (en) | Intelligent correction system and method capable of evaluating knee joint function |
| Russell | Wearable inertial sensors and range of motion metrics in physical therapy remote support |
| Adinarayanan | Real-time Assessment and Visual Feedback for Patient Rehabilitation Using Inertial Sensors |
| KR20210027566A (en) | Knee Health Care System |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | AS | Assignment | Owner name: ZIMMER US, INC., INDIANA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WINTERBACH, DALTON;VANDERPOOL, MATT;PALM, KELLI;AND OTHERS;SIGNING DATES FROM 20200420 TO 20200426;REEL/FRAME:052908/0019 |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |