WO2025053788A1 - Upper limb assistive system and method - Google Patents
Upper limb assistive system and method
- Publication number
- WO2025053788A1 (PCT/SG2024/050562)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- subject
- upper limb
- support
- shoulder
- recited
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H1/00—Apparatus for passive exercising; Vibrating apparatus; Chiropractic devices, e.g. body impacting devices, external devices for briefly extending or aligning unbroken bones
- A61H1/02—Stretching or bending or torsioning apparatus for exercising
- A61H1/0274—Stretching or bending or torsioning apparatus for exercising for the upper limbs
- A61H1/0281—Shoulder
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/50—Control means thereof
- A61H2201/5007—Control means thereof computer controlled
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/50—Control means thereof
- A61H2201/5058—Sensors or detectors
- A61H2201/5069—Angle sensors
Definitions
- This application relates to rehabilitation and assistance, and more particularly to an upper limb assistive system and an upper limb assistive method.
- the system may assist patients in self-conducting ADL (Activities of Daily Living) tasks and their interaction with real objects.
- ADL Activities of Daily Living
- an upper limb assistive system may comprise: a support member for supporting an upper limb of a subject; an actuator coupled to the support member to move the support member; and a computing module in signal communication with the actuator, the computing module comprising a processing unit and a non-transitory medium storing instructions readable by the processing unit that, when executed by the processing unit, cause the processing unit to: receive a first support position and a first support orientation of the support member in an actuator frame, the support member being coupled to the upper limb of the subject; determine a shoulder position of the subject based on at least one anthropometric measurement of the subject; based on the shoulder position of the subject, transform the first support position and the first support orientation in the actuator frame to a corresponding second support position and a corresponding second support orientation in a subject shoulder frame, wherein the subject shoulder frame corresponds to the shoulder position of the subject; estimate a plurality of upper limb joint angles corresponding to a plurality of upper limb parts, based on the corresponding second support position and the corresponding second support orientation, using a human inverse kinematics model corresponding to the subject; and determine a gravity-compensated force on the support member based on the plurality of upper limb joint angles.
- an upper limb assistive method comprises: receiving a first support position and a first support orientation of a support member in an actuator frame, the support member being coupled to an upper limb of a subject; determining a shoulder position of the subject based on at least one anthropometric measurement of the subject; based on the shoulder position of the subject, transforming the first support position and the first support orientation in the actuator frame to a corresponding second support position and a corresponding second support orientation in a subject shoulder frame, wherein the subject shoulder frame corresponds to the shoulder position of the subject; estimating a plurality of upper limb joint angles corresponding to a plurality of upper limb parts, based on the corresponding second support position and the corresponding second support orientation using a human inverse kinematics model corresponding to the subject; and determining a gravity-compensated force on the support member based on the plurality of upper limb joint angles.
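- the claimed pipeline (support pose, then shoulder position, then frame transform, then joint-angle estimation, then GC force) can be sketched numerically. Everything below is an illustrative assumption: the aligned-frames transform, the law-of-cosines elbow estimate, and the lumped arm weight are simplified stand-ins, not the patent's actual models.

```python
import numpy as np

# Hypothetical sketch of the claimed method; all models below are
# simplified stand-ins, not the patent's actual implementation.

def to_shoulder_frame(p_actuator, shoulder_pos):
    # Steps 1-2: re-express the measured support position in a shoulder
    # frame whose axes are assumed aligned with the actuator frame.
    return p_actuator - shoulder_pos

def elbow_flexion(p_wrist_s, l_u, l_f):
    # Step 3: elbow flexion from the shoulder-to-wrist distance
    # (law of cosines); 0 rad corresponds to a fully extended arm.
    d = np.linalg.norm(p_wrist_s)
    c = (l_u**2 + l_f**2 - d**2) / (2.0 * l_u * l_f)
    return np.pi - np.arccos(np.clip(c, -1.0, 1.0))

def gc_force(arm_mass_kg, support_ratio=1.0, g=9.81):
    # Step 4: vertical support force; full assistance carries the whole
    # estimated arm weight, partial assistance a fraction of it.
    return support_ratio * arm_mass_kg * g

shoulder = np.array([0.10, 0.20, 0.60])    # assumed, in actuator frame
wrist_meas = np.array([0.40, 0.20, 0.35])  # measured support position
p_w = to_shoulder_frame(wrist_meas, shoulder)
angle = elbow_flexion(p_w, l_u=0.30, l_f=0.25)
force = gc_force(arm_mass_kg=3.5)
```

In practice each step would be replaced by the model described below (sagittal plane shoulder estimation, full human IK, and arm dynamics) rather than these toy versions.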
- FIG. 1 is a schematic diagram of an upper limb assistive system according to embodiments of the present disclosure
- FIG. 2 shows an exemplary actuator according to various embodiments
- FIG. 3 is a process flow diagram of the upper limb assistive system according to various embodiments.
- FIG. 4 is another process flow diagram of the upper limb assistive system according to various embodiments.
- FIG. 5 is a perspective schematic view of a subject and a sagittal plane according to various embodiments;
- FIG. 6 is a front view of FIG. 5;
- FIG. 7 is a schematic showing an upper limb of a subject and a proposed ARAE robotic system.
- the upper limb or human arm is modeled as a four-degree-of-freedom link mechanism, including 3 revolute joints at the shoulder joint and 1 revolute joint at the elbow joint (E) under the human shoulder base coordinate {Os}.
- the human shoulder base frame is denoted as {Os}: Os-xsyszs.
- S, E and W represent the position vector of the shoulder joint, elbow joint, and wrist joint under the human shoulder base coordinate.
- FIG. 8 is a system schematic map of the ARAE system of FIG. 7;
- FIG. 9 shows a system control framework demonstrating the interaction between the subject/human and the ARAE robotic system;
- FIG. 10 is a schematic diagram for obtaining the shoulder position for a sagittal plane model
- FIG. 11 shows a subject using the ARAE system
- FIG. 12 shows a proposed system for evaluation
- FIG. 13 shows the placement of sEMG sensors and markers on the subject;
- FIG. 14 is a plot showing group classification based on the distance from the real-time elbow joint to the calibrated shoulder joint as the percentage of the actual upper limb length.
- FIG. 15 shows a comparison between Mocap measured angles and estimated joint angles derived from the Fixed torso model and Sagittal plane model
- FIG.16 shows a performance comparison of two models (fixed torso model and sagittal plane model) on four types of motion patterns
- FIG. 17A shows an example of the sEMG profile for the Biceps Brachii (BB) in one subject while performing the ’Forward Reaching’ task.
- FIGs. 17B and 17C illustrate the net changes in EMG for four muscle activation when transitioning from No Robot mode to With Robot mode, under the fixed torso model (Exp2-2) and the sagittal plane model (Exp2-3), respectively;
- FIG. 18 is a schematic diagram illustrating an upper limb assistive method according to various embodiments of the present disclosure.
- FIG. 19 shows a block diagram of a processing system for implementing embodiments of the present disclosure.
- the articles “a”, “an” and “the” as used with regard to a feature or element include a reference to one or more of the features or elements.
- the term “about” or “approximately” as applied to a numeric value encompasses the exact value and a reasonable variance as generally understood in the relevant technical field, e.g., within 10% of the specified value.
- the term “pose” may refer to a position and an orientation of an object in a frame.
- the term “position” may refer to a location or coordinate (for example, X-coordinate, Y coordinate, Z coordinate) of an object or part of an object in a space or a frame.
- the term “orientation” may refer to a facing or angle (for example, an X-direction vector, a Y-direction vector, a Z-direction vector) of an object or part of an object in a space or a frame.
- Each of the terms “pose”, “position” and “orientation” may be defined by a coordinate system, such as a Cartesian coordinate system.
- the term “frame” may refer to a set of coordinates in which measurements of sizes, pose, positions, orientations, motions, etc. may be made.
- anthropometric measurement refers to quantitative measurements of the body of a subject.
- the anthropometric measurement may be non-invasive in nature, used to assess the size, proportions and composition of the human body.
- Examples of anthropometric measurement may include height, recumbent length, circumferences (head, waist, hip, mid-upper arm, mid-thigh, calf, chest, neck), limb lengths (arm-span, demi-span, half-span), abdominal sagittal diameter, etc.
- end-effector type systems may be suitable for providing upper limb assistance for a subject.
- one limitation of end-effector type systems includes bulkiness which limits the feasibility for homebased therapy.
- the need for attaching sensors to the subject in measuring the upper limb joint angles adds bulk and hinders movement of the subject’s upper limb.
- Non-contact sensors, such as image-based sensors, which do not add to the bulk of the system, often experience inaccuracies during measurement and are typically expensive.
- the assistive system may provide gravity compensation (GC) forces to the upper limb.
- the GC forces may reduce the muscular effort of the subject, enhancing the transparency of movement during rehabilitation.
- one challenge with providing GC forces for end-effector type systems is the determination of the GC forces during operation.
- the GC forces may change responsive to the upper limb position and/or orientation (or pose) in three-dimensional space.
- Conventional end-effector type systems measure the human joint angles during movements with wearable magnetic sensors and compute the support force based on the human dynamics model.
- the wearable magnetic sensors are often cumbersome, intervene in the subject’s movement, and introduce complexity during operation.
- the present disclosure proposes an upper limb assistive system and an upper limb assistive method.
- the proposed upper limb assistive system is mobile, lightweight, and does not require attaching sensors to the subject during rehabilitation.
- the proposed assistive system may concurrently perform motion tracking as well as provide a gravity-compensated (GC) force to fully support the upper limb of the subject.
- GC gravity-compensated
- the proposed upper limb assistive system and method may be used to provide a gravity-compensated (GC) force or supporting force to a subject or patient during rehabilitation, such as during physiotherapy.
- the proposed upper limb assistive system and method may be used to provide a gravity-compensated (GC) force or supporting force to a subject when performing an upper limb activity, such as using a touch screen device.
- GC gravity-compensated
- the proposed upper limb assistive system may determine various upper limb joint angles, such as the shoulder angles and the elbow angle.
- the proposed system may estimate or determine the upper limb joint angles based on a position and orientation of a forearm of the subject.
- the position and orientation of the forearm of the subject may be determined based on a position and an orientation of a support member coupled to the forearm.
- the system may determine and provide a gravity-compensated force or a supporting force to support the arm of the subject during movement/activities.
- FIG. 1 illustrates an upper limb assistive system 100 for providing assistance or support to a subject 90, according to various embodiments of the present disclosure.
- the system 100 may be a personalized system 100 customizable according to each user/subject. For example, the system 100 may be configured or customized based on anthropometric measurements of the subject 90.
- the system 100 may include an actuator 110 with a base 112 which acts as a reference defining an actuator frame ⁇ ORO ⁇ .
- the actuator 110 may further include an actuator arm 114 coupling a support member 116 to the base 112.
- the actuator 110 may be configured to move the support member 116 or provide a force to the support member 116 via the actuator arm 114.
- the actuator 110 may be in signal communication with a controller or a computing module 120.
- the computing module 120 may control the actuator 110 to vary a support pose (comprising a support position and a support orientation) of the support member 116.
- the support member 116 may be coupleable or attachable to an upper limb 91 of a subject 90.
- the support member 116 may support the upper limb 91 of the subject 90 during activity.
- the upper limb 91 may include a plurality of upper limb parts such as shoulder joint 92, an upper arm 93, an elbow joint 94, a forearm 95, a wrist joint 96, and a hand 97.
- the support member 116 may be suitably shaped/formed to conform to a surface or a contour of the upper limb 91 or a part of the upper limb 91, such as the upper arm 93.
- the support member 116 may be a forearm cuff, an upper arm cuff or a splint.
- the support member 116 may be attached to the forearm 95 of the subject 90.
- the support member 116 may be attached between the elbow joint 94 and the wrist joint 96 of the subject 90.
- the support member 116 may be attached to the forearm 95 such that the support pose (comprising the support position and the support orientation) of the support member 116 is representative or substantially representative of a forearm pose (forearm position and forearm orientation) of the forearm 95 of the subject 90.
- straps of a forearm cuff may be provided to secure the forearm 95 of the subject
- the support member 116 may be attached to the upper arm 93 of the subject 90.
- the support member 116 may also be attached between the shoulder joint 92 and the elbow joint 94.
- the actuator 110 may include a base 112 coupled to an actuator arm 114 configured as a linkage structure 114.
- the linkage structure 114 may further be coupled to the support member 116.
- the linkage structure 114 may include a plurality of linkage members operably coupled via a plurality of joints 115/117.
- the plurality of joints may include active joints 115 or actuated joints as well as passive joints 117. Active joints 115 are actuatable joints and passive joints 117 are non-actuatable joints.
- each of the active joints 115 may be driven by a respective driving member, such as a quasi-direct drive motor.
- Quasi-direct drive motors may provide a high torque and backdrivability, which enhances the performance of physical human-robot interaction (pHRI).
- each of the active joints may be driven by a respective Series Elastic Actuator (SEA) which provides high backdrivability despite having limitations to the bandwidth.
- SEA Series Elastic Actuator
- the system 100 may include sensors for measuring a support pose (comprising a support position and a support orientation) of the support member 116 in the actuator frame {ORO}.
- the sensors may be encoders coupled to each of the joints, active joints and passive joints.
- the sensors may be integrated with each of the driving member or motors for measuring the pose of the support member 116 in the actuator frame ⁇ ORO ⁇ .
- the linkage structure 114 may be a four-bar linkage structure which includes three active joints 115 driven by quasi-direct drive motors and two passive revolute joints 117.
- a mechanical limit 118 may be provided to limit the range of motion (ROM) of the actuator 110 and the upper limb 91 of the subject 90.
- the computing module 120 may be configured to receive measurement data from the sensors to determine a first support pose 210 in the actuator frame ⁇ ORO ⁇ , or in other words, a first support position 210a and a first support orientation 210b of the support member 116 in the actuator frame ⁇ ORO ⁇ .
- the support member 116 may be coupled to the forearm 95 of the subject 90, and movable/displaceable by both the subject 90 and the actuator 110.
- the first support pose 210 may be reflective or representative of the forearm pose (forearm position and forearm orientation) of the forearm 95 of the subject 90 in the actuator frame ⁇ ORO ⁇ .
- the first support position 210a and the first support orientation 210b of the support member 116 correspond to the elbow joint 94 position and the wrist joint 96 position of the subject 90 respectively.
- the first support position 210a and the first support orientation 210b of the support member 116 correspond to the position and the orientation of the forearm 95 of the subject 90.
- the computing module 120 may determine the first support position 210a and the first support orientation 210b based on the relative positions of the plurality of linkage members. For example, the first support position 210a and the first support orientation 210b may be determined based on each of the joint angles of the various joints (active joints and passive joints). It may be appreciated that at the present step, the first support position 210a and the first support orientation 210b of the support member 116 may be measured with reference to the actuator frame {ORO}. In other embodiments, without the need for separate sensors, the actuator 110 may provide the first support position 210a and the first support orientation 210b directly to the computing module 120.
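- determining the support position from measured joint angles can be illustrated with a serial-equivalent forward-kinematics sketch. The base-yaw-plus-two-pitch chain, the link lengths, and the absolute-angle convention below are assumptions for illustration, not the actuator's actual D-H parameters.

```python
import numpy as np

def support_position(q1, q2, q3, l1=0.30, l2=0.30):
    # q1: base yaw; q2, q3: absolute pitch angles of the two links
    # (a parallelogram transmission keeps link angles absolute).
    r = l1 * np.cos(q2) + l2 * np.cos(q3)   # horizontal reach
    z = l1 * np.sin(q2) + l2 * np.sin(q3)   # height above the base
    return np.array([r * np.cos(q1), r * np.sin(q1), z])

# With all joints at zero the chain is fully stretched along x.
p_home = support_position(0.0, 0.0, 0.0)
```

The support orientation would additionally require the two passive-joint encoder readings, which this positional sketch omits.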
- the computing module 120 may determine a shoulder position 220 of the subject 90 based on one or more anthropometric measurements of the subject 90.
- the shoulder position 220 may be represented as a position in the actuator frame ⁇ ORO ⁇ .
- the shoulder position 220 may be a fixed position in the actuator frame ⁇ ORO ⁇ .
- the anthropometric measurements may include a length of the upper arm 93, a length of the forearm 95, a torso width of the subject 90, a distance between a hip and a shoulder of the subject 90, etc.
- the computing module 120 may transform the first support pose 210 in the actuator frame {ORO} to a corresponding second support pose 230 in a subject shoulder frame {Os}.
- the first support position 210a and the first support orientation 210b in the actuator frame {ORO} may be transformed to a second support position 230a and a second support orientation 230b in the subject shoulder frame {Os}.
- the second support position 230a corresponds to the first support position 210a.
- the second support orientation 230b corresponds to the first support orientation 210b.
- the subject shoulder frame ⁇ Os ⁇ may correspond to the shoulder position 220 of the subject.
- an origin of the subject shoulder frame ⁇ Os ⁇ may be positioned at the shoulder position 220.
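- the transformation between the actuator frame and the subject shoulder frame can be written as a homogeneous transform. Assuming, for this sketch only, that the two frames share the same axis orientation, the transform reduces to a translation by the measured shoulder position:

```python
import numpy as np

def make_T(R, p):
    # Build a 4x4 homogeneous transform from rotation R and translation p.
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = p
    return T

# Shoulder frame origin placed at the shoulder position measured in the
# actuator frame; the identity rotation is an assumption for illustration.
shoulder_pos = np.array([0.10, 0.20, 0.30])
T_s_in_a = make_T(np.eye(3), shoulder_pos)

# First support position (actuator frame) -> second support position
# (shoulder frame) via the inverse transform.
p_first = np.array([0.10, 0.20, 0.80])
p_second = (np.linalg.inv(T_s_in_a) @ np.append(p_first, 1.0))[:3]
```

A real implementation would carry the full pose (rotation block included) through the same matrix product.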
- the computing module 120 may estimate or determine the upper limb joint angles 240 of the subject 90 corresponding to the upper limb parts, based on the second support position 230a and the second support orientation 230b.
- the upper limb joint angles 240 may include multiple shoulder angles such as shoulder abduction/adduction angle, shoulder flexion/extension angle, shoulder internal/external rotation angle, and elbow angle such as elbow flexion/extension angle.
- the human inverse kinematics model may include inputs of anthropometric measurements of the subject 90.
- the computing module 120 may determine a gravity-compensated (GC) force 250 on the support member 116 based on the plurality of upper limb joint angles 240. Further, the computing module 120 may control or actuate the actuator 110 to provide the GC force to the upper limb 91 of the subject 90 via the support member 116. The GC force 250 may hold the upper limb 91 of the subject 90 to provide an upper limb support to the subject. In various embodiments, for full upper limb assistance, the GC force 250 may have a magnitude such that all the weight of the upper limb 91 of the subject 90 is supported by the support member 116.
- GC gravity-compensated
- the GC force 250 may include a magnitude such that only a portion of the weight of the upper limb 91 of the subject 90 is supported by the support member 116. It may be appreciated that the above-described process may be iteratively performed to provide the GC forces 250 to the subject 90 in the course of activity or rehabilitation.
- the computing module 120 may determine the shoulder position 220 based on a predetermined and fixed position.
- the shoulder position 220 may be a fixed position relative to the actuator 110, or in other words, the shoulder position 220 may be fixed relative to the actuator frame {ORO}.
- the computing module 120 may determine the shoulder position 220 based on at least one anthropometric measurement such as a torso length of the subject 90.
- a primary limitation of this approach is that in practical scenarios, the shoulder position 220 typically shifts in conjunction with torso movements thus introducing some inaccuracy to the method.
- the computing module 120 may transform the first support pose 210 (the first support position 210a and the first support orientation 210b) in the actuator frame ⁇ ORO ⁇ to a corresponding intermediate support pose 211 (an intermediate support position 211a and an intermediate support orientation 211b) in a subject pelvis frame ⁇ Op ⁇ .
- the computing module 120 may transform the first support pose 210 to the corresponding intermediate support pose 211 based on one or more anthropometric measurements of the subject 90 and the sagittal plane 80 in the subject pelvis frame {Op}.
- the sagittal plane 80 may be defined by a hip joint position 98 of the subject.
- the sagittal plane 80 may be tangent to the hip joint position 98.
- the computing module 120 may form a first arc 213 in the sagittal plane 80.
- the first arc 213 may be formed with a first center and a first radius.
- the first center may be the hip joint position 98
- the first radius may be a first anthropometric measurement, such as distance R1 between a hip joint position 98 and a shoulder joint 92 of the subject 90.
- computing module 120 may form a second arc 215 in the sagittal plane 80.
- the second arc 215 may be formed with a second center and a second radius.
- the second center may be a projected elbow joint 99 in the sagittal plane 80, and the second radius may be a second anthropometric measurement, such as a distance R2 between the projected elbow joint 99 and the shoulder joint 92 of the subject 90.
- the computing module 120 may determine the shoulder position 220 based on an intersection of the first arc 213 and the second arc 215 in the sagittal plane 80.
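- the intersection step can be sketched as a standard circle-circle intersection in the 2-D sagittal plane. The coordinates and radii below are illustrative, and picking the solution above the hip is an assumption consistent with the shoulder lying above the hip joint:

```python
import numpy as np

def circle_intersections(c1, r1, c2, r2):
    # Intersection points of two circles in a plane (c1, c2: 2-D centers).
    c1, c2 = np.asarray(c1, float), np.asarray(c2, float)
    d = np.linalg.norm(c2 - c1)
    a = (r1**2 - r2**2 + d**2) / (2.0 * d)
    h_sq = r1**2 - a**2
    if h_sq < 0:
        return []                                  # circles do not intersect
    mid = c1 + a * (c2 - c1) / d                   # foot of the chord
    perp = np.array([-(c2 - c1)[1], (c2 - c1)[0]]) / d
    h = np.sqrt(h_sq)
    return [mid + h * perp, mid - h * perp]

# First arc: center = hip joint, radius R1 (hip-to-shoulder length).
# Second arc: center = projected elbow E', radius R2 (E'-to-shoulder length).
hip, r1 = np.array([0.0, 0.0]), 0.50
elbow_proj, r2 = np.array([0.30, 0.10]), 0.45
candidates = circle_intersections(hip, r1, elbow_proj, r2)
shoulder_2d = max(candidates, key=lambda p: p[1])  # keep the upper solution
```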
- the computing module 120 may transform the intermediate support pose 211 (the intermediate support position 211a and the intermediate support orientation 211b) in the subject pelvis frame {Op} to a corresponding second support pose 230 (the second support position 230a and the second support orientation 230b) in a subject shoulder frame {Os}.
- the computing module 120 may then estimate or determine the upper limb joint angles 240 of the subject 90 corresponding to the upper limb parts, based on the second support position 230a and the second support orientation 230b, using a human inverse kinematics model corresponding to the subject 90.
- the computing module 120 may determine a gravity-compensated (GC) force 250 on the support member 116 based on the plurality of upper limb joint angles 240 and actuate the actuator 110 to provide the GC force to the upper limb 91 of the subject 90 via the support member 116.
- GC gravity-compensated
- FIG. 7 shows a schematic of the proposed upper limb assistive system 100, otherwise known as the Assistive Robotic Arm Extender (ARAE).
- FIGs. 8 and 9 are system block diagrams of the ARAE.
- the ARAE is configured to provide arm support in three-dimensional (3D) space for functional task training.
- the ARAE is capable of achieving high transparency in movement within 3D space and offers adaptive arm support based on estimated human postures. The configuration of the ARAE allows it to assist patients with Manual Muscle Testing (MMT) scores ranging from 1 to 4 in performing ADLs and interacting with actual environments.
- MMT Manual Muscle Testing
- the ARAE is provided with Quasi direct drive (QDD) motors, encoders, and a parallel mechanism, incorporating three active Degrees-of-freedom (DOFs) and two passive DOFs.
- QDD Quasi direct drive
- the ARAE system includes an adaptive control framework for gravity compensation of the human arm, determining a compensatory force based on the estimated posture of the subject.
- the proposed ARAE system comprises two components: i) human joint angle estimation and ii) upper limb support force determination.
- the ARAE comprises a 3-DOF actuated robotic arm or actuator 110 with a base 112 and a linkage structure 114.
- An end-effector module or support member 116 is coupled to the linkage structure 114 via two passive joints.
- Motor 1 (motor 115a) mounted on the base 112 enables rotary movement of the linkage structure 114 at the base 112.
- Motor 2 (motor 115b) and Motor 3 (motor 115c) are positioned atop Motor 1.
- Motor 2 drives Link 1 (link 114a) and Motor 3 drives Link 2 (link 114b).
- Mechanical limits may be provided to act as the protective limit for Motors 2 and 3.
- the Links 1 to 4 are fabricated from carbon fibre tubes, and hence configured with high stiffness and minimal weight. This enables a compact configuration, optimizing the transmission of actuator torque from the robotic arm to the subject.
- the end-effector module 116 is attached to Link 4, secured thereto with a clamping mechanism on Link 4 and an antirotation screw.
- two encoders 117 with 2-DOF movement are provided coupled to the end-effector module 116, with the respective output shafts of the encoders 117 serving as the rotation axes.
- the end-effector module 116 includes a forearm cuff to securely hold the subject’s forearm 95 while allowing slight rotational adjustment following the curve beneath the cuff.
- the ARAE includes three QDD motors (TMotor AK10-9 V2.0) to provide the 3-DOF movement with each of the QDD motors with a peak torque of 48Nm.
- the maximum external load applicable at the end-effector module is 12.43kg when the parallel mechanism extends to a position of maximum working range. At this position, the end-effector load applies the largest moment on the motors.
- the peak loads at the end-effector may support the arm weight of 99% of human subjects and facilitate user activity.
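- the stated figures can be sanity-checked. Assuming, purely for illustration, that a single 48 Nm motor reacts the full end-effector load through a horizontal lever (the real parallel mechanism shares load between motors), the implied lever arm at the 12.43 kg rating is about 0.39 m:

```python
g = 9.81              # m/s^2
peak_torque = 48.0    # Nm, per QDD motor (from the text)
max_load = 12.43      # kg, rated end-effector load (from the text)

# Implied horizontal lever arm if one motor bears the full load moment
# (a simplifying assumption for this back-of-envelope check).
lever_arm = peak_torque / (max_load * g)
```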
- the ARAE is suitably configured for a wide range of human body sizes and for strength training.
- the maximum joint speed of 26 rad/s provided is approximately 8 times higher than the speeds typically required in ADL tasks.
- the backdrive torque is 0.8 Nm, providing high transparency during human-robot interaction (HRI).
- HRI human-robot interaction
- an STM32F429 (STMicroelectronics) microcontroller acts as the embedded system to communicate with the encoders and T-Motors through the CAN (Controller Area Network) bus while connected with the Linux application through a serial port.
- the MCU communicates with the external ADC chip AD7606 through SPI (Serial Peripheral Interface).
- SPI Serial Peripheral Interface
- the sampling frequency of the encoder feedback and the low-level actuator control loop is fixed at 1kHz.
- a computer running Linux is used as a host PC for the data logging and user interface.
- the communication between MCU and the host PC is written in C++ as nodes for the robot operating system (ROS).
- the PC-based Graphical User Interface (GUI), developed in C++, facilitates real-time monitoring of the ARAE’s operational status.
- the joint angles and structural parameters are assigned to each joint and link. Since the main drive mechanism of the ARAE is a parallelogram structure, the kinematic model can be simplified as a serial-link mechanism, represented by the dashed line in FIG. 7. The Denavit-Hartenberg (D-H) algorithm is applied to derive the kinematics model.
- D-H Denavit-Hartenberg
- the vector of the end-effector position under the robot base frame is in R^(4x1), which follows Equation 4.
- the two endpoints of the cuff are denoted in the robot base coordinate system and can be defined in terms of p5, p6 and p7, which represent the 5th, 6th, and 7th joint positions, respectively, in the specific local coordinates.
- the transformation matrices 5 T 6 and 6 T 7 represent the transformations from the 5th joint to the 6th joint and from the 6th joint to 7th joint, respectively.
- the Jacobian matrix maps the first-order differential relationship between the active joints and the position of joint 4 (the modified end-effector position) R_p3 in Cartesian space.
- the robot Jacobian matrix is represented by J_R in R^(3x3).
- the individual elements of J_R follow from differentiating the forward kinematics with respect to the active joint angles.
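- where closed-form element expressions are not reproduced, a Jacobian of the same 3x3 shape can be approximated numerically by central differences on the forward kinematics. This is an illustrative stand-in for the analytic derivation, shown here with a linear toy map whose Jacobian is known exactly:

```python
import numpy as np

def numerical_jacobian(fk, q, eps=1e-6):
    # Central-difference approximation of d fk / d q for fk: R^n -> R^3.
    q = np.asarray(q, dtype=float)
    J = np.zeros((3, q.size))
    for i in range(q.size):
        dq = np.zeros_like(q)
        dq[i] = eps
        J[:, i] = (fk(q + dq) - fk(q - dq)) / (2.0 * eps)
    return J

# With a linear stand-in forward kinematics fk(q) = A q, the numerical
# Jacobian should recover A itself.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 0.0],
              [1.0, 1.0, 1.0]])
J = numerical_jacobian(lambda q: A @ q, np.array([0.1, 0.2, 0.3]))
```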
- the inverse kinematics (IK) model facilitates the achievement of a fully passive control mode for the ARAE. This allows the robot to manoeuvre the patient’s arm to a predefined position without necessitating any muscular effort on the patient’s part.
- the general inverse kinematics model can yield an infinite set of solutions represented by q_i. Therefore, the inverse kinematics (IK) model is constrained to computing only the active joint angles using the R_p3 input.
- R_p3 is denoted as [x, y, z]^T.
- the following equations are used to solve the three active joints with the input [x, y, z]^T:
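- the constrained IK can be illustrated for a generic base-yaw plus planar two-link chain. The link lengths, angle conventions, and the elbow-branch choice below are assumptions for illustration, not the ARAE's actual closed-form solution:

```python
import numpy as np

def ik_3dof(x, y, z, l1=0.30, l2=0.30):
    # q1: base yaw from the target's horizontal direction.
    q1 = np.arctan2(y, x)
    r = np.hypot(x, y)
    # Law of cosines in the vertical (r, z) plane for the inter-link angle.
    c3 = (r * r + z * z - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    q3 = np.arccos(np.clip(c3, -1.0, 1.0))      # one elbow branch
    q2 = np.arctan2(z, r) - np.arctan2(l2 * np.sin(q3), l1 + l2 * np.cos(q3))
    return q1, q2, q3

def fk_3dof(q1, q2, q3, l1=0.30, l2=0.30):
    # Forward check for the same chain (relative second-link angle).
    r = l1 * np.cos(q2) + l2 * np.cos(q2 + q3)
    z = l1 * np.sin(q2) + l2 * np.sin(q2 + q3)
    return np.array([r * np.cos(q1), r * np.sin(q1), z])

target = np.array([0.30, 0.20, 0.25])
q = ik_3dof(*target)
recovered = fk_3dof(*q)   # should reproduce the target exactly
```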
- the dynamic model of the ARAE achieves gravity compensation of the mechanical structure.
- the gravity compensation feature of the ARAE enables the entire system to operate in “transparent mode”, necessitating only minimal externally applied force from the subject.
- the robotic system is capable of maintaining a stable hover at the target position upon withdrawal of the external force.
- M_R(q_i) represents the mass (inertia) matrix of the robot
- C_R(q_i, q̇_i) refers to the Coriolis and centripetal matrix
- G_R(q_i) is the gravity vector
- τ_R is the required joint torque vector of the three motors. Assuming the absence of inertia in the robot system, only the gravity term G_R(q_i) is considered for calculating the compensated joint torques.
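- the gravity-only compensation can be illustrated on a planar two-link stand-in (not the ARAE's actual parallelogram dynamics); once inertia and Coriolis terms are neglected, the compensated motor torque is simply τ = G(q):

```python
import numpy as np

def gravity_torque(q1, q2, m1=1.0, m2=0.8, l1=0.30,
                   lc1=0.15, lc2=0.12, g=9.81):
    # Gravity vector G(q) of a planar 2-link arm; angles are measured
    # from the horizontal, lc1/lc2 are center-of-mass offsets along each
    # link. All numeric parameters are assumed for illustration.
    g2 = m2 * lc2 * g * np.cos(q1 + q2)
    g1 = (m1 * lc1 + m2 * l1) * g * np.cos(q1) + g2
    return np.array([g1, g2])

# With both links hanging vertically (q1 = 90 deg, q2 = 0), gravity
# produces no moment about either joint, so the compensation is zero.
tau = gravity_torque(np.pi / 2.0, 0.0)
```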
- the ARAE provides adaptive arm gravity compensation means such that the support force provided by the robot at the end-effector varies with the arm posture of the subject.
- the entire control framework reflects the interaction between the human-robot system.
- the system refers to the proposed adaptive GC of the human arm control framework.
- the proposed adaptive gravity compensation of the human arm is represented by the joint angle estimation method and calculation of the human required support force.
- the system may estimate or determine the human joint angles hj based on the P s , P e and P w obtaining from a fixed torso model or a sagittal plane model.
- the support force F_h for the human arm/upper limb is computed, and the torque τ_h is provided by the robot.
- τ_R is the gravity torque of the robot structure.
- the human inverse kinematics model may be used to derive the human joint angles h_j, including shoulder abduction/adduction (h1), shoulder flexion/extension (h2), shoulder internal/external rotation (h3), and elbow flexion/extension (h4).
- the established human inverse kinematics model is given by: where the s pE and s pw are the position vectors input to the human IK model.
- the IF represents the length of the forearm, defined as an initial parameter subject to anthropometric data. An assumption was made that the original point of the shoulder frame under the human shoulder frame is denoted as ps, and it is a fixed position.
- the length from spE to the fixed shoulder point is calculated as lucal rather than directly using the actual human upper arm length (lu) due to the change of the derived spE. [0078]
- the fixed torso model includes an assumption that the shoulder position remains fixed in the course of movement. In practical scenarios, the shoulder position typically shifts in conjunction with torso movements, particularly during actions like reaching for distant positions. Addressing the above, a Sagittal plane model is proposed and described in the following section.
- E’ is the projection of the elbow joint in the sagittal plane and the hip joint position H is assumed to be located on the sagittal plane and xp axis.
- lSH refers to the initial parameter from the hip to the shoulder.
- This parameter is an anthropometric value, unique to each user’s torso length.
- lPH is another parameter, referring to half of the torso width, specific to each subject.
- the required support force can be computed using the human arm dynamics model.
- the human arm model is modelled as a link mechanism with four degrees of freedom. The centers of mass of these two links are shown as mU and mF.
- the human arm dynamics model can be written as: Mh(hj)ḧj + Ch(hj, ḣj)ḣj + Gh(hj) = τh, where hj, ḣj and ḧj are the human arm joint angles and their derivatives, and τh is the joint torque generated by the human arm.
- n = 4 denotes the four DOFs of the human arm model, including three DOFs in the shoulder joint and one DOF in the elbow joint.
- the robot applies a support force Fh to the human arm, forming an external torque that primarily compensates for the arm’s gravity term Gh, as follows:
- the required force of the end-effector to support the human arm’s weight can then be calculated using the human arm model as follows: where hj are the estimated human joint angles.
- the JTh#(hj) is the pseudo-inverse of JTh(hj) ∈ R3×4 and Gh is the gravity term of the human arm.
- the lu and lF represent the lengths of the upper limb and forearm.
- the human arm gravity vector Gh(hj) refers to:
- the COMF and COMU refer to the center-of-mass ratios of the forearm and upper limb. Moreover, mU and mF are the masses of the upper limb and forearm. [0090]
- the calculated force Fh varies in both magnitude and direction across the workspace, depending on the human arm joint angles hj. Subsequently, the calculated force Fh is mapped to the compensated torque for human arm Th in robot joint space as:
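A minimal sketch of the support-force computation described above, using NumPy's Moore-Penrose pseudo-inverse; the Jacobian entries and gravity torques below are placeholders, since the real values follow from the estimated joint angles hj:

```python
import numpy as np

def support_force(J, G_h):
    """Least-squares support force F_h such that J^T F_h ≈ G_h.

    J   : (3, 4) end-effector Jacobian of the 4-DOF arm model (assumed shape)
    G_h : (4,)   gravity torque vector of the human arm
    """
    # F_h = (J^T)^+ G_h, with (.)^+ the Moore-Penrose pseudo-inverse
    return np.linalg.pinv(J.T) @ G_h

# illustrative numbers only; a real J comes from the estimated joint angles
J = np.array([[0.30, 0.25, 0.0, 0.20],
              [0.0,  0.10, 0.3, 0.05],
              [0.15, 0.0,  0.1, 0.25]])
G_h = np.array([4.0, 3.0, 1.0, 2.0])
F_h = support_force(J, G_h)  # 3-D force at the end-effector
```

The resulting force then varies with posture exactly as the text describes, because both J and G_h are functions of hj.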
- the Institutional Review Board of Nanyang Technological University (IRB-2022-821) approved the experimental protocol. After signing the informed consent form, four right-handed healthy subjects (4 males, 29 ± 2 years old) were involved in the experiments.
- the mean mass of the participants is 75.05 ± 2.5 kg and the mean height is 178 ± 4.23 cm.
- the mean upper limb length (lu) is 29.91 ± 0.25 cm and the forearm length (lF) is 26.43 ± 0.66 cm.
- the mean trunk length (lSH) is 38.50 ± 1.04 cm and the mean trunk width (lPH) is 17.93 ± 0.64 cm.
- the first experiment was conducted to evaluate the human joint angle estimation methods by the Mocap system.
- the subject wears the ARAE system and sits in the capture volume of the Mocap system.
- Six pre-defined positions are labelled on the table, shown in FIG. 12.
- the subject sits in front of a table and attaches the forearm to the ARAE, with the six labeled positions and one original/starting point located at the experimental table.
- the starting position is marked by the circular label that is closest to the human body.
- the original position is set for resting the arm.
- the label 3 is the farthest position from the body, requiring the subject to perform trunk compensation movements in the sagittal plane.
- the rest of the labels are located in a square. All subjects performed five trials for six labelled positions. Each trial involved continuous movements, specifically reaching and drinking activities performed with a real cylinder, effectively simulating ADL (Activities of Daily Living) task training.
- the detailed procedures are illustrated as follows:
- H2M Hand to Mouth
- the markers’ locations recorded by the Mocap system were sampled at 200 Hz. Visual3D, a professional motion analysis software, was then used to convert the marker locations into human joint angles and each joint position under the Mocap world frame. Furthermore, the PC logged the corresponding data of the motors and encoders of the ARAE robot at 100 Hz. After the experiments, all the collected data were analyzed offline with MATLAB R2022a. The kinematic data from Visual3D were downsampled to 100 Hz to synchronize with the measured data from the robot.
- sEMG Data Preprocessing To evaluate the adaptive arm gravity compensation framework of ARAE on the human body, muscle activities were recorded by the wireless sEMG system (Cometa Picolite, ITALY) at 2000 Hz. As shown in FIG. 13, the EMG electrodes were placed on four upper limb muscles, including Pectoralis Major (PM), Deltoid Medial (DM), Bicep Brachii (BB), and Triceps Brachii (TB), following SENIAM guidelines. At the beginning of each session, each subject performed a Maximum Voluntary Contraction (MVC), which was later used to normalize the EMG signals.
- PM Pectoralis Major
- DM Deltoid Medial
- BB Bicep Brachii
- TB Triceps Brachii
- the recorded data were processed offline in stages: two stages of notch filtering (using an IIR notch filter with a cutoff frequency of 50 Hz to eliminate powerline interference and another at 1.67 Hz to remove heartbeat noise); the signal was then subjected to high-pass filtering (via a 10th-order Butterworth filter with a 20 Hz cutoff frequency); the data were rectified by computing the absolute value; and subsequently smoothed with a low-pass filter (a 10th-order Butterworth filter with a 4 Hz cutoff frequency). All data from the sEMG channels were synchronized with the Mocap and ARAE systems through the DAQ board.
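The four-stage preprocessing described above could be sketched with SciPy as follows; the notch Q-factor is an assumption, as it is not specified in the text:

```python
import numpy as np
from scipy import signal

def preprocess_emg(raw, fs=2000):
    """sEMG envelope extraction following the stages described above.
    Notch Q-factor is assumed; other parameters match the text."""
    # stage 1: notch filters at 50 Hz (powerline) and 1.67 Hz (heartbeat)
    for f0 in (50.0, 1.67):
        b, a = signal.iirnotch(f0, 30.0, fs=fs)
        raw = signal.filtfilt(b, a, raw)
    # stage 2: 10th-order Butterworth high-pass, 20 Hz cutoff
    sos_hp = signal.butter(10, 20.0, "highpass", fs=fs, output="sos")
    x = signal.sosfiltfilt(sos_hp, raw)
    # stage 3: full-wave rectification
    x = np.abs(x)
    # stage 4: 10th-order Butterworth low-pass, 4 Hz cutoff (envelope)
    sos_lp = signal.butter(10, 4.0, "lowpass", fs=fs, output="sos")
    return signal.sosfiltfilt(sos_lp, x)

# synthetic amplitude-modulated 80 Hz burst standing in for raw sEMG
t = np.arange(0, 2.0, 1 / 2000)
emg = np.sin(2 * np.pi * 80 * t) * (1 + 0.5 * np.sin(2 * np.pi * 1 * t))
envelope = preprocess_emg(emg)
```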
- the percent change of the MAV was computed to measure the decrease in muscle activities from the control mode of Exp2-1 to Exp2-2 and Exp2-3, respectively.
- the ΔMAV% can be expressed as: ΔMAVi% = ((MAVExp2-i − MAVExp2-1) / MAVExp2-1) × 100%, i ∈ {2, 3} (53)
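The percent-change metric can be computed directly; the sample MAV values below are illustrative only, not measured data:

```python
def delta_mav_percent(mav_baseline, mav_mode):
    """Percent change of the mean absolute value (MAV) from the baseline
    mode (Exp2-1) to an assisted mode (Exp2-2 or Exp2-3). Negative
    values indicate reduced muscle activity."""
    return 100.0 * (mav_mode - mav_baseline) / mav_baseline

# e.g. a biceps MAV dropping from 0.40 to 0.18 (normalized units, assumed)
change = delta_mav_percent(0.40, 0.18)  # evaluates to -55.0
```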
- the robot joint angles were input into the derived model, which output the estimated human joint angles, including shoulder abduction/adduction (SA), shoulder flexion/extension (SF), shoulder internal/external rotation (SR), and elbow flexion/extension (EF).
- SA shoulder abduction/adduction
- SF shoulder flexion/extension
- SR shoulder internal/external rotation
- EF elbow flexion/extension
- FIG. 15 shows the angle estimation results for the label 3 of subject 4 as well as a comparison of the changes between the estimated angles of two models and Mocap measured angles (assumed as Ground truth).
- Results obtained from the label 3 position of Subject 4 demonstrate the change of the four joint angles over 30 seconds.
- subfigure (a) shows Shoulder Abduction/Adduction, subfigure (b) Shoulder Flexion/Extension, subfigure (c) Shoulder Internal/External Rotation, subfigure (d) Elbow Flexion, and subfigures (e) and (f) the derived shoulder position in the y-axis, which corresponds to the data at the circled positions in (a) and (b).
- the sagittal plane model can derive a more accurate shoulder position p s , which in turn will result in relatively precise estimated joint angles when the torso has significant movements, especially in the case of an abrupt change in the slope of the angle.
- the ARAE as proposed robot is compact, portable, and easy to set up, which is suitable for home-based therapy.
- the proposed adaptive arm support control framework can provide the support force with different human arm poses, offering simple implementation and adaptability to diverse users.
- the sagittal plane model is more suitable for estimating joint angles during torso movements.
- the shoulder undergoes significant movements in the sagittal plane (FIG. 14)
- the accuracy of both models decreases as the torso moves forward.
- the sagittal plane model significantly improves the angle estimation accuracy compared to the fixed torso model.
- This enhancement can make our entire framework more universally applicable, particularly during torso movements.
- numerous individuals who have suffered strokes display excessive use of compensatory trunk motions while reaching and placing objects, which affects recovery in stroke patients. Therefore, our proposed method can provide accurate joint angle estimation, subsequently enabling the generation of sufficiently precise gravity compensation.
- avoiding the trunk compensatory or torso movements improves upper extremity recovery in stroke patients.
- a benefit of the proposed models is the capability to monitor the patient’s arm posture during ADL, which is a main feature for clinical assessments of the upper limb.
- unlike approaches that rely on external sensors such as magnetic sensors and IMUs, or external RGB-D cameras [31], the proposed system and method do not need to rely on external sensors, which reduces the system complexity and enhances the robotic system’s usability.
- One of the observed benefits of the ARAE is the reduction of muscular activity of healthy subjects during simulated ADLs (Exp2). As shown in FIGs. 17A to 17C, the BB activity significantly reduces compared with other muscles. In the Forward Reaching (FR) task, a notable decrease in muscle activity was observed: for the Biceps Brachii (BB), the reduction ranged from -52.64% to -63.75%, and for the Deltoid Muscle (DM), it went from -19.78% to -24.78% when the control mode utilizing the sagittal plane model (Exp2-3) was implemented. [00128] The main reason for this effect may be due to the subjects performing trunk compensatory maneuvers during the FR task because position 3 is located in the farthest position.
- the sagittal plane model is able to better derive the changing shoulder joint position, thus obtaining more accurate joint angle prediction. In turn, it provides more precise arm support during FR tasks.
- the DM activity increases for the LR task when conducting the sagittal plane mode (Exp2-3). We hypothesize that the sagittal plane model possesses relatively weak generalization ability when the torso is moving in the lateral plane. Therefore, the calculated force based on the proposed control framework might produce extra assistive force in some postures, restricting arm movement during the LR task.
- an upper limb assistive method 700 comprises: in 710, receiving a first support position and a first support orientation of a support member in an actuator frame, the support member being coupled to an upper limb of a subject; in 720, determining a shoulder position of the subject based on at least one anthropometric measurement of the subject; in 730, based on the shoulder position of the subject, transforming the first support position and the first support orientation in the actuator frame to a corresponding second support position and a corresponding second support orientation in a subject shoulder frame, wherein the subject shoulder frame corresponds to the shoulder position of the subject; in 740, estimating a plurality of upper limb joint angles corresponding to a plurality of upper limb parts, based on the corresponding second support position and the corresponding second support orientation using a human inverse kinematics model corresponding to the subject; and in 750, determining a gravity-compensated force on the support member based on the plurality of upper limb joint angles.
- the method 700 further comprises: transforming the first support position and the first support orientation in the actuator frame to a corresponding intermediate support position and a corresponding intermediate support orientation in a subject pelvis frame; and based on the at least one anthropometric measurement and a sagittal plane in the subject pelvis frame, transforming the intermediate support position and the intermediate support orientation in the subject pelvis frame to the corresponding second support position and the corresponding second support orientation in the subject shoulder frame.
- the method 700 further comprises: forming a first arc in the sagittal plane with a first center and a first anthropometric measurement as a first radius; and forming a second arc in the sagittal plane with a second center and a second anthropometric measurement as a second radius.
- the method 700 further comprises: determining the shoulder position based on an intersection of the first arc and the second arc in the sagittal plane.
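The arc-intersection construction for the shoulder position can be sketched as a standard circle-circle intersection in the sagittal plane; the centres and radii below are illustrative stand-ins for the anthropometric lengths (e.g. lSH and lucal), not values from the disclosure:

```python
import math

def shoulder_from_arcs(c1, r1, c2, r2):
    """Intersection of two arcs in the sagittal plane (2-D points).
    c1/r1 and c2/r2 are the arc centres and anthropometric radii; the
    intersection with the larger vertical coordinate is returned, on the
    assumption that the shoulder lies above both centres."""
    dx, dy = c2[0] - c1[0], c2[1] - c1[1]
    d = math.hypot(dx, dy)
    if d > r1 + r2 or d < abs(r1 - r2) or d == 0:
        raise ValueError("arcs do not intersect")
    # distance from c1 to the chord joining the two intersection points
    a = (r1**2 - r2**2 + d**2) / (2 * d)
    h = math.sqrt(max(r1**2 - a**2, 0.0))
    mx, my = c1[0] + a * dx / d, c1[1] + a * dy / d
    p_up = (mx - h * dy / d, my + h * dx / d)
    p_dn = (mx + h * dy / d, my - h * dx / d)
    return p_up if p_up[1] >= p_dn[1] else p_dn

# e.g. one arc centred at the hip, one at the elbow projection E'
shoulder = shoulder_from_arcs((0.0, 0.0), 0.385, (0.25, 0.10), 0.30)
```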
- illustrated in FIG. 26 is a block diagram representative of components of processing system 1800 that may be provided within controller 120 to carry out the digital signal processing functions or computations in accordance with embodiments of the disclosure, or any other modules or sub-modules of the system.
- processing system 1800 may be provided within controller 120 to carry out the digital signal processing functions or computations in accordance with embodiments of the disclosure, or any other modules or sub-modules of the system.
- processing system 1800 may comprise controller 1801 and user interface 1802.
- User interface 1802 is arranged to enable manual interactions between a user and the computing module as required and for this purpose includes the input/output components required for the user to enter instructions to provide updates to each of these modules.
- components of user interface 1802 may vary from embodiment to embodiment but will typically include one or more of display 1840, keyboard 1835 and optical device 1836.
- Controller 1801 is in data communication with user interface 1802 via bus 1815 and includes memory 1820, processing unit, processing element or processor 1805 mounted on a circuit board that processes instructions and data for performing the method of this embodiment, an operating system 1806, an input/output (I/O) interface 1830 for communicating with user interface 1802 and a communications interface, in this embodiment in the form of a network card 1850.
- Network card 1850 may, for example, be utilized to send data from these modules via a wired or wireless network to other processing devices or to receive data via the wired or wireless network.
- Wireless networks that may be utilized by network card 1850 include, but are not limited to, Wireless-Fidelity (Wi-Fi), Bluetooth, Near Field Communication (NFC), cellular networks, satellite networks, telecommunication networks, Wide Area Networks (WAN), etc.
- Memory 1820 and operating system 1806 are in data communication with processor 1805 via bus 1810.
- the memory components include both volatile and non-volatile memory and more than one of each type of memory, including Random Access Memory (RAM) 1823, Read Only Memory (ROM) 1825 and a mass storage device 1845, the last comprising one or more solid-state drives (SSDs).
- RAM Random Access Memory
- ROM Read Only Memory
- SSDs solid-state drives
- the memory components described above comprise non-transitory computer-readable media and shall be taken to comprise all computer-readable media except for a transitory, propagating signal.
- the instructions are stored as program code in the memory components but can also be hardwired.
- Memory 1820 may include a kernel and/or programming modules such as a software application that may be stored in either volatile or non-volatile memory.
- the term “processor” is used to refer generically to any device or component that can process such instructions and may include: a microprocessor, a processing unit, a plurality of processing elements, a microcontroller, a programmable logic device or any other type of computational device. That is, processor 1805 may be provided by any suitable logic circuitry for receiving inputs, processing them in accordance with instructions stored in memory and generating outputs (for example to the memory components or on display 1840). In this embodiment, processor 1805 may be a single-core or multi-core processor with memory addressable space. In one example, processor 1805 may be multi-core, comprising, for example, an 8-core CPU. In another example, it could be a cluster of CPU cores operating in parallel to accelerate computations.
Landscapes
- Health & Medical Sciences (AREA)
- Epidemiology (AREA)
- Pain & Pain Management (AREA)
- Physical Education & Sports Medicine (AREA)
- Rehabilitation Therapy (AREA)
- Life Sciences & Earth Sciences (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Rehabilitation Tools (AREA)
Abstract
Disclosed herein is an upper limb assistive system and method. The upper limb assistive system is configured to receive a first support position and a first support orientation of the support member in an actuator frame, the support member coupled to the upper limb of the subject; determine a shoulder position based on at least one anthropometric measurement; based on the shoulder position, transform the first support position and the first support orientation in the actuator frame to a corresponding second support position and a corresponding second support orientation in a subject shoulder frame, wherein the subject shoulder frame corresponds to the shoulder position; estimate upper limb joint angles corresponding to a plurality of upper limb parts, based on the second support position and the second support orientation using a human inverse kinematics model; and determine a gravity-compensated force on the support member based on the upper limb joint angles.
Description
UPPER LIMB ASSISTIVE SYSTEM AND METHOD
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application claims the benefit of priority to the Singapore application no. 10202302472T filed September 4, 2023, the contents of which are hereby incorporated by reference in their entirety for all purposes.
TECHNICAL FIELD
[0002] This application relates to rehabilitation and assistance, and more particularly to an upper limb assistive system and an upper limb assistive method. The system may assist patients in self-conducting ADL (Activities of Daily Living) tasks and their interaction with real objects.
BACKGROUND
[0003] Rehabilitation plays an important role in improving the quality of life of a subject with mobility impairment of the upper limb. Current technologies have become increasingly prevalent in physical therapy, primarily targeting clinical settings. However, state-of-the-art systems mainly focus on providing rehabilitation effects rather than assistive functions for ADL tasks. Substantial evidence suggests that patients frequently encounter challenges in applying the skills acquired from robot-assisted therapy to their daily activities. Therefore, there is demand for a compact and versatile system for use not only in physical therapy but also for daily assistance.
SUMMARY
[0004] According to an aspect, disclosed herein is an upper limb assistive system. The upper limb assistive system may comprise: a support member for supporting an upper limb of a subject; an actuator coupled to the support member to move the support member; and a
computing module in signal communication with the actuator, the computing module comprising a processing unit; and a non-transitory media storing instructions readable by the processing unit that, when executed by the processing unit, cause the processing unit to: receive a first support position and a first support orientation of the support member in an actuator frame, the support member being coupled to the upper limb of the subject; determine a shoulder position of the subject based on at least one anthropometric measurement of the subject; based on the shoulder position of the subject, transform the first support position and the first support orientation in the actuator frame to a corresponding second support position and a corresponding second support orientation in a subject shoulder frame, wherein the subject shoulder frame corresponds to the shoulder position of the subject; estimate a plurality of upper limb joint angles corresponding to a plurality of upper limb parts, based on the corresponding second support position and the corresponding second support orientation using a human inverse kinematics model corresponding to the subject; and determine a gravity-compensated force on the support member based on the plurality of upper limb joint angles.
[0005] According to another aspect, disclosed herein is an upper limb assistive method. The upper limb assistive method comprises: receiving a first support position and a first support orientation of a support member in an actuator frame, the support member being coupled to an upper limb of a subject; determining a shoulder position of the subject based on at least one anthropometric measurement of the subject; based on the shoulder position of the subject, transforming the first support position and the first support orientation in the actuator frame to a corresponding second support position and a corresponding second support orientation in a subject shoulder frame, wherein the subject shoulder frame corresponds to the shoulder position of the subject; estimating a plurality of upper limb joint angles corresponding to a plurality of upper limb parts, based on the corresponding second support position and the corresponding
second support orientation using a human inverse kinematics model corresponding to the subject; and determining a gravity-compensated force on the support member based on the plurality of upper limb joint angles.
BRIEF DESCRIPTION OF THE DRAWINGS
[0006] Various embodiments of the present disclosure are described below with reference to the following drawings:
[0007] FIG. 1 is a schematic diagram of an upper limb assistive system according to embodiments of the present disclosure;
[0008] FIG. 2 shows an exemplary actuator according to various embodiments;
[0009] FIG. 3 is a process flow diagram of the upper limb assistive system according to various embodiments;
[0010] FIG. 4 is another process flow diagram of the upper limb assistive system according to various embodiments;
[0011] FIG. 5 is a perspective schematic view of a subject and a sagittal plane according to various embodiments;
[0012] FIG. 6 is a front view of FIG. 5;
[0013] FIG. 7 is a schematic showing an upper limb of a subject and a proposed ARAE robotic system. The upper limb or human arm is modeled as a four degree-of-freedom link mechanism, including 3 revolute joints at the shoulder joint and 1 revolute joint at the elbow joint (E) under the human shoulder base coordinate OS. The human shoulder base frame is denoted as {OS}: OS-xSySzS. S, E and W represent the position vectors of the shoulder joint, elbow joint, and wrist joint under the human shoulder base coordinate.
[0014] FIG. 8 is a system schematic map of the ARAE system of FIG. 7;
[0015] FIG. 9 shows a system control framework demonstrating the interaction between the subject/human and the ARAE robotic system;
[0016] FIG. 10 is a schematic diagram for obtaining the shoulder position for a sagittal plane model;
[0017] FIG. 11 shows a subject using the ARAE system;
[0018] FIG. 12 shows a proposed system for evaluation;
[0019] FIG. 13 shows the placement of sEMG electrodes and markers on the subject;
[0020] FIG. 14 is a plot showing group classification based on the distance from the real-time elbow joint to the calibrated shoulder joint as the percentage of the actual upper limb length;
[0021] FIG. 15 shows a comparison between Mocap measured angles and estimated joint angles derived from the Fixed torso model and Sagittal plane model;
[0022] FIG.16 shows a performance comparison of two models (fixed torso model and sagittal plane model) on four types of motion patterns;
[0023] FIG. 17A shows an example of the sEMG profile for the Biceps Brachii (BB) in one subject while performing the ’Forward Reaching’ task.
[0024] FIGs. 17B and 17C illustrate the net changes in EMG for four muscle activation when transitioning from No Robot mode to With Robot mode, under the fixed torso model (Exp2-2) and the sagittal plane model (Exp2-3), respectively;
[0025] FIG. 18 is a schematic diagram illustrating an upper limb assistive method according to various embodiments of the present disclosure; and
[0026] FIG. 19 shows a block diagram of a processing system for implementing embodiments of the present disclosure.
DETAILED DESCRIPTION
[0027] The following detailed description is made with reference to the accompanying drawings, showing details and embodiments of the present disclosure for the purposes of illustration. Features that are described in the context of an embodiment may correspondingly be applicable to the same or similar features in the other embodiments, even if not explicitly described in these other embodiments. Additions and/or combinations and/or alternatives as described for a feature in the context of an embodiment may correspondingly be applicable to the same or similar feature in the other embodiments.
[0028] In the context of various embodiments, the articles “a”, “an” and “the” as used with regard to a feature or element include a reference to one or more of the features or elements.
[0029] In the context of various embodiments, the term “about” or “approximately” as applied to a numeric value encompasses the exact value and a reasonable variance as generally understood in the relevant technical field, e.g., within 10% of the specified value.
[0030] As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
[0031] The term “pose” may refer to a position and an orientation of an object in a frame. The term “position” may refer to a location or coordinate (for example, X-coordinate, Y coordinate, Z coordinate) of an object or part of an object in a space or a frame. The term “orientation” may refer to a facing or angle (for example, a X-direction vector, a Y-direction vector, a Z-direction vector) of an object or part of an object in a space or a frame. Each of the terms “pose”, “position” and “orientation” may be defined by a coordinate system, such as a cartesian coordinate. The term “frame” may refer to a set of coordinates in which measurements of sizes, pose, positions, orientations, motions, etc. may be made.
[0032] As used herein, the term “anthropometric measurement” refers to quantitative measurements of the body of a subject. The anthropometric measurement may be non-invasive
in nature, used to assess the size, proportions and composition of the human body. Examples of anthropometric measurement may include height, recumbent length, circumferences (head, waist, hip, mid-upper arm, mid-thigh, calf, chest, neck), limb lengths (arm-span, demi-span, half-span), abdominal sagittal diameter, etc.
[0033] With the ability to provide a substantial range of motion (ROM), end-effector type systems may be suitable for providing upper limb assistance for a subject. However, one limitation of end-effector type systems is bulkiness, which limits their feasibility for home-based therapy. Further, the need for attaching sensors to the subject to measure the upper limb joint angles adds bulk and hinders movement of the subject’s upper limb. Non-contact sensors, such as image-based sensors, which do not add to the bulk of the system, often experience inaccuracies during measurement and are typically expensive. In addition, due to the lack of correction/limits during trajectory tracking, there is a lack of a safe movement range, which may result in over-extension or over-movement of the subject’s upper limb during rehabilitation.
[0034] To effectively support task-oriented rehabilitation and Activities of Daily Living (ADL) assistance to the subject, the assistive system may provide gravity compensation (GC) forces to the upper limb. The GC forces may reduce the muscular effort of the subject, enhancing the transparency of movement during rehabilitation. However, one challenge with providing GC forces for end-effector type systems is the determination of the GC forces during operation. The GC forces may change responsive to the upper limb position and/or orientation (or pose) in three-dimensional space. Conventional end-effector type systems measure the human joint angles during movements with wearable magnetic sensors and compute the support force based on the human dynamics model. However, the wearable magnetic sensors are often cumbersome, intervening in the subject’s movement, and introduce complexity during operation. In addition, inaccuracy may be introduced to the readings from the magnetic sensors due to sensor drift and/or noise during measurements.
[0035] The present disclosure proposes an upper limb assistive system and an upper limb assistive method. According to various embodiments, in addressing the various challenges, the proposed upper limb assistive system is mobile, light weight and does not require attaching sensors on the subject during rehabilitation. In addition, the proposed assistive system may concurrently perform motion tracking as well as provide a gravity-compensated (GC) force to fully support the upper limb of the subject. Easy implementation of the proposed system facilitates home-based therapy and/or rehabilitation sessions.
[0036] In an exemplary scenario, the proposed upper limb assistive system and method may be used to provide a gravity-compensated (GC) force or supporting force to a subject or patient during rehabilitation, such as during physiotherapy. In another an exemplary scenario, the proposed upper limb assistive system and method may be used to provide a gravity- compensated (GC) force or supporting force to a subject when performing an upper limb activity, such as using a touch screen device.
[0037] According to various embodiments, the proposed upper limb assistive system may determine various upper limb joint angles, such as the shoulder angles and the elbow angle. The proposed system may estimate or determine the upper limb joint angles based on a position and orientation of a forearm of the subject. In some embodiments, the position and orientation of the forearm of the subject may be determined based on a position and an orientation of a support member coupled to the forearm. Further, the system may determine and provide a gravity- compensated force or a supporting force to support the arm of the subject during movement/activities.
[0038] FIG. 1 illustrates an upper limb assistive system 100 for providing assistance or support to a subject 90, according to various embodiments of the present disclosure. The system 100 may be a personalized system 100 customizable according to each user/subject. For example, the system 100 may be configured or customized based on anthropometric
measurements of the subject 90. The system 100 may include an actuator 110 with a base 112 which acts as a reference defining an actuator frame {ORO}. The actuator 110 may further include an actuator arm 114 coupling a support member 116 to the base 112. The actuator 110 may be configured to move the support member 116 or provide a force to the support member 116 via the actuator arm 114. In various embodiments, the actuator 110 may be in signal communication with a controller or a computing module 120. The computing module 120 may control the actuator 110 to vary a support pose (comprising a support position and a support orientation) of the support member 116.
[0039] In various embodiments, the support member 116 may be coupleable or attachable to an upper limb 91 of a subject 90. The support member 116 may support the upper limb 91 of the subject 90 during activity. Referring to FIG. 1 , the upper limb 91 may include a plurality of upper limb parts such as shoulder joint 92, an upper arm 93, an elbow joint 94, a forearm 95, a wrist joint 96, and a hand 97. The support member 116 may be suitably shaped/formed to conform to a surface or a contour of the upper limb 91 or a part of the upper limb 91, such as the upper arm 93. As examples, the support member 116 may be a forearm cuff, an upper arm cuff or a splint.
[0040] In various embodiments, the support member 116 may be attached to the forearm 95 of the subject 90. The support member 116 may be attached between the elbow joint 94 and the wrist joint 96 of the subject 90. In some examples, the support member 116 may be attached to the forearm 95 such that the support pose (comprising the support position and the support orientation) of the support member 116 is representative or substantially representative of a forearm pose (forearm position and forearm orientation) of the forearm 95 of the subject 90. In some examples, straps of a forearm cuff may be provided to secure the forearm 95 of the subject
90 to the support member 116. In other embodiments, the support member 116 may be attached
to the upper arm 93 of the subject 90. Hence, the support member 116 may also be attached between the shoulder joint 92 and the elbow joint 94.
[0041] In some implementations as shown in FIG. 2, the actuator 110 may include a base 112 coupled to an actuator arm 114 configured as a linkage structure 114. The linkage structure 114 may further be coupled to the support member 116. The linkage structure 114 may include a plurality of linkage members operably coupled via a plurality of joints 115/117. In some embodiments, the plurality of joints may include active joints 115 or actuated joints as well as passive joints 117. Active joints 115 are actuatable joints and passive joints 117 are non-actuatable joints. In some embodiments, each of the active joints 115 may be driven by a respective driving member, such as a quasi-direct drive motor. Quasi-direct drive motors may provide high torque and backdrivability, which enhances the performance of physical human-robot interaction (pHRI). In other embodiments, each of the active joints may be driven by a respective Series Elastic Actuator (SEA), which provides high backdrivability despite having limitations to the bandwidth.
[0042] The system 100 may include sensors for measuring a support pose (comprising a support position and a support orientation) of the support member 116 in the actuator frame {ORO}. In some embodiments, the sensors may be encoders coupled to each of the joints, active joints and passive joints. Alternatively, the sensors may be integrated with each of the driving members or motors for measuring the pose of the support member 116 in the actuator frame {ORO}. In the example as shown in FIG. 2, the linkage structure 114 may be a four-bar linkage structure which includes three active joints 115 driven by quasi-direct drive motors and two passive revolute joints 117. In some embodiments, a mechanical limit 118 may be provided to limit the range of motion (ROM) of the actuator 110 and the upper limb 91 of the subject 90.
[0043] Further referring to FIG. 3, according to various embodiments of the upper limb assistive system 100, the computing module 120 may be configured to receive measurement
data from the sensors to determine a first support pose 210 in the actuator frame {ORO}, or in other words, a first support position 210a and a first support orientation 210b of the support member 116 in the actuator frame {ORO}. The support member 116 may be coupled to the forearm 95 of the subject 90, and movable/displaceable by both the subject 90 and the actuator 110. Hence, the first support pose 210 may be reflective or representative of the forearm pose (forearm position and forearm orientation) of the forearm 95 of the subject 90 in the actuator frame {ORO}. In some examples, the first support position 210a and the first support orientation 210b of the support member 116 correspond to the elbow position 94 and the wrist position 96 of the subject 90, respectively. In some examples, the first support position 210a and the first support orientation 210b of the support member 116 correspond to the position and the orientation of the forearm 95 of the subject 90.
[0044] In various embodiments, the computing module 120 may determine the first support position 210a and the first support orientation 210b based on the relative positions between the plurality of linkage members. For example, the first support position 210a and the first support orientation 210b may be determined based on each of the joint angles of the various joints (active joints and passive joints). It may be appreciated that at the present step, the first support position 210a and the first support orientation 210b of the support member 116 may be measured with reference to the actuator frame {ORO}. In other embodiments, without the need for separate sensors, the actuator 110 may provide the first support position 210a and the first support orientation 210b directly to the computing module 120.
[0045] Responsive to receiving the first support pose 210 of the support member 116, the computing module 120 may determine a shoulder position 220 of the subject 90 based on one or more anthropometric measurements of the subject 90. The shoulder position 220 may be represented as a position in the actuator frame {ORO}. The shoulder position 220 may be a fixed position in the actuator frame {ORO}. The anthropometric measurements may include a length
of the upper arm 93, a length of the forearm 95, a torso width of the subject 90, a distance between a hip and a shoulder of the subject 90, etc.
[0046] In various embodiments, based on the shoulder position 220 of the subject 90, the computing module 120 may transform the first support pose 210 in the actuator frame {ORO} to a corresponding second support pose 230 in a subject shoulder frame {Os}. In other words, the first support position 210a and the first support orientation 210b in the actuator frame {ORO} may be transformed to a second support position 230a and a second support orientation 230b in the subject shoulder frame {Os}. The second support position 230a corresponds to the first support position 210a. The second support orientation 230b corresponds to the first support orientation 210b. In some embodiments, the subject shoulder frame {Os} may correspond to the shoulder position 220 of the subject. In some embodiments, an origin of the subject shoulder frame {Os} may be positioned at the shoulder position 220.
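The frame transformation described in this paragraph can be sketched numerically as follows. This is a minimal illustration: the function name is hypothetical, and the default assumption that {Os} is a pure translation of {ORO} (identity rotation) is not stated in the text.

```python
import numpy as np

def to_shoulder_frame(p_support, R_support, p_shoulder, R_shoulder=None):
    """Map the first support pose (p_support, R_support), expressed in the
    actuator frame {ORO}, to the second support pose in the shoulder frame
    {Os}, whose origin sits at the shoulder position p_shoulder (also given
    in {ORO}). R_shoulder is the orientation of {Os} in {ORO}; identity
    when the two frames are axis-aligned."""
    if R_shoulder is None:
        R_shoulder = np.eye(3)
    p_s = R_shoulder.T @ (p_support - p_shoulder)   # second support position 230a
    R_s = R_shoulder.T @ R_support                  # second support orientation 230b
    return p_s, R_s
```

With the shoulder at (0.1, 0.2, 0.3) and the support at (0.4, 0.2, 0.3) in {ORO}, the support sits 0.3 m along the x-axis of {Os}.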
[0047] In various embodiments, using a human inverse kinematics model corresponding to the subject 90, the computing module 120 may estimate or determine the upper limb joint angles 240 of the subject 90 corresponding to the upper limb parts, based on the second support position 230a and the second support orientation 230b. The upper limb joint angles 240 may include multiple shoulder angles such as shoulder abduction/adduction angle, shoulder flexion/extension angle, shoulder internal/external rotation angle, and elbow angle such as elbow flexion/extension angle. It may be appreciated that the human inverse kinematics model may include inputs of anthropometric measurements of the subject 90.
[0048] In various embodiments, the computing module 120 may determine a gravity-compensated (GC) force 250 on the support member 116 based on the plurality of upper limb joint angles 240. Further, the computing module 120 may control or actuate the actuator 110 to provide the GC force to the upper limb 91 of the subject 90 via the support member 116. The GC force 250 may hold the upper limb 91 of the subject 90 to provide an upper limb support to the subject. In various embodiments, for full upper limb assistance, the GC force 250 may include a magnitude such that all of the weight of the upper limb 91 of the subject 90 is supported by the support member 116. In other embodiments, for physiological training or rehabilitation, the GC force 250 may include a magnitude such that only a portion of the weight of the upper limb 91 of the subject 90 is supported by the support member 116. It may be appreciated that the above-described process may be iteratively performed to provide the GC forces 250 to the subject 90 in the course of activity or rehabilitation.
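The full-versus-partial support described above amounts to scaling the arm-weight force by an assistance ratio. A trivial sketch, with a hypothetical function name and interface not taken from the text:

```python
def gc_force_magnitude(arm_weight_n: float, support_ratio: float = 1.0) -> float:
    """Magnitude of the gravity-compensated (GC) support force.

    support_ratio = 1.0 supports the entire arm weight (full assistance);
    values below 1.0 leave part of the load to the user, as in
    rehabilitation or physiological training."""
    if not 0.0 <= support_ratio <= 1.0:
        raise ValueError("support_ratio must lie in [0, 1]")
    return support_ratio * arm_weight_n
```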
[0049] In some embodiments, the computing module 120 may determine the shoulder position 220 based on a predetermined and fixed position. Hence, the shoulder position 220 may be a fixed position relative to the actuator 110, or in other words, the shoulder position 220 may be fixed relative to the actuator frame {ORO}. In such embodiments, it is assumed that the torso and the shoulder position 220 of the subject 90 do not change during the course of activity. Hence, the computing module 120 may determine the shoulder position 220 based on at least one anthropometric measurement, such as a torso length of the subject 90. However, a primary limitation of this approach is that in practical scenarios, the shoulder position 220 typically shifts in conjunction with torso movements, thus introducing some inaccuracy to the method.
[0050] Due to the movement of the shoulder position during activity, establishing the shoulder position as a fixed position may introduce inaccuracies. To address this limitation, according to various embodiments as shown in FIGs. 4 to 6, it is proposed to determine the shoulder position via a sagittal plane 80 and a fixed subject pelvis frame {Op}, which is closer to the actual situation when the subject 90 is performing an upper limb 91 activity, especially when the subject 90 is seated. Despite potential torso movements in various directions, the fixed subject pelvis frame {Op} provides a stable reference point.
[0051] According to various embodiments, the computing module 120 may transform the first support pose 210 (the first support position 210a and the first support orientation 210b) in the actuator frame {ORO} to a corresponding intermediate support pose 211 (an intermediate support position 211a and an intermediate support orientation 211b) in a subject pelvis frame {Op}. The computing module 120 may transform the first support pose 210 to the corresponding intermediate support pose 211 based on one or more anthropometric measurements of the subject 90 and the sagittal plane 80 in the subject pelvis frame {Op}. As illustrated in FIGs. 5 and 6, the sagittal plane 80 may be defined by a hip joint position 98 of the subject. For example, the sagittal plane 80 may be tangent to the hip joint position 98.
[0052] According to various embodiments, the computing module 120 may form a first arc 213 in the sagittal plane 80. The first arc 213 may be formed with a first center and a first radius. The first center may be the hip joint position 98, and the first radius may be a first anthropometric measurement, such as a distance R1 between the hip joint position 98 and a shoulder joint 92 of the subject 90. Further, the computing module 120 may form a second arc 215 in the sagittal plane 80. The second arc 215 may be formed with a second center and a second radius. The second center may be a projected elbow joint 99 in the sagittal plane 80, and the second radius may be a second anthropometric measurement, such as a distance R2 between the projected elbow joint 99 and the shoulder joint 92 of the subject 90. Upon forming the first arc 213 and the second arc 215, the computing module 120 may determine the shoulder position 220 based on an intersection of the first arc 213 and the second arc 215 in the sagittal plane 80.
[0053] Based on the shoulder position 220 as determined from the first arc 213 and the second arc 215, the computing module 120 may transform the intermediate support pose 211 (the intermediate support position 211a and the intermediate support orientation 211b) in the subject pelvis frame {Op} to a corresponding second support pose 230 (the second support position 230a and the second support orientation 230b) in a subject shoulder frame {Os}.
[0054] Similar to previous embodiments, the computing module 120 may then estimate or determine the upper limb joint angles 240 of the subject 90 corresponding to the upper limb parts, based on the second support position 230a and the second support orientation 230b, using a human inverse kinematics model corresponding to the subject 90. Thereafter, the computing module 120 may determine a gravity-compensated (GC) force 250 on the support member 116 based on the plurality of upper limb joint angles 240, and actuate the actuator 110 to provide the GC force to the upper limb 91 of the subject 90 via the support member 116.
[0055] Assistive Robotic Arm Extender (ARAE)
[0056] According to an exemplary embodiment, FIG. 7 shows a schematic of the proposed upper limb assistive system 100, otherwise known as the Assistive Robotic Arm Extender (ARAE). FIGs. 8 and 9 are system block diagrams of the ARAE. The ARAE is configured to provide arm support in three-dimensional (3D) space for functional task training. The ARAE is capable of achieving high transparency in movement within 3D space and offers adaptive arm support based on estimated human postures. Configurations of the ARAE allow it to assist patients with Manual Muscle Testing (MMT) scores ranging from 1 to 4 in performing activities of daily living (ADLs) and interacting with actual environments.
[0057] As shown in FIG. 7, the ARAE is provided with quasi-direct drive (QDD) motors, encoders, and a parallel mechanism, incorporating three active degrees of freedom (DOFs) and two passive DOFs. The ARAE system includes an adaptive control framework for gravity compensation on the human arm, via determining a compensatory force based on the determined estimation of the subject's postures. The proposed ARAE system comprises two components: 1) human joint angle estimation and 2) upper limb support force determination.
[0058] As shown in FIG. 7, the ARAE comprises a 3-DOF actuated robotic arm or actuator 110 with a base 112 and a linkage structure 114. An end-effector module or support member 116 is coupled to the linkage structure 114 via two passive joints. Motor 1 (motor 115a) mounted on
the base 112 enables rotary movement of the linkage structure 114 at the base 112. Motor 2 (motor 115b) and Motor 3 (motor 115c) are positioned atop Motor 1. Motor 2 drives Link 1 (link 114a) and Motor 3 drives Link 2 (link 114b). Link 1 and Link 2 together with Link 3 (link 114c) and Link 4 (link 114d) form a parallelogram mechanism, with Link 4 as the output link that interfaces with the end-effector module 116. Mechanical limits may be provided to act as the protective limit for Motors 2 and 3.
[0059] Links 1 to 4 are fabricated from carbon fibre tubes and are hence configured with high stiffness and minimal weight. This enables a compact configuration, optimizing the transmission of actuator torque from the robotic arm to the subject. The end-effector module 116 is attached to Link 4, secured thereto with a clamping mechanism on Link 4 and an anti-rotation screw. For motion tracking and feedback, two encoders 117 with 2-DOF movement are provided coupled to the end-effector module 116, with the respective output shafts of the encoders 117 serving as the rotation axes. The end-effector module 116 includes a forearm cuff to securely hold the subject's forearm 95 while allowing slight rotational adjustment following the curve beneath the cuff.
[0060] Achieving the mechanical adaptability of wearable robots, including assistive or rehabilitation robots, relies heavily on high-performance actuators. Criteria for the actuators include low weight, high backdrivability, and high bandwidth. The ARAE includes three QDD motors (TMotor AK10-9 V2.0) to provide the 3-DOF movement, each with a peak torque of 48 Nm. The maximum external load applicable at the end-effector module is 12.43 kg when the parallel mechanism extends to the position of maximum working range. At this position, the end-effector load applies the largest moment on the motors. The peak loads at the end-effector may support the arm weight of 99% of human subjects and facilitate user activity. The ARAE is thus suitably configured for a wide range of human body sizes and for strength training. The maximum joint speed of 26 rad/s is approximately 8 times
higher than the speeds typically required in ADL tasks. The backdrive torque is 0.8 Nm, providing a high degree of transparency during human-robot interaction (HRI). Safety limits are set for position, velocity, and torque control.
[0061] Referring to FIG. 8, an STM32F429 (STMicroelectronics) microcontroller (MCU) acts as the embedded system to communicate with the encoders and T-Motors through the CAN (Controller Area Network) bus while connected with the Linux application through a serial port. The MCU communicates with the external ADC chip AD7606 through SPI (Serial Peripheral Interface). The sampling frequency of the encoder feedback and the low-level actuator control loop is fixed at 1 kHz. A computer running Linux is used as a host PC for the data logging and user interface. The communication between the MCU and the host PC is written in C++ as nodes for the robot operating system (ROS). The PC-based Graphical User Interface (GUI), developed in C++, facilitates real-time monitoring of the ARAE's operational status.
[0062] The forward kinematics model of the ARAE determines the end-effector position of the ARAE from the five input joint angles qi (i = 1, 2, 3, 4, 5). As shown in FIG. 7, {ORO} refers to the robot base coordinate frame, and the end-effector frame refers to the coordinate of the end-effector position of the robotic arm. The Denavit-Hartenberg (D-H) algorithm was utilized for the derivation of the kinematics model of the ARAE.
[0063] As shown in FIG. 7, the joint angles and structural parameters are assigned to each joint and link. Since the main drive mechanism of the ARAE is a parallelogram structure, the kinematic model can be simplified to the serial-link mechanism represented by the dashed line in FIG. 7.
[0064] The Denavit-Hartenberg (D-H) algorithm is applied to derive the kinematics model of the ARAE. The D-H parameters are defined in Table 1 below. In this table, q22 can be articulated as a function of q3 (Equation 1), leveraging the properties of the parallelogram mechanism. The lengths of the links are shown in Table 2.
where c denotes cos and s denotes sin. The resultant matrix 0T5 relates the fifth frame to the base coordinate frame of the robot through the chain product of the successive link transforms, which is denoted as 0T5 = 0T1 1T2 2T3 3T4 4T5.
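The chain product above can be sketched with the standard D-H convention. The helper below is generic; the parameter rows used in the usage are illustrative and are not the ARAE's Table 1 values.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform (i-1)T(i) under the standard D-H convention."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def chain(dh_rows):
    """0T5 = 0T1 @ 1T2 @ ... @ 4T5 for a list of (theta, d, a, alpha) rows."""
    T = np.eye(4)
    for row in dh_rows:
        T = T @ dh_transform(*row)
    return T
```

For example, two straight links of length 1 (all angles zero) place the end frame 2 units along the base x-axis.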
[0066] The vector of the end-effector position is Rp5 ∈ R4×1 in the robot base frame, which follows Equation 4. The two endpoints of the cuff are denoted as Rp6 and Rp7 in the robot base coordinate system, which can be defined as:
where p5, p6 and p7 represent the 5th, 6th, and 7th joint positions, respectively, in the specific local coordinates. The transformation matrices 5T6 and 6T7 represent the transformations from the 5th joint to the 6th joint and from the 6th joint to the 7th joint, respectively.
[0067] As the ARAE has three active joints (q1, q2, q3), the Jacobian matrix maps the first-order differential relationship between the active joints and the position of joint 4 (the modified end-effector position) Rp4 in Cartesian space. The robot Jacobian matrix is represented by JR ∈ R3×3, and its individual elements are as follows:
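The element-wise expressions of JR are not reproduced in this text. As an alternative, the Jacobian of any forward kinematics function can be approximated by finite differences; the sketch below is generic, and the toy `fk` used in the test is not the ARAE's kinematics.

```python
import numpy as np

def numerical_jacobian(fk, q, eps=1e-6):
    """Finite-difference Jacobian J (3x3): partial derivatives of the
    3D position fk(q) with respect to the three active joint angles q."""
    q = np.asarray(q, dtype=float)
    p0 = fk(q)
    J = np.zeros((3, 3))
    for i in range(3):
        dq = q.copy()
        dq[i] += eps
        J[:, i] = (fk(dq) - p0) / eps
    return J
```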
[0068] The inverse kinematics (IK) model facilitates the achievement of a fully passive control mode for the ARAE. This allows the robot to manoeuvre the patient's arm to a predefined position without necessitating any muscular effort on the patient's part. Given that the ARAE is equipped with two passive joints, denoted as q4 and q5, the general inverse kinematics model can yield an infinite set of solutions represented by qi. Therefore, the inverse kinematics (IK) model is constrained to computing only the active joint angles using the Rp4 input. The Rp4 is denoted as [x, y, z]T. The following equations are used to solve the three active joints with the input of [x, y, z]T:
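The ARAE's closed-form IK equations are not reproduced here. For illustration, a closed-form solution for a comparable 3-DOF arm (base yaw plus a planar two-link chain) is sketched below; the structure and link lengths are assumptions, not the ARAE's actual geometry.

```python
import numpy as np

def ik_3dof(x, y, z, l1, l2):
    """Closed-form IK for a base-yaw + planar 2R arm (a simplified stand-in
    for a 3-DOF linkage): returns (q1, q2, q3) reaching target [x, y, z]."""
    q1 = np.arctan2(y, x)                          # base rotation about z
    r = np.hypot(x, y)                             # radial reach in the yawed plane
    c3 = (r**2 + z**2 - l1**2 - l2**2) / (2 * l1 * l2)
    q3 = np.arccos(np.clip(c3, -1.0, 1.0))         # elbow-like joint angle
    q2 = np.arctan2(z, r) - np.arctan2(l2 * np.sin(q3), l1 + l2 * np.cos(q3))
    return q1, q2, q3
```

A quick consistency check is to substitute the returned angles back into the planar forward kinematics r = l1 cos(q2) + l2 cos(q2 + q3), z = l1 sin(q2) + l2 sin(q2 + q3).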
[0071] The dynamic model of the ARAE achieves gravity compensation of the mechanical structure. The gravity compensation feature of the ARAE enables the entire system to operate in “transparent mode”, necessitating only minimal externally applied force from the subject. Furthermore, the robotic system is capable of maintaining a stable hover at the target position upon withdrawal of the external force. The derived Euler-Lagrange dynamics in joint space are governed by the following Equation:
[0072] where qi, q'i, q''i are limited to the three input joint angles (i = 1, 2, 3) corresponding to the three active joints / active joint angles of the ARAE. MR(qi) represents the mass (inertia) matrix of the robot, CR(qi, q'i) refers to the Coriolis and centripetal matrix, GR(qi) is the gravity vector, and TR is the required joint torque vector of the three motors. Assuming the absence of inertia in the robot system, only the gravity term GR(qi) is considered for calculating the compensated joint torques.
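With the inertia and Coriolis terms dropped as described, the compensation torque reduces to the gravity vector alone. A sketch for a planar two-link stand-in follows; the masses, lengths, and COM ratios are made-up values, not the ARAE's.

```python
import numpy as np

GRAV = 9.81  # m/s^2

def gravity_torque(q, m=(1.0, 0.8), l=(0.3, 0.25), com=(0.5, 0.5)):
    """Gravity vector G(q) for a planar 2-link arm, angles measured from
    horizontal. Neglecting M(q)q'' and C(q, q')q', the compensation
    torque is simply tau = G(q)."""
    q1, q2 = q
    r1 = com[0] * l[0]                      # COM distance along link 1
    r2 = com[1] * l[1]                      # COM distance along link 2
    g2 = m[1] * GRAV * r2 * np.cos(q1 + q2)
    g1 = m[0] * GRAV * r1 * np.cos(q1) + m[1] * GRAV * l[0] * np.cos(q1) + g2
    return np.array([g1, g2])
```

When the arm points straight up (q1 = pi/2, q2 = 0), both gravity torques vanish; when it is horizontal, the base joint carries the largest load.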
[0073] The ARAE provides adaptive arm gravity compensation such that the support force provided by the robot at the end-effector varies with the arm posture of the subject. As shown in FIG. 9, the entire control framework reflects the interaction between the human-robot system. The system refers to the proposed adaptive GC of the human arm control framework. The proposed adaptive gravity compensation of the human arm is represented by the joint angle estimation method and the calculation of the human required support force. First, the system may estimate or determine the human joint angles hj based on the Ps, Pe and Pw obtained from a fixed torso model or a sagittal plane model. Second, the support force Fh for the human arm/upper limb is computed, and the torque Th is provided by the robot. The reference torque Tref is fed into the motor controller of the robot system as Tref = Th + TR, where TR is the gravity torque of the robot structure.
[0074] Human joint angle estimation method
[0075] Two methods are proposed for estimating human joint angles: 1) the fixed torso model and 2) the sagittal plane model. Since the human forearm is securely fastened to the cuff using Velcro, the positions of the elbow and wrist joints may be aligned with the two endpoints of the cuff. As shown in FIG. 7, the elbow and wrist points of the subject are estimated to coincide with Rp6 and Rp7, respectively, as expressed in the following manner:
where Rpe and Rpw are the human elbow and wrist positions in robot base coordinate.
[0076] 1) Fixed torso model: In this model, human torso movement is constrained to a fixed position, allowing the shoulder point S to be assumed as stationary. After acquiring the elbow and wrist positions under the robot base frame, the Rpe and Rpw are transformed into the human shoulder base frame {Os}: Os-xsyszs, as shown in the following equation.
where STR is the transformation matrix from the robot base frame to the human shoulder base frame. The rotational angle ψ is equal to −π/2. The xSR, ySR, zSR represent the position of the origin of the robot base coordinate in the fixed human shoulder frame. These values must be entered into the PC GUI as initial parameters. The spE and spw are referred to as the elbow position and wrist position under the fixed human shoulder frame.
[0077] Thereafter, the human inverse kinematics model may be used to derive the human joint angles hj, including shoulder abduction/adduction (h1), shoulder flexion/extension (h2), shoulder internal/external rotation (h3), and elbow flexion/extension (h4). The established human inverse kinematics model is given by:
where the spE and spw are the position vectors input to the human IK model. The lF represents the length of the forearm, defined as an initial parameter subject to anthropometric data. An assumption was made that the origin of the shoulder frame under the human shoulder frame is denoted as ps, and it is a fixed position. Therefore, the length from spE to the fixed shoulder point is calculated as lu,cal rather than directly using the actual human upper arm length (lu), due to the change of the derived spE.
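A geometric IK of this kind can be sketched from the elbow and wrist positions alone. The sketch below is an illustration, not the patent's (omitted) Equation 6: the axis convention (x forward, y lateral, z vertical up, arm hanging down giving zero angles) is assumed, and shoulder internal/external rotation (h3) is omitted because it additionally depends on the forearm direction about the upper-arm axis.

```python
import numpy as np

def human_arm_ik(pE, pW):
    """Estimate shoulder abduction (h1), shoulder flexion (h2), and elbow
    flexion (h4) from elbow position pE and wrist position pW, both given
    in the shoulder frame {Os} with an assumed axis convention."""
    pE = np.asarray(pE, float)
    pW = np.asarray(pW, float)
    u = pE                       # shoulder -> elbow (upper arm vector)
    f = pW - pE                  # elbow -> wrist (forearm vector)
    lu, lf = np.linalg.norm(u), np.linalg.norm(f)
    h1 = np.arcsin(np.clip(u[1] / lu, -1.0, 1.0))   # abduction/adduction
    h2 = np.arctan2(u[0], -u[2])                     # flexion/extension
    cos_h4 = np.dot(u, f) / (lu * lf)
    h4 = np.arccos(np.clip(cos_h4, -1.0, 1.0))       # elbow flexion
    return h1, h2, h4
```

A straight, hanging arm yields all-zero angles; bending the forearm forward by 90 degrees yields h4 = pi/2.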
[0078] However, the fixed torso model includes an assumption that the shoulder position remains fixed in the course of movement. In practical scenarios, the shoulder position typically shifts in conjunction with torso movements, particularly during actions like reaching for distant positions. Addressing the above, a Sagittal plane model is proposed and described in the following section.
[0079] 2) Sagittal plane model: In the sagittal plane model, the shoulder position ps is assumed to move within the sagittal plane of the human torso. Due to the movement of the shoulder position, establishing the human base coordinate at the shoulder as a fixed reference frame becomes impractical. Therefore, it is advisable to relocate the human base frame to the center of the pelvis. This approach allows the assumption that the origin of the human pelvis base coordinate, denoted as {Op}: Op-xpypzp, remains relatively fixed, especially when the user is seated, as illustrated in FIG. 10.
[0080] Despite potential torso movements in various directions, this assumption provides a stable reference point. As mentioned above, the elbow position and wrist position (Rpe and Rpw) can be derived from the robot forward kinematics model through Equations 2 and 3. Unlike the fixed torso model, the Rpe and Rpw are transformed into the human pelvis frame, as shown in the following equation.
where PTR refers to the transformation matrix from the robot frame to the human pelvis frame. The rotational angle ψ is equal to −π/2. The xPR, yPR, zPR represent the position of the origin of the robot base coordinate in the fixed human pelvis frame. These values were entered into the PC GUI as initial parameters. The Ppe and Ppw are the elbow and wrist positions under the pelvis coordinate.
[0081] This establishes the geometric model in the sagittal plane. As shown in FIG. 10, E' is the projection of the elbow joint in the sagittal plane, and the hip joint position H is assumed to be located on the sagittal plane and the xp axis. lSH refers to the initial parameter from the hip to the shoulder. This parameter is an anthropometric value, unique to each user's torso length. lPH is another parameter referring to half of the torso width, specific to different subjects. Thus, there are two arcs formed in the sagittal plane. One is an arc with the hip H as the center of the circle and lSH as the radius. The other is an arc with the projected elbow joint E' as the center of the circle and lSE' as the radius. Therefore, the intersection point of the two arcs is the derived shoulder position under the pelvis coordinate, denoted by Pps = [Pps(x), Pps(y), Pps(z)]. The two constraint problems are written as follows:
[0082] where the hip joint position is denoted as PpH = [−lPH, 0, 0] and lu represents the length of the upper limb. Subsequently, the elbow and wrist positions under the pelvis frame must be transformed into the derived shoulder frame, which can be denoted as:
where sTp refers to the homogeneous transformation matrix from the fixed pelvis frame to the derived shoulder frame. Finally, the spE and spw are input to the human arm inverse kinematics model, as shown by Equation 6.
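The arc-intersection step of the sagittal plane model reduces to a circle-circle intersection within the plane. A 2D sketch follows; working in in-plane coordinates and picking the upper of the two intersection points (shoulder above the hip) are assumptions of this illustration.

```python
import numpy as np

def shoulder_from_arcs(H, Ep, r_sh, r_se):
    """Intersect two circles in the sagittal plane: one centered at the hip
    H with radius r_sh (hip-to-shoulder length), one centered at the
    projected elbow Ep with radius r_se (projected-elbow-to-shoulder
    length). Returns the intersection with the larger second coordinate."""
    H, Ep = np.asarray(H, float), np.asarray(Ep, float)
    d = np.linalg.norm(Ep - H)
    if d > r_sh + r_se or d < abs(r_sh - r_se):
        raise ValueError("arcs do not intersect")
    a = (r_sh**2 - r_se**2 + d**2) / (2 * d)   # distance from H along H -> Ep
    h = np.sqrt(max(r_sh**2 - a**2, 0.0))      # half-chord offset
    e = (Ep - H) / d                            # unit vector H -> Ep
    n = np.array([-e[1], e[0]])                 # in-plane normal
    p1, p2 = H + a * e + h * n, H + a * e - h * n
    return p1 if p1[1] >= p2[1] else p2         # choose the upper solution
```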
[0083] Arm gravity compensation strategy
[0084] After estimating the joint angles of the human arm, the required support force can be computed using the human arm dynamics model. As shown in FIG. 7, the human arm is modelled as a link mechanism with four degrees of freedom. The centers of mass of these two links are shown as mU and mF.
[0085] The human arm dynamics model can be written as:
where hj, h'j, h''j are the human arm joint angles and their derivatives, and τ is the joint torque generated by the human arm. n = 4 denotes the four DOFs of the human arm model, including three DOFs in the shoulder joint and one DOF in the elbow joint. The robot applies a support force Fh to the human arm, forming an external torque that primarily compensates for the arm's gravity term Gh, as follows:
[0086] The required force of the end-effector to support the human arm's weight can then be calculated using the human arm model as follows:
where hj is the estimated human joint angles. The JTh†(hj) is the pseudo-inverse of JTh(hj) ∈ R3×4, and Gh is the gravity term of the human arm.
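The pseudo-inverse mapping above can be sketched in one line with numpy. The shapes follow the text (Jh mapping four arm joint rates to a 3D cuff velocity, Gh a 4-vector of gravity torques); the matrix used in the usage is arbitrary, not a real arm Jacobian.

```python
import numpy as np

def support_force(J_h, G_h):
    """Least-squares support force F satisfying J_h^T @ F ≈ G_h, where
    J_h (3x4) maps the four human arm joint rates to the 3D cuff velocity
    and G_h (length 4) is the arm gravity torque vector."""
    return np.linalg.pinv(J_h.T) @ G_h
```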
[0088] The lu and lF represent the lengths of the upper limb and the forearm, respectively. In addition, the human arm gravity vector Gh(hj) refers to:
[0089] The COMF and COMU refer to the ratios of the center-of-mass points of the forearm and the upper limb, respectively. Moreover, mU and mF are the masses of the upper limb and the forearm.
[0090] The calculated force Fh varies in both magnitude and direction across the workspace, depending on the human arm joint angles hj. Subsequently, the calculated force Fh is mapped to the compensated torque for human arm Th in robot joint space as:
[0091] The resultant reference torque τref to be provided by the active motors is as follows:
where TR was derived from the robot dynamics Equation 1.
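Composing the reference torque can be sketched as the human-support force mapped through the robot Jacobian transpose, plus the robot's own gravity compensation term; the function name and the identity Jacobian in the usage are illustrative.

```python
import numpy as np

def reference_torque(J_R, F_h, tau_R):
    """Motor reference torque: the human-arm support force F_h mapped into
    robot joint space through the Jacobian transpose (tau_h = J_R^T @ F_h),
    plus the robot's own gravity torque tau_R."""
    tau_h = J_R.T @ F_h
    return tau_h + tau_R
```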
[0092] Experiments
[0093] To evaluate the proposed adaptive gravity compensation framework on the ARAE robot, the proposed human joint angle estimation methods were first evaluated, followed by an evaluation of the effects of the adaptive arm support force using surface electromyography (sEMG).
[0094] A. Subject Information and Initial Calibration
[0095] The experimental protocol was approved by the Institutional Review Board of Nanyang Technological University (IRB-2022-821). After reviewing the informed consent form, four right-handed healthy subjects (4 males, 29 ± 2 years old) were involved in the experiments. The mean mass of the participants is 75.05 ± 2.5 kg and the mean height is 178 ± 4.23 cm. The mean upper limb length (lu) is 29.91 ± 0.25 cm and the mean forearm length (lF) is 26.43 ± 0.66 cm. Moreover, the mean trunk length (lSH) is 38.50 ± 1.04 cm and the mean trunk width (lPH) is 17.93 ± 0.64 cm.
[0096] Before the official experiments began, a calibration trial was performed to measure the initial parameters by the Mocap system, including the translation distances from the shoulder joint to the robot base (xSR, ySR, zSR) and from the COM of the torso to the robot base (xPR, yPR, zPR), respectively. Additionally, kinematic parameters of the human body, such as arm length, trunk length, and trunk width in the upright position, were measured. The subjects were instructed to wear the robot and then sit at the table, maintaining an upright and stable torso
posture, as depicted in FIG. 11. During this setup, both the ARAE and the Mocap system recorded data for 10 seconds. The initial shoulder joint position was recorded under the mocap- based coordinate.
[0097] B. Experiment Protocol for Evaluating Angle Estimation Method
[0098] The first experiment (Exp1) was conducted to evaluate the human joint angle estimation methods against the Mocap system. The subject wears the ARAE system and sits in the capture volume of the Mocap system. Six pre-defined positions are labelled on the table, as shown in FIG. 12. The subject sits in front of the table and attaches the forearm to the ARAE, with the six labelled positions and one original/starting point located on the experimental table. The starting position is marked by the circular label that is closest to the human body. After each set of motions, the original position is used for resting the arm. Label 3 is the farthest position from the body, requiring the subject to perform trunk compensation movements in the sagittal plane. The rest of the labels are located in a square. All subjects performed five trials for the six labelled positions. Each trial involved continuous movements, specifically reaching and drinking activities performed with a real cylinder, effectively simulating ADL (Activities of Daily Living) task training. The detailed procedures are illustrated as follows:
[0099] 1) Firstly, the subject’s arm starts by moving from the original position to the instructed label.
[00100] 2) After touching the labelled position, the subject lifts the cylinder and brings it to their mouth.
[00101] 3) Finally, the subject brings their hand back to the labeled position and then returns to the original position.
[00102] When the subjects move towards distal positions (Label 3), the subjects need to move the torso in the sagittal plane, which can mimic the trunk compensation movement.
[00103] C. Experimental Protocol for Evaluating Effects on Human Arm
[00104] The second experiment (Exp2) was conducted to assess the impact of the adaptive arm support control framework on the human arm by measuring muscle activity using sEMG. To independently verify the effect of the proposed adaptive control method on the human arm, the subjects did not handle a physical object in this experiment. Exp2 was divided into three experimental sessions based on the different auxiliary control modes (No Robot and With Robot). In Exp2-1 (No Robot mode), the subject performed the tasks without wearing the robot. Meanwhile, Exp2-2 and Exp2-3 employed the adaptive arm gravity compensation framework as the control mode. Specifically, the angle estimation method for Exp2-2 is based on the fixed torso model, whereas Exp2-3 utilizes the sagittal plane model. Each experimental session included three tasks, as follows:
[00105] 1) Forward Reaching (FR): move the hand from the original position to the label 3 position.
[00106] 2) Lateral Reaching (LR): move the hand from the original position to the label 2, then the label 5 position.
[00107] 3) Hand to Mouth (H2M): move the hand from the original position to the mouth.
[00108] D. Evaluation Methods
[00109] 1) Kinematic Data Preprocessing: To verify the proposed human joint angle estimation methods, the kinematic joint angles of the human arm were collected using the Qualisys Miqus M3 motion capture (Mocap) system as the ground truth. The Mocap system includes 18 Qualisys A12MP cameras and the Qualisys Track Manager, an interface for managing the capture sessions and exporting the data at 200 Hz. Retroreflective markers were placed on each healthy subject's body, including the thoracic spinous processes, the right and left anterior superior iliac spines, the sternum, and upper arm and forearm clusters, as shown in FIG. 13. Since each subject wore a sleeveless T-shirt, circular magnets were used to ensure that the positions of the torso markers covered by clothing did not shift. Synchronization between the Mocap system and the ARAE robot is handled by a DAQ board.
[00110] The markers' locations recorded by the Mocap system were sampled at 200 Hz. Then, Visual3D, a professional biomechanics analysis software package, was used to convert the marker locations into human joint angles and joint positions in the Mocap world frame. Furthermore, the PC logged the corresponding motor and encoder data of the ARAE robot at 100 Hz. After the experiments, all collected data were processed offline and analyzed in MATLAB R2022a. The kinematic data from Visual3D were downsampled to 100 Hz to synchronize with the measured robot data.
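The 200 Hz to 100 Hz downsampling described above can be sketched in Python; the patent does not state which resampling method was used, so `scipy.signal.decimate` (which applies an anti-aliasing filter before discarding samples) is one reasonable assumption, and the signal here is a synthetic placeholder trace.

```python
import numpy as np
from scipy import signal

fs_mocap, fs_robot = 200, 100  # Hz, as reported for the Mocap and robot logs

# Placeholder 10 s joint-angle trace standing in for a Visual3D export.
t = np.arange(0, 10, 1 / fs_mocap)
mocap_angle = np.sin(2 * np.pi * 0.5 * t)

# Decimate by the integer factor 2 with an anti-aliasing filter so the
# Mocap stream aligns sample-for-sample with the 100 Hz robot log.
mocap_100hz = signal.decimate(mocap_angle, fs_mocap // fs_robot)
```

With an integer rate ratio this keeps every sample of the robot log paired with exactly one Mocap sample, which simplifies the later per-frame MAE comparison.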
[00111] 2) sEMG Data Preprocessing: To evaluate the adaptive arm gravity compensation framework of the ARAE on the human body, muscle activities were recorded by a wireless sEMG system (Cometa Picolite, Italy) at 2000 Hz. As shown in FIG. 13, the EMG electrodes were placed on four upper limb muscles, namely the Pectoralis Major (PM), Deltoid Medial (DM), Biceps Brachii (BB), and Triceps Brachii (TB), following the SENIAM guidelines. At the beginning of each session, each subject performed a Maximum Voluntary Contraction (MVC), which was later used to normalize the EMG signals. The recorded data were processed offline in four stages: two stages of notch filtering (an IIR notch filter at 50 Hz to eliminate powerline interference and another at 1.67 Hz to remove heartbeat noise); high-pass filtering (a 10th-order Butterworth filter with a 20 Hz cutoff frequency); rectification by computing the absolute value; and smoothing with a low-pass filter (a 10th-order Butterworth filter with a 4 Hz cutoff frequency). All data from the sEMG channels were synchronized with the Mocap and ARAE systems through the DAQ board.
[00112] 3) Metrics for Angle Estimation Methods: The mean absolute error (MAE) was used to evaluate the performance of the proposed angle estimation methods, denoted as:
MAE = (1/n) Σ_{j=1}^{n} |θ_j − θ̂_j|
where θ_j is the ground truth human joint angle captured by the Mocap system, θ̂_j is the derived human joint angle using the proposed models, and n represents the total number of data frames in each trial.
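The per-trial MAE metric described above is straightforward to compute; this minimal sketch assumes angles supplied in degrees as frame-aligned arrays.

```python
import numpy as np

def joint_angle_mae(theta_mocap, theta_model):
    """Mean absolute error (in degrees) between Mocap ground-truth angles
    and model-estimated angles over the n frames of one trial."""
    theta_mocap = np.asarray(theta_mocap, dtype=float)
    theta_model = np.asarray(theta_model, dtype=float)
    return float(np.mean(np.abs(theta_mocap - theta_model)))

# e.g. joint_angle_mae([10.0, 20.0, 30.0], [12.0, 19.0, 27.0]) → 2.0
```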
[00113] 4) Metric for Adaptive Support Force: The EMG data were filtered, and then the mean averaged value (MAV) was calculated for each EMG channel with reference to each task of Experiment 2. The percent change of the MAV (ΔMAV%) was used to measure the decrease in muscle activities from the control mode of Exp2-1 to Exp2-2 and Exp2-3, respectively. The ΔMAV% can be expressed as:
ΔMAV% = (MAV_{Exp2-i} − MAV_{Exp2-1}) / MAV_{Exp2-1} × 100%, i ∈ {2, 3} (53)
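The ΔMAV% metric of Eq. (53) can be computed per channel and task as below; the sign convention (negative values indicating a reduction in muscle activity relative to the No Robot session) is an assumption consistent with the reported decreases.

```python
import numpy as np

def delta_mav_percent(env_exp2_1, env_exp2_i):
    """Percent change in the mean averaged value (MAV) of a filtered EMG
    envelope from the No Robot session (Exp2-1) to a With Robot session
    (Exp2-i, i in {2, 3}). Negative values indicate reduced activity."""
    mav_ref = np.mean(np.abs(env_exp2_1))
    mav_i = np.mean(np.abs(env_exp2_i))
    return float((mav_i - mav_ref) / mav_ref * 100.0)
```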
[00114] Results
[00115] A. Verification of Joint Angle Estimation
[00116] To verify the two proposed human joint angle estimation methods, the robot joint angles were input into the derived model, which output the estimated human joint angles, including shoulder abduction/adduction (SA), shoulder flexion/extension (SF), shoulder internal/external rotation (SR), and elbow flexion/extension (EF).
[00117] 1) Overall analysis: In the overall analysis, the joint angles were not separated by type. As shown in Table 3, the mean absolute error (MAE) was calculated for all trials, and this value averages the MAE of the four types of joint angles. The sagittal plane model performs slightly better than the fixed torso model, achieving an MAE of 5.37°. However, there was no significant difference in the accuracy of the mean joint angle MAE (p = 0.54, Wilcoxon signed-rank test).
[00118] 2) Local analysis: the impact of the distance to the target location on the joint angle estimation performance of the two models was analyzed. The experimental data from all subjects (5 trials × 6 labels × 4 subjects = 120 motions) were classified into 7 groups. The criterion for group assignment was the real-time elbow-to-initial-shoulder-joint distance as a percentage of the actual upper arm length lu (a fixed value for each subject). Percentages from 100% to 150% indicated that the subject's torso moved forward in the sagittal plane, towards the experimental table. The 80% to 100% group indicated that the subject's torso moved backward in the sagittal plane, away from the table, during the experiments. As shown in FIG. 14, as the percentage increases beyond 100%, the joint angle estimation MAE of the fixed torso model increases dramatically from 4.86° to 16.17°, whereas the MAE of the sagittal plane model rises slowly from 5.08° to 11.76°. A paired t-test was applied for statistical analysis. There were significant differences in the estimation performance of the two models for the last three groups (p = 0.04, 0.007, and 0.028). As described for Exp1, the label 3 position was designed for evaluating the sagittal plane model. FIG. 15 shows the angle estimation results for label 3 of subject 4, as well as a comparison of the changes between the estimated angles of the two models and the Mocap-measured angles (taken as ground truth).
[00119] Results obtained from the label 3 position of Subject 4 demonstrate the change of the four joint angles over 30 seconds. In FIG. 15, subfigure (a) shows shoulder abduction/adduction, subfigure (b) shoulder flexion/extension, subfigure (c) shoulder internal/external rotation, subfigure (d) elbow flexion, and subfigures (e) and (f) the derived shoulder position in the y-axis, corresponding to the data at the circled positions in (a) and (b). As may be observed in subfigures (e) and (f), the sagittal plane model derives a more accurate shoulder position ps, which in turn results in relatively precise estimated joint angles when the torso undergoes significant movements, especially in the case of an abrupt change in the slope of the angle.
[00120] To validate the estimation performance of both models across various types of joint angles, four distinct categories of joint angles were examined with reference to the MAE and standard deviation (FIG. 16). Both models exhibit notably better estimation performance for shoulder flexion/extension compared to the other motion patterns, at 3.32° ± 1.75° and 2.77° ± 1.25°, respectively. However, there are no significant differences between the two proposed models for each motion type.
[00121] B. Effects of Support Force on Human Arm
[00122] In Exp2, the effects of the calculated support force were evaluated by analyzing muscle activities among the three experimental sessions. As shown in FIG. 17A, a representative subject's raw and filtered BB sEMG signals were recorded across the different experimental modes. The filtered envelope of the raw signals, shown as an orange solid curve, corresponds to the 'No Robot' condition, with the raw signals depicted by the blue line. The green and red solid lines represent Exp2-2 and Exp2-3, respectively, each conducted under one of the two proposed assistive control strategies. When participants wore the robot and used the proposed assistive control strategy, muscle activity was significantly reduced relative to not wearing the robot. The net change in EMG activity of the 4 muscles per task can be seen in FIG. 17B and FIG. 17C, where each bar represents a specific muscle. The transition from No Robot to With Robot mode, under the fixed torso model (Exp2-2) and the sagittal plane model (Exp2-3), resulted in average muscular activation reductions as follows: 11.43±8.72% and 11.93±10.53% for the PM; 31.54±14.12% and 27.17±27.10% for the DM; 57.09±7.85% and 60.18±6.55% for the BB; and 7.15±3.33% and 5.50±2.04% for the TB.
[00123] The proposed ARAE robot is compact, portable, and easy to set up, making it suitable for home-based therapy. Moreover, the proposed adaptive arm support control framework can provide the support force across different human arm poses, offering simple implementation and adaptability to diverse users.
[00124] Evaluation of estimated human joint angles
[00125] Comparing the two proposed angle estimation methods (Exp1), the sagittal plane model is more suitable for estimating joint angles during torso movements. When the shoulder undergoes significant movements in the sagittal plane (FIG. 14), the accuracy of both models decreases as the torso moves forward. However, the sagittal plane model significantly improves the angle estimation accuracy compared to the fixed torso model. This enhancement makes the entire framework more universally applicable, particularly during torso movements. Most importantly, numerous individuals who have suffered a stroke display excessive use of compensatory trunk motions while reaching for and placing objects, which adversely affects recovery. Therefore, the proposed method can provide accurate joint angle estimation, subsequently enabling the generation of sufficiently precise gravity compensation; helping patients avoid trunk compensatory or torso movements in this way can improve upper extremity recovery in stroke patients.
[00126] Another hypothesis posits a correlation between the estimation performance of the proposed models and the specific type of motion pattern. As the analysis of the results points out, both models show a significantly lower MAE for the SF angle. This is because the SF angle correlates only with the elbow position in the z-direction and the calibrated upper arm length lu. This demonstrates the capability to predict the elbow joint position with relative accuracy solely based on robot joint position information. Moreover, the estimations of the two proposed models are not significantly different across the joint angles. This may be attributed to the fact that the sagittal plane model is primarily sensitive to large torso movements. However, significant torso movement occurs only when the subject reaches forward to label 3, and this motion constitutes a relatively small percentage of the entire dataset. Consequently, there is no significant difference in the performance of the two models in estimating joint angles in the overall analysis. Moreover, the proposed models provide the capability to monitor the patient's arm posture during ADL, which is a key feature for clinical assessment of the upper limb. There have been attempts to use external sensors to measure the upper limb posture when using 3D end-effector type rehabilitation robots, such as magnetic sensors and IMUs, or external RGB-D cameras [31]. However, the proposed system and method do not rely on external sensors, which reduces system complexity and enhances the robotic system's usability.
[00127] One of the observed benefits of the ARAE is the reduction of muscular activity in healthy subjects during simulated ADLs (Exp2). As shown in FIGs. 17A to 17C, the BB activity is reduced significantly more than that of the other muscles. In the Forward Reaching (FR) task, a notable decrease in muscle activity was observed: for the Biceps Brachii (BB), the reduction ranged from -52.64% to -63.75%, and for the Deltoid Medial (DM), from -19.78% to -24.78% when the control mode utilizing the sagittal plane model (Exp2-3) was implemented.
[00128] The main reason for this effect may be that the subjects performed trunk compensatory maneuvers during the FR task, because position 3 is the farthest position. The sagittal plane model is therefore better able to derive the changing shoulder joint position, obtaining a more accurate joint angle prediction and, in turn, providing more precise arm support during FR tasks. However, inspection of FIG. 17C shows that the DM activity increases for the LR task under the sagittal plane mode (Exp2-3). This result suggests that the sagittal plane model possesses relatively weak generalization ability when the torso moves in the lateral plane. Therefore, the force calculated by the proposed control framework might produce extra assistive force in some postures that restricts arm movement during the LR task.
[00129] Two methods were proposed for estimating human joint angles: i) the fixed torso model and ii) the sagittal plane model. The methods were evaluated through experiments involving reaching, placing, and drinking motions with four healthy subjects. To assess the accuracy of these angle estimation models, a comparative analysis was performed between the joint angles projected by the models and the actual joint angles measured by a Motion Capture system (Mocap). Subsequently, the derived joint angles were used in the human dynamics model to calculate the arm support force. To validate the effectiveness of this system in reducing muscle energy in the assistive mode, electromyography (EMG) activities were measured on healthy subjects with and without the robot.
[00130] According to another aspect of the invention, an upper limb assistive method 700 is disclosed according to various embodiments. The method 700 comprises: in 710, receiving a first support position and a first support orientation of a support member in an actuator frame, the support member being coupled to an upper limb of a subject; in 720, determining a shoulder position of the subject based on at least one anthropometric measurement of the subject; in 730, based on the shoulder position of the subject, transforming the first support position and the first support orientation in the actuator frame to a corresponding second support position and a corresponding second support orientation in a subject shoulder frame, wherein the subject shoulder frame corresponds to the shoulder position of the subject; in 740, estimating a plurality of upper limb joint angles corresponding to a plurality of upper limb parts, based on the corresponding second support position and the corresponding second support orientation using a human inverse kinematics model corresponding to the subject; and in 750, determining a gravity-compensated force on the support member based on the plurality of upper limb joint angles.
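The flow of steps 730 to 750 of method 700 can be sketched as follows. This is a minimal illustration, not the ARAE implementation: `inverse_kinematics` and `gravity_torque_to_force` are hypothetical placeholders standing in for the human inverse kinematics model and the human dynamics model, and `T_shoulder_actuator` is an assumed homogeneous transform representing the actuator-to-shoulder frame change of step 730.

```python
import numpy as np

def gravity_compensated_force(p_act, R_act, T_shoulder_actuator,
                              inverse_kinematics, gravity_torque_to_force):
    """Sketch of steps 730-750: transform the support pose, estimate the
    upper limb joint angles, and derive the gravity-compensated force.
    All callables are placeholder stand-ins for the disclosed models."""
    # 730: express the first support position/orientation in the
    # subject shoulder frame via a 4x4 homogeneous transform.
    p_sh = T_shoulder_actuator[:3, :3] @ p_act + T_shoulder_actuator[:3, 3]
    R_sh = T_shoulder_actuator[:3, :3] @ R_act
    # 740: estimate the plurality of upper limb joint angles with the
    # human inverse kinematics model.
    q_arm = inverse_kinematics(p_sh, R_sh)
    # 750: map the arm's gravity loading at these joint angles to a
    # support force applied at the support member.
    return gravity_torque_to_force(q_arm)
```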
[00131] In various embodiments, the method 700 further comprises: transforming the first support position and the first support orientation in the actuator frame to a corresponding intermediate support position and a corresponding intermediate support orientation in a subject pelvis frame; and based on the at least one anthropometric measurement and a sagittal plane in the subject pelvis frame, transforming the intermediate support position and the intermediate support orientation in the subject pelvis frame to the corresponding second support position and the corresponding second support orientation in the subject shoulder frame.
[00132] In various embodiments, the method 700 further comprises: forming a first arc in the sagittal plane with a first center and a first anthropometric measurement as a first radius; and forming a second arc in the sagittal plane with a second center and a second anthropometric measurement as a second radius.
[00133] In various embodiments, the method 700 further comprises: determining the shoulder position based on an intersection of the first arc and the second arc in the sagittal plane.
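The arc-intersection construction of paragraphs [00132] and [00133] reduces to a 2D circle-circle intersection in the sagittal plane: one circle centred at the hip with the hip-to-shoulder distance as radius, the other centred at the projected elbow with the upper arm length as radius. The sketch below assumes 2D sagittal-plane coordinates with index 1 as the vertical axis and picks the upper intersection as the anatomically plausible shoulder; variable names are illustrative, not taken from the disclosure.

```python
import numpy as np

def shoulder_from_arcs(c1, r1, c2, r2):
    """Estimate the shoulder position as the intersection of two arcs in
    the sagittal plane: (c1, r1) = hip centre and trunk length,
    (c2, r2) = projected elbow centre and upper arm length."""
    c1, c2 = np.asarray(c1, float), np.asarray(c2, float)
    d = np.linalg.norm(c2 - c1)
    if d > r1 + r2 or d < abs(r1 - r2) or d == 0:
        raise ValueError("arcs do not intersect")
    # Distance from c1 to the chord midpoint, and half-chord length.
    a = (r1**2 - r2**2 + d**2) / (2 * d)
    h = np.sqrt(r1**2 - a**2)
    mid = c1 + a * (c2 - c1) / d
    # Unit vector perpendicular to the centre line, within the plane.
    perp = np.array([-(c2 - c1)[1], (c2 - c1)[0]]) / d
    p_up, p_down = mid + h * perp, mid - h * perp
    # Choose the higher solution (index 1 assumed vertical) as the shoulder.
    return p_up if p_up[1] >= p_down[1] else p_down
```

When the arcs are tangent (d equals r1 + r2 or |r1 − r2|) the two solutions coincide, which corresponds to a fully extended or folded configuration.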
[00134] In accordance with embodiments of the present disclosure, FIG. 26 illustrates a block diagram representative of components of a processing system 1800 that may be provided within controller 120 to carry out the digital signal processing functions or computations in accordance with embodiments of the disclosure, or within any other modules or sub-modules of the system. One skilled in the art will recognize that the exact configuration of the processing system provided within each of these modules may differ, the exact configuration of processing system 1800 may vary, and the arrangement illustrated in FIG. 26 is provided by way of example only.
[00135] In embodiments of the disclosure, processing system 1800 may comprise controller 1801 and user interface 1802. User interface 1802 is arranged to enable manual interactions between a user and the computing module as required, and for this purpose includes the input/output components required for the user to enter instructions and provide updates to each of these modules. A person skilled in the art will recognize that the components of user interface 1802 may vary from embodiment to embodiment but will typically include one or more of display 1840, keyboard 1835, and optical device 1836.
[00136] Controller 1801 is in data communication with user interface 1802 via bus 1815 and includes memory 1820; a processing unit, processing element or processor 1805 mounted on a circuit board that processes instructions and data for performing the method of this embodiment; an operating system 1806; an input/output (I/O) interface 1830 for communicating with user interface 1802; and a communications interface, in this embodiment in the form of a network card 1850. Network card 1850 may, for example, be utilized to send data from these modules via a wired or wireless network to other processing devices or to receive data via the wired or wireless network. Wireless networks that may be utilized by network card 1850 include, but are not limited to, Wireless-Fidelity (Wi-Fi), Bluetooth, Near Field Communication (NFC), cellular networks, satellite networks, telecommunication networks, Wide Area Networks (WANs), and so on.
[00137] Memory 1820 and operating system 1806 are in data communication with processor 1805 via bus 1810. The memory components include both volatile and non-volatile memory, and more than one of each type of memory, including Random Access Memory (RAM) 1823, Read Only Memory (ROM) 1825 and a mass storage device 1845, the last comprising one or more solid-state drives (SSDs). One skilled in the art will recognize that the memory components described above comprise non-transitory computer-readable media and shall be taken to comprise all computer-readable media except for a transitory, propagating signal. Typically, the instructions are stored as program code in the memory components but can also be hardwired. Memory 1820 may include a kernel and/or programming modules such as a software application that may be stored in either volatile or non-volatile memory.
[00138] Herein the term "processor" is used to refer generically to any device or component that can process such instructions and may include: a microprocessor, a processing unit, a plurality of processing elements, a microcontroller, a programmable logic device or any other type of computational device. That is, processor 1805 may be provided by any suitable logic circuitry for receiving inputs, processing them in accordance with instructions stored in memory and generating outputs (for example to the memory components or on display 1840). In this embodiment, processor 1805 may be a single-core or multi-core processor with memory addressable space. In one example, processor 1805 may be multi-core, comprising, for example, an 8-core CPU. In another example, it could be a cluster of CPU cores operating in parallel to accelerate computations.
[00139] All examples described herein, whether of methods, materials, or products, are presented for the purpose of illustration and to aid understanding and are not intended to be limiting or exhaustive. Modifications may be made by one of ordinary skill in the art without departing from the scope of the invention as claimed.
Claims
1. An upper limb assistive system, comprising: a support member for supporting an upper limb of a subject; an actuator coupled to the support member to move the support member; and a computing module in signal communication with the actuator, the computing module comprising a processing unit; and a non-transitory media storing instructions readable by the processing unit that, when executed by the processing unit, cause the processing unit to: receive a first support position and a first support orientation of the support member in an actuator frame, the support member being coupled to the upper limb of the subject; determine a shoulder position of the subject based on at least one anthropometric measurement of the subject; based on the shoulder position of the subject, transform the first support position and the first support orientation in the actuator frame to a corresponding second support position and a corresponding second support orientation in a subject shoulder frame, wherein the subject shoulder frame corresponds to the shoulder position of the subject; estimate a plurality of upper limb joint angles corresponding to a plurality of upper limb parts, based on the corresponding second support position and the corresponding second support orientation using a human inverse kinematics model corresponding to the subject; and determine a gravity-compensated force on the support member based on the plurality of upper limb joint angles.
2. The upper limb assistive system as recited in claim 1, wherein the instructions further cause the processing unit to: transform the first support position and the first support orientation in the actuator frame to a corresponding intermediate support position and a corresponding intermediate support orientation in a subject pelvis frame; and based on the at least one anthropometric measurement and a sagittal plane in the subject pelvis frame, transform the intermediate support position and the intermediate support orientation in the subject pelvis frame to the corresponding second support position and the corresponding second support orientation in the subject shoulder frame.
3. The upper limb assistive system as recited in claim 2, wherein the instructions further cause the processing unit to: form a first arc in the sagittal plane with a first center and a first anthropometric measurement as a first radius; and form a second arc in the sagittal plane with a second center and a second anthropometric measurement as a second radius.
4. The upper limb assistive system as recited in claim 3, wherein the instructions further cause the processing unit to: determine the shoulder position based on an intersection of the first arc and the second arc in the sagittal plane.
5. The upper limb assistive system as recited in any one of claims 3 to 4, wherein the first center is at a hip joint position of the subject, and wherein the first anthropometric measurement is a distance between the hip joint position and a shoulder of the subject.
6. The upper limb assistive system as recited in claim 5, wherein the second center is a projected elbow joint in the sagittal plane, and wherein the second anthropometric measurement is a distance between the shoulder and the second center of the subject.
7. The upper limb assistive system as recited in claim 1, wherein the subject shoulder frame is fixed relative to the actuator frame.
8. The upper limb assistive system as recited in any one of claims 1 to 7, wherein the plurality of upper limb joint angles comprises at least one shoulder angle and an elbow angle.
9. The upper limb assistive system as recited in any one of claims 1 to 8, wherein an elbow position and a wrist position of the subject correspond respectively to the first support position and the first support orientation of the support member.
10. The upper limb assistive system as recited in any one of claims 1 to 9, wherein the first support position and the first support orientation correspond to a forearm position and a forearm orientation of the subject.
11. The upper limb assistive system as recited in any one of claims 1 to 10, wherein the actuator further comprises a plurality of linkage members operably coupled via a plurality of joints.
12. The upper limb assistive system as recited in claim 10, wherein the instructions further cause the processing unit to: determine the first support position and the first support orientation of the support member based on a respective plurality of joint angles of the plurality of joints.
13. The upper limb assistive system as recited in any one of claims 10 to 12, wherein at least one of the plurality of joints is an active joint.
14. The upper limb assistive system as recited in claim 13, wherein at least one of the plurality of joints is a passive joint.
15. The upper limb assistive system as recited in any one of claims 13 to 14, wherein each active joint is actuated by a quasi-direct drive motor.
16. The upper limb assistive system as recited in any one of claims 1 to 15, wherein the instructions further cause the processing unit to: actuate the actuator to provide the gravity-compensated force to the support member.
17. An upper limb assistive method, comprising:
receiving a first support position and a first support orientation of a support member in an actuator frame, the support member being coupled to an upper limb of a subject; determining a shoulder position of the subject based on at least one anthropometric measurement of the subject; based on the shoulder position of the subject, transforming the first support position and the first support orientation in the actuator frame to a corresponding second support position and a corresponding second support orientation in a subject shoulder frame, wherein the subject shoulder frame corresponds to the shoulder position of the subject; estimating a plurality of upper limb joint angles corresponding to a plurality of upper limb parts, based on the corresponding second support position and the corresponding second support orientation using a human inverse kinematics model corresponding to the subject; and determining a gravity-compensated force on the support member based on the plurality of upper limb joint angles.
18. The upper limb assistive method as recited in claim 17, further comprising: transforming the first support position and the first support orientation in the actuator frame to a corresponding intermediate support position and a corresponding intermediate support orientation in a subject pelvis frame; and based on the at least one anthropometric measurement and a sagittal plane in the subject pelvis frame, transforming the intermediate support position and the intermediate support orientation in the subject pelvis frame to the
corresponding second support position and the corresponding second support orientation in the subject shoulder frame.
19. The upper limb assistive method as recited in claim 18, further comprising: forming a first arc in the sagittal plane with a first center and a first anthropometric measurement as a first radius; and forming a second arc in the sagittal plane with a second center and a second anthropometric measurement as a second radius.
20. The upper limb assistive method as recited in claim 19, further comprising: determining the shoulder position based on an intersection of the first arc and the second arc in the sagittal plane.
21. The upper limb assistive method as recited in any one of claims 19 to 20, wherein the first center is at a hip joint position of the subject, and wherein the first anthropometric measurement is a distance between the first center and a shoulder of the subject.
22. The upper limb assistive method as recited in claim 21, wherein the second center is a projected elbow joint in the sagittal plane, and wherein the second anthropometric measurement is a distance between the second center and the shoulder of the subject.
23. The upper limb assistive method as recited in claim 18, wherein the subject shoulder frame is fixed relative to the actuator frame.
24. The upper limb assistive method as recited in any one of claims 18 to 23, wherein the plurality of upper limb joint angles comprises at least one shoulder angle and an elbow angle.
25. The upper limb assistive method as recited in any one of claims 18 to 24, wherein an elbow position and a wrist position of the subject correspond respectively to the first support position and the first support orientation of the support member.
26. The upper limb assistive method as recited in any one of claims 18 to 25, wherein the first support position and the first support orientation correspond to a forearm position and a forearm orientation of the subject.
27. The upper limb assistive method as recited in any one of claims 18 to 26, further comprising actuating an actuator to provide the gravity-compensated force to the support member.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| SG10202302472T | 2023-09-04 | ||
| SG10202302472T | 2023-09-04 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025053788A1 true WO2025053788A1 (en) | 2025-03-13 |
Family
ID=94924569
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/SG2024/050562 Pending WO2025053788A1 (en) | 2023-09-04 | 2024-09-03 | Upper limb assistive system and method |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025053788A1 (en) |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105437209A (en) * | 2015-12-08 | 2016-03-30 | 东北大学 | Exoskeleton type upper limb rehabilitation robot with man-machine interaction kinetic model |
| US20190083350A1 (en) * | 2016-03-14 | 2019-03-21 | Helmut-Schmidt-Universität / Universität Der Bundeswehr Hamburg | Exoskeleton for a Human Being |
| US20200179212A1 (en) * | 2013-09-27 | 2020-06-11 | Barrett Technology, Llc | Multi-active-axis, non-exoskeletal robotic rehabilitation device |
| CN113171271A (en) * | 2021-04-30 | 2021-07-27 | Huazhong University of Science and Technology | Gravity compensation method for upper limb rehabilitation robot |
| CN113197752A (en) * | 2021-04-30 | 2021-08-03 | Huazhong University of Science and Technology | Limb gravity dynamic compensation method of upper limb rehabilitation robot |
| US20210402247A1 (en) * | 2017-05-26 | 2021-12-30 | University Of Melbourne | Electromechanical robotic manipulandum device |
| US20220228710A1 (en) * | 2019-05-02 | 2022-07-21 | Virginia Tech Intellectual Properties, Inc. | Gravity compensation mechanisms and methods |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| Mao et al. | Human movement training with a cable driven arm exoskeleton (CAREX) | |
| WO2018196227A1 (en) | Evaluation method, device, and system for human motor capacity | |
| US20190201273A1 (en) | Robotic upper limb rehabilitation device | |
| He et al. | Development of a novel autonomous lower extremity exoskeleton robot for walking assistance | |
| TWI549655B (en) | Joint range of motion measuring apparatus and measuring method thereof | |
| WO2018093448A2 (en) | Robotic upper limb rehabilitation device | |
| CN109910024B (en) | Human body posture recognition system for back-holding type transfer nursing robot | |
| Kim et al. | Predicting redundancy of a 7 dof upper limb exoskeleton toward improved transparency between human and robot | |
| Ji et al. | SIAT‐WEXv2: A Wearable Exoskeleton for Reducing Lumbar Load during Lifting Tasks | |
| WO2015190599A1 (en) | Worn movement assistance device | |
| CN106618958A (en) | Somatic sensory controlled upper limb exoskeleton mirrored rehabilitation robot | |
| CN115416003A (en) | An on-demand auxiliary control method for an elderly-oriented lower limb exoskeleton | |
| Saccares et al. | A novel human effort estimation method for knee assistive exoskeletons | |
| Guo et al. | Kinematic analysis of a novel exoskeleton finger rehabilitation robot for stroke patients | |
| CN109887570B (en) | Robot-assisted rehabilitation training method based on RGB-D camera and IMU sensor | |
| Rinaldi et al. | Flexos: A portable, sea-based shoulder exoskeleton with hyper-redundant kinematics for weight lifting assistance | |
| Yi et al. | Enable fully customized assistance: A novel IMU-based motor intent decoding scheme | |
| CN117045239A (en) | Rehabilitation robot patient gravity center identification method based on multi-sensor information fusion | |
| Yang et al. | Adaptive gravity compensation framework based on human upper limb model for assistive robotic arm extender | |
| WO2025053788A1 (en) | Upper limb assistive system and method | |
| Zhu et al. | Design of a passive shoulder lifting exoskeleton of human-machine multi-link | |
| Yang et al. | Design and Evaluation of a Compact 3D End-effector Assistive Robot for Adaptive Arm Support | |
| Yang et al. | CARE Robot: Design and Feasibility Assessment of an End-Effector Based Assistive Robot Manipulator for Adaptive Arm Support | |
| Nassour et al. | Development of a wearable modular imu sensor network suit with a distributed vibrotactile feedback for on-line movement guidance | |
| Yang et al. | Mechanism design and kinematic analysis of a waist and lower limbs cable-driven parallel rehabilitation robot |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24863341; Country of ref document: EP; Kind code of ref document: A1 |