
WO2017167813A1 - Arm position tracking system and method for use during a shoulder flexion exercise - Google Patents

Arm position tracking system and method for use during a shoulder flexion exercise

Info

Publication number
WO2017167813A1
WO2017167813A1 (PCT/EP2017/057435, EP2017057435W)
Authority
WO
WIPO (PCT)
Prior art keywords
marker
elbow
hand
velocity
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2017/057435
Other languages
English (en)
Inventor
Cheng Chen
Xin He
Jin Wang
Sheng Jin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips NV filed Critical Koninklijke Philips NV
Publication of WO2017167813A1 (fr)
Anticipated expiration: legal-status Critical
Ceased: legal-status Critical (current)

Links

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/24Aligning, centring, orientation detection or correction of the image
    • G06V10/245Aligning, centring, orientation detection or correction of the image by locating a pattern; Special marks for positioning

Definitions

  • This invention relates to an arm position tracking system and method for use during a shoulder flexion exercise.
  • Such shoulder flexion exercises are for example used by patients after losing motor function following a stroke.
  • Stroke is one of the leading causes of death in the world and a vast majority of survivors from stroke suffer from a number of dysfunctions. The most common one is motor dysfunction.
  • Patients may be hospitalized for about one month and then discharged home due to the finite number of therapists and rehabilitation centers.
  • To address this, unsupervised rehabilitation systems have been developed. With the help of an unsupervised rehabilitation system, patients can practice a scheduled rehabilitation program outside of hospital.
  • An effective method for motion capture is a vision-based tracking system which utilizes visual markers and a camera.
  • The combination of an infra-red camera and retro-reflective markers is currently used to monitor patients' movements during rehabilitation exercises.
  • A camera continuously captures images of markers placed, for example, at the shoulder, the elbow, and the hand. These markers can be detected in the captured images because the reflective marker materials make them brighter than other parts of the body and the background environment. Using image processing techniques, the upper limb motion can be tracked.
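  • As an outline of this detection step, a minimal sketch follows; the threshold value, the minimum blob size and the function name are illustrative assumptions, not taken from the original:

```python
import numpy as np

def detect_markers(gray, threshold=200, min_pixels=5):
    """Find bright retro-reflective markers in a grayscale frame.

    Returns a list of (x, y) centroids of connected bright regions.
    A simple 4-connected flood fill stands in for a full blob detector.
    """
    bright = gray >= threshold
    visited = np.zeros_like(bright, dtype=bool)
    centroids = []
    h, w = bright.shape
    for sy in range(h):
        for sx in range(w):
            if bright[sy, sx] and not visited[sy, sx]:
                stack, pixels = [(sy, sx)], []
                visited[sy, sx] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and bright[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                if len(pixels) >= min_pixels:  # discard isolated noise pixels
                    ys, xs = zip(*pixels)
                    centroids.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return centroids
```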
  • FIG. 1 shows in schematic form a captured image of the front of a patient with their arm lowered by their side (left image) and with their arm raised to a straight horizontal outstretched position in front (right image).
  • The patient wears a hand marker 10, an elbow marker 12 and a shoulder marker 14.
  • The left image shows the arm by the side, so all three markers are in the field of view of the camera and none is occluded.
  • When the arm is raised to point forward, as in the right image, the position of the elbow marker, and potentially also the shoulder marker, is temporarily lost.
  • A common way of dealing with this issue is to use the position of the elbow marker 12 in the last frame before occlusion as the position of the elbow marker during occlusion.
  • This way of dealing with the occlusion problem causes a jump in the tracking trajectory of the elbow, and a jump of the elbow representation when this position is shown on a display. This has a negative effect on the visual experience of users.
  • Examples in accordance with an aspect of the invention provide a tracking system for visual tracking of the position of the arm of a user of the system during a shoulder flexion exercise, comprising:
  • a set of markers comprising a shoulder marker, elbow marker and hand marker; an image capture system for capturing an image including the markers; and an image processing system for determining the arm position and movement from the markers, wherein the image processing system is adapted to:
  • This system provides a simple way to track the position of the elbow when occlusion occurs, during a shoulder flexion exercise.
  • The occlusion is, for example, by the forearm, in which case only the hand marker may be visible.
  • The arm is to be kept straight, and this gives rise to a predictable fixed relationship between the hand velocity and the elbow velocity.
  • The amount of data processing required to provide a prediction of the occluded elbow position is thus kept to a minimum.
  • The smoothness of the tracked trajectory of the elbow marker position is improved, so that when it is displayed on an output device, the visual experience of users is enhanced.
  • The fixed relationship, for example, comprises a constant ratio.
  • The system is typically arranged so that the camera faces the user front-on.
  • The system is adapted to apply a weighting factor to allocate a dominance level between the position estimated from the partially visible area of the occluded elbow marker in the image and the predicted position of the elbow marker, for the determination of the final elbow marker position.
  • The weighting factor, for example a ratio, can be set to combine the predicted position with the estimate derived from the partially occluded marker in the image, to arrive efficiently at a more accurate position estimate.
  • The weighting factor can be associated with the visible area of the occluded marker. For example, the larger the visible area, the larger the weighting factor for the estimate derived from the image.
  • The determination of the position is then more accurate, especially in the transition period of the elbow marker from being fully occluded to being visible. Accordingly, the estimated motion trajectory of the elbow marker is more consistent and smooth.
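  • As an illustration only, such an area-based weighting might be sketched as follows; the linear weight and the function name are assumptions, since the text does not fix the exact form:

```python
def combine_elbow_estimates(predicted, measured, visible_area, full_area):
    """Blend the model-predicted and image-derived elbow positions.

    The weight grows with the visible fraction of the marker, so the
    image estimate dominates as the marker emerges from occlusion.
    Positions are (x, y) tuples; areas are in pixels.  The linear
    weighting is an illustrative choice, not mandated by the text.
    """
    w = max(0.0, min(1.0, visible_area / full_area))
    return (w * measured[0] + (1 - w) * predicted[0],
            w * measured[1] + (1 - w) * predicted[1])
```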
  • The system is adapted to apply low-pass filtering to allocate a dominance level between the current hand marker velocity and the previous hand marker velocity for the determination of the current hand marker velocity. Such low-pass filtering leads to a smoother predicted position of the occluded marker and a smoother corresponding motion trajectory.
  • The image processing system may be further adapted to take into account the length of the upper arm and the length of the forearm of the user.
  • The system may also be used for tracking movement of the body of the user for other exercises.
  • The assumptions used for the elbow position prediction, for example that the hand and elbow velocities are proportional, are made only for the shoulder flexion exercise.
  • The overall system may however be used for other exercises as well, for which other assumptions can be made.
  • Examples in accordance with another aspect of the invention provide a method for visual tracking of the position of the arm of a user during a shoulder flexion exercise, wherein the user is provided with a set of markers comprising a shoulder marker, elbow marker and hand marker, wherein the method comprises:
  • image processing to determine the arm position and movement from the markers, wherein the image processing comprises:
  • The fixed relationship preferably comprises a constant ratio.
  • The image is, for example, captured from a position facing the user front-on.
  • The image processing system may take into account the length of the upper arm and the length of the forearm of the user.
  • The image processing is implemented at least in part in software.
  • Figure 1 shows a shoulder flexion exercise and is used to explain how occlusion of the elbow marker arises;
  • Figure 2 shows how the arm movement is modeled;
  • Figure 3 shows the relationship between the hand and elbow velocities as a function of the elbow angle;
  • Figure 4 shows a tracking system;
  • Figure 5 shows the results of the tracking; and
  • Figure 6 shows a general computer architecture for performing the processing of the system of Figure 4.
  • The invention provides a tracking system for visual tracking of the position of the arm of a user of the system during a shoulder flexion exercise.
  • Shoulder, elbow and hand markers are captured in an image and image processing is used to determine the arm position and movement from the markers.
  • An elbow marker position is determined, when the elbow marker is occluded, by extrapolation of the elbow marker position before occlusion and taking into account the hand marker velocity by assuming a fixed relationship between the hand and elbow marker velocities. This provides a simple way to model the elbow movement during occlusion.
  • This approach involves predicting the elbow marker position with a prediction model.
  • The prediction model is designed based on the assumption of a constant ratio between the velocities of the hand and the elbow during shoulder flexion.
  • Figure 2 shows an arm in side view, with a shoulder marker M1, an elbow marker M2 and a hand marker M3, and a camera 20 in front of the person.
  • If the elbow joint is maintained stretched out straight when performing shoulder flexion, the arm rotates rigidly about the shoulder marker M1, so the ratio between the elbow velocity and the hand velocity is equal to the ratio between the shoulder-to-elbow distance and the shoulder-to-hand distance.
  • The elbow trajectory may therefore be predicted faithfully from the elbow position before occlusion occurred and the current hand velocity.
  • The velocity ratio between the hand and the elbow is calculated by Equation 3 below, in which l1 is the upper-arm length (shoulder to elbow), l2 is the forearm length (elbow to hand) and θ is the elbow joint angle:

    ratio = √(l1² + l2² − 2·l1·l2·cos θ) / l1    (Equation 3)

  • Figure 3 shows the velocity ratio between the hand and the elbow as a function of the elbow joint angle.
  • According to Equation 3, the velocity ratio is bounded between 1.4 and 2 when the elbow joint angle ranges from 90° to 180°.
  • During the exercise, the velocity ratio between the hand and the elbow is taken to be a constant, defined as in Equation 3.
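  • Under the rigid-rotation assumption the ratio equals the shoulder-to-hand distance divided by the shoulder-to-elbow distance, with the shoulder-to-hand distance given by the law of cosines. A small sketch (the function name is illustrative, and the law-of-cosines form is an assumption consistent with the quoted bound of 1.4 to 2):

```python
import math

def hand_elbow_velocity_ratio(l_upper, l_forearm, elbow_angle_deg):
    """Ratio of hand speed to elbow speed for rotation about the shoulder.

    Both markers sweep circles around the shoulder, so their speeds are
    proportional to their distances from it.  The shoulder-to-hand
    distance follows from the law of cosines with the elbow joint angle.
    """
    theta = math.radians(elbow_angle_deg)
    shoulder_to_hand = math.sqrt(
        l_upper ** 2 + l_forearm ** 2 - 2 * l_upper * l_forearm * math.cos(theta))
    return shoulder_to_hand / l_upper
```

With equal upper-arm and forearm lengths this gives 2 for a straight arm (180°) and about 1.41 at a 90° elbow angle, matching the bounds stated for Equation 3.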
  • Figure 4 shows a tracking system for visual tracking of the position of the arm of a user of the system during a shoulder flexion exercise.
  • The system comprises the set of markers, comprising the shoulder marker 14, elbow marker 12 and hand marker 10, and an image capture system 20 for capturing an image including the markers.
  • An image processing system 22 determines the arm position and movement from the markers.
  • An elbow marker position is worked out, when the elbow marker is occluded, by extrapolation of the elbow marker position before occlusion, taking into account the hand marker velocity and assuming the fixed relationship between the hand and elbow marker velocities explained above.
  • The marker positions, as detected and as interpolated during occlusion, are presented on a display 24.
  • M2.x(n) and M2.y(n) are the real-time coordinates of the elbow marker (M2) position detected from the nth frame (resolved in the x and y directions), and M3.x(n) and M3.y(n) are the real-time coordinates of the hand marker (M3) position detected from the nth frame.
  • The index n−1 refers to the previous frame.
  • A first step is to perform low-pass filtering of the elbow marker (M2) trajectory.
  • Low-pass filtered data is denoted by a subscript "1":
  • A second step is to perform low-pass filtering of the hand marker (M3) trajectory:
  • The marker positions are recorded in two-dimensional Cartesian form in this example, along the x-axis and the y-axis.
  • A third step is to calculate the hand marker (M3) velocity (resolved to the two axis directions):
  • The elbow position can then be predicted based on the hand velocity. Predicted values are denoted by a subscript "p":
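  • The filtering and prediction steps above can be sketched as follows; the first-order filter form, the coefficient value and the frame interval dt are illustrative assumptions, since the original equations are not reproduced in this text:

```python
def lowpass(prev_filtered, current, coef):
    """First-order low-pass filter: coef weights the previous filtered value."""
    return coef * prev_filtered + (1 - coef) * current

def predict_elbow(elbow_prev, hand_lp_prev, hand_lp_curr, ratio, dt=1.0):
    """Predict the occluded elbow position from the hand marker velocity.

    Under the straight-arm assumption the elbow moves like the hand,
    scaled by the reciprocal of the hand/elbow velocity ratio.
    Coordinates are (x, y) tuples; `ratio` comes from Equation 3.
    """
    # hand velocity from the low-pass filtered hand trajectory
    vx = (hand_lp_curr[0] - hand_lp_prev[0]) / dt
    vy = (hand_lp_curr[1] - hand_lp_prev[1]) / dt
    # advance the last known elbow position by the scaled hand motion
    return (elbow_prev[0] + vx * dt / ratio,
            elbow_prev[1] + vy * dt / ratio)
```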
  • This calibration process involves the patient adopting the finished position of the exercise (with their arm(s) extended by their side) so that the three markers are identified and the arm length can then be derived from the three markers when the patient starts to use the system.
  • The low-pass filtered values of M2.x(n) and M2.y(n) remain non-zero during occlusion as a result of the low-pass filtering, which takes account of previous values.
  • When M2.x(n) and M2.y(n) are non-zero, so that there is no occlusion, the actual values can be used.
  • The predicted values can during this time be set equal to the measured and low-pass filtered values. Only when the measured values become zero during occlusion do the prediction equations above need to be used.
  • The prediction is based on the previous elbow position and the current hand velocity, scaled by the (reciprocal of the) ratio of Equation 3.
  • The coefficient settings are chosen to best model the elbow movement. They each lie between 0 and 1. Note that the use of the coefficient Coef 4 is entirely optional and is a fine-tuning refinement.
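  • The per-frame visible/occluded switching described above can be sketched as follows, with (0, 0) standing for a non-detected marker as in the text (function and variable names are illustrative):

```python
def track_elbow(measured, lp_filtered, predicted):
    """Select the elbow position reported for the current frame.

    While the marker is detected (non-zero coordinates), the measured
    and low-pass filtered value is used; only when the measurement
    drops to zero during occlusion does the prediction take over.
    """
    if measured != (0.0, 0.0):   # marker visible: use the measurement
        return lp_filtered
    return predicted             # marker occluded: use the prediction
```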
  • Figure 5 shows test results for the elbow marker trajectory before and after using the prediction model explained above, for a subject performing a shoulder flexion exercise with the elbow joint at 180°.
  • The horizontal axis represents time, indicated by the order number of the frames continuously taken by the image capture system 20.
  • The top image shows the hand marker velocity as plot 50 and the elbow marker velocity as plot 52.
  • The ratio is shown as plot 54.
  • The velocities are all resolved to one axis direction (i.e. one equation of each pair above).
  • The unit of velocity on the vertical axis is pixels per second, which indicates the speed of each marker in terms of the number of pixels it crosses per second between frames.
  • During occlusion, the measured elbow marker velocity 52 is zero.
  • The bottom image shows the tracking trajectory of the elbow marker without and with the prediction model.
  • The vertical axis represents the y coordinate of the elbow marker in each frame taken by the image capture system 20.
  • Plot 56 is without the prediction.
  • Plot 59 shows the prediction model. It overlaps with the more basic model when there is no occlusion, but during occlusion the interpolation can be seen, giving smoother tracking of the elbow marker position.
  • The comparison results show that the prediction model improves the estimate of the elbow marker position during occlusion and removes the jump in its tracking trajectory.
  • The model above for predicting the elbow position is for use during a straight-arm shoulder flexion exercise.
  • For other movements, other models may be used.
  • The system may have the capability of being used for different exercises, and the approach above may then be just one mode of operation.
  • The system described above makes use of a processor for processing data.
  • Figure 6 illustrates an example of a computer 60 for implementing the controller or processor described above.
  • The computer 60 may be, but is not limited to, a PC, workstation, laptop, PDA, palm device, server, storage device, and the like.
  • The computer 60 may include one or more processors 61, memory 62, and one or more I/O devices 63 that are communicatively coupled via a local interface (not shown).
  • The local interface can be, for example but not limited to, one or more buses or other wired or wireless connections, as is known in the art.
  • The local interface may have additional elements, such as controllers, buffers (caches), drivers, repeaters, and receivers, to enable communications. Further, the local interface may include address, control, and/or data connections to enable appropriate communications among the aforementioned components.
  • The processor 61 is a hardware device for executing software that can be stored in the memory 62.
  • The processor 61 can be virtually any custom-made or commercially available processor, a central processing unit (CPU), a digital signal processor (DSP), or an auxiliary processor among several processors associated with the computer 60, and the processor 61 may be a semiconductor-based microprocessor (in the form of a microchip) or a microprocessor.
  • The memory 62 can include any one or a combination of volatile memory elements (e.g., random access memory (RAM), such as dynamic RAM (DRAM) or static RAM (SRAM)) and non-volatile memory elements (e.g., ROM, erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), tape, compact disc read-only memory (CD-ROM), disk, diskette, cartridge, cassette, or the like).
  • The memory 62 may incorporate electronic, magnetic, optical, and/or other types of storage media. Note that the memory 62 can have a distributed architecture, where various components are situated remote from one another but can be accessed by the processor 61.
  • The software in the memory 62 may include one or more separate programs, each of which comprises an ordered listing of executable instructions for implementing logical functions.
  • The software in the memory 62 includes a suitable operating system (O/S) 64, compiler 65, source code 66, and one or more applications 67 in accordance with exemplary embodiments.
  • The application 67 comprises numerous functional components such as computational units, logic, functional units, processes, operations, virtual entities, and/or modules.
  • The operating system 64 controls the execution of computer programs, and provides scheduling, input-output control, file and data management, memory management, and communication control and related services.
  • Application 67 may be a source program, executable program (object code), script, or any other entity comprising a set of instructions to be performed.
  • When it is a source program, the program is usually translated via a compiler (such as the compiler 65), assembler, interpreter, or the like, which may or may not be included within the memory 62, so as to operate properly in connection with the operating system 64.
  • The application 67 can be written in an object-oriented programming language, which has classes of data and methods, or a procedural programming language, which has routines, subroutines, and/or functions, for example but not limited to, C, C++, C#, Pascal, BASIC, API calls, HTML, XHTML, XML, ASP scripts, JavaScript, FORTRAN, COBOL, Perl, Java, ADA, .NET, and the like.
  • the I/O devices 63 may include input devices such as, for example but not limited to, a mouse, keyboard, scanner, microphone, camera, etc. Furthermore, the I/O devices 63 may also include output devices, for example but not limited to a printer, display, etc. Finally, the I/O devices 63 may further include devices that communicate both inputs and outputs, for instance but not limited to, a network interface controller (NIC) or
  • NIC network interface controller
  • The I/O devices 63 also include components for communicating over various networks, such as the Internet or an intranet.
  • When the computer 60 is in operation, the processor 61 is configured to execute software stored within the memory 62, to communicate data to and from the memory 62, and to generally control operations of the computer 60 pursuant to the software.
  • The application 67 and the operating system 64 are read, in whole or in part, by the processor 61, perhaps buffered within the processor 61, and then executed.
  • A computer-readable medium may be an electronic, magnetic, optical, or other physical device or means that can contain or store a computer program for use by or in connection with a computer-related system or method.
  • The system may use multiple markers at the different joints, and it may of course process images of both arms, either sequentially or simultaneously.
  • The invention relates specifically to the use of relative hand and elbow marker velocities to predict elbow movement during occlusion.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention provides a tracking system for visual tracking of the position of the arm of a user of the system during a shoulder flexion exercise. Shoulder, elbow and hand markers are captured in an image, and image processing is used to determine the arm position and movement from the markers. When the elbow marker is occluded, an elbow marker position is determined by extrapolating the elbow marker position before occlusion and taking into account the hand marker velocity, assuming a fixed relationship between the hand and elbow marker velocities. This provides a simple way to model the elbow movement during occlusion.
PCT/EP2017/057435 2016-03-30 2017-03-29 Arm position tracking system and method for use during a shoulder flexion exercise Ceased WO2017167813A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CNPCT/CN2016/077828 2016-03-30
CN2016077828 2016-03-30
EP16168923 2016-05-10
EP16168923.7 2016-05-10

Publications (1)

Publication Number Publication Date
WO2017167813A1 (fr) 2017-10-05

Family

ID=58448562

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2017/057435 Ceased WO2017167813A1 (fr) 2016-03-30 2017-03-29 Arm position tracking system and method for use during a shoulder flexion exercise

Country Status (1)

Country Link
WO (1) WO2017167813A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110826385A (zh) * 2018-06-07 2020-02-21 皇家飞利浦有限公司 Rehabilitation device and method
EP3621083A1 (fr) * 2018-09-10 2020-03-11 Koninklijke Philips N.V. Rehabilitation device and method
WO2022037281A1 (fr) * 2020-08-19 2022-02-24 广西电网有限责任公司贺州供电局 Machine-vision-based real-time monitoring and alarm system for substation field operations


Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110317871A1 (en) * 2010-06-29 2011-12-29 Microsoft Corporation Skeletal joint recognition and tracking system

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
"Stochastic models, estimation, and control", 1 January 1979, ISBN: 978-0-12-480701-3, article PETER S MAYBECK: "Chapter 1: Introduction to Kalman-Filter", pages: 1 - 16, XP055373366 *
ARISTIDOU A ET AL: "Real-Time Estimation of Missing Markers in Human Motion Capture", BIOINFORMATICS AND BIOMEDICAL ENGINEERING, 2008. ICBBE 2008. THE 2ND INTERNATIONAL CONFERENCE ON, IEEE, PISCATAWAY, NJ, USA, 16 May 2008 (2008-05-16), pages 1343 - 1346, XP031267551, ISBN: 978-1-4244-1747-6 *
VAN-HANH NGUYEN ET AL: "Training based on real-time motion evaluation for functional rehabilitation in virtual environment", INTERNATIONAL JOURNAL ON IMAGE AND GRAPHICS, 1 April 2010 (2010-04-01), pages 235 - 250, XP055309394, Retrieved from the Internet <URL:https://hal.archives-ouvertes.fr/hal-01110998/document> [retrieved on 20161011] *


Similar Documents

Publication Publication Date Title
  • US10817795B2 (en) Handstate reconstruction based on multiple inputs
  • KR101606628B1 (ko) Pointing direction detection device and method therefor, and program and computer-readable medium
  • US10394318B2 (en) Scene analysis for improved eye tracking
  • CN113393489A (zh) System, method and medium for vision-based prediction of joint action and pose motion
  • WO2020132110A9 (fr) Fully interactive, real-time virtual sports and wellness training apparatus and physiotherapy system
  • CN110801233B (zh) Human body gait monitoring method and device
  • CN109821239B (zh) Method, device, equipment and storage medium for implementing a motion-sensing game
  • JP7488846B2 (ja) Image IoT platform using a federated learning mechanism
  • WO2007020568A2 (fr) System and method for analysing a user's movements
  • CN106774862B (zh) Gaze-based VR display method and VR device
  • CN111862150A (zh) Image tracking method and device, AR device and computer device
  • CN110051319A (zh) Adjustment method, device and equipment for an eye-tracking sensor, and storage medium
  • CN109781104B (zh) Motion pose determination and positioning method and device, computer equipment and medium
  • CN105892658A (zh) Method for predicting head pose based on a head-mounted display device, and head-mounted display device
  • WO2017167813A1 (fr) Arm position tracking system and method for use during a shoulder flexion exercise
  • Jiang et al. A SLAM-based 6DoF controller with smooth auto-calibration for virtual reality
  • Edwards et al. Low-latency filtering of kinect skeleton data for video game control
  • Guzov et al. HMD^2: Environment-aware Motion Generation from Single Egocentric Head-Mounted Device
  • US20240169498A1 (en) Joint Video Stabilization and Motion Deblurring
  • CN104898823B (zh) Method and device for controlling visual target movement
  • CN109333527B (zh) Interaction method and device for interacting with a robot, electronic equipment and storage medium
  • CN118591818A (zh) Method and electronic device for face undistortion in digital images using a plurality of imaging sensors
  • CN113761965B (zh) Motion capture method and device, electronic equipment and storage medium
  • KR102466996B1 (ko) Eye position prediction method and device
  • CN118648892A (zh) Human body fall prediction method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17714225

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17714225

Country of ref document: EP

Kind code of ref document: A1