WO2010036091A2 - A system and a method for identifying human behavioural intention based on an effective motion analysis - Google Patents
- Publication number
- WO2010036091A2 (PCT/MY2009/000153)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- component
- activity
- distance
- movement
- gait
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/19613—Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion
- G08B13/19615—Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion wherein said pattern is defined by the user
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
- G06V40/25—Recognition of walking or running movements, e.g. gait recognition
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Social Psychology (AREA)
- Psychiatry (AREA)
- Health & Medical Sciences (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Closed-Circuit Television Systems (AREA)
- Image Analysis (AREA)
Abstract
With the growing market for video surveillance in the security area, there is a need for an automated system that can track and detect human intention based on a particular human motion. The present invention relates to a system and a method for identifying human behavioral intention based on effective motion analysis, wherein the system obtains a sequence of raw images taken from a live scene and processes the raw images in an activity analysis component. The activity analysis component is further provided with an activity enrollment component and an activity detection component.
Description
A SYSTEM AND A METHOD FOR IDENTIFYING HUMAN BEHAVIOURAL INTENTION BASED ON AN EFFECTIVE MOTION ANALYSIS
FIELD OF INVENTION
The present invention relates to a system and a method for identifying human behavioral intention based on effective human motion analysis.
BACKGROUND OF INVENTION
There is a growing interest in activity analysis and in predicting human intention based on human motion, due to the immense need for better automated video surveillance systems that go beyond merely identifying simple objects. The capability to automatically monitor human motion using computers in security-sensitive areas such as airports, borders and building lobbies is of great interest to security personnel, e.g. police and military.
A recent market study from IMS Research has found that the trend from analogue CCTV to network video surveillance is in full swing. The world market for network video surveillance products increased by an impressive 41.9% in 2006 and is forecasted to continue growing strongly for many years to come. By 2010, the combined market for network cameras, video servers and NVRs is forecasted to exceed US$2.6 billion.
In China, the market for video servers (video encoders) used in security applications increased by a massive 60% in 2007, according to a report titled "The Chinese Market for CCTV and Video Surveillance Equipment" by IMS Research. The market is forecasted to continue growing rapidly over the coming years to exceed US$150 million by 2011. This dramatic growth is primarily attributed to the strong demand for IP-based video surveillance systems in China. More and more users of security systems are choosing networked solutions based on video servers, instead of traditional analogue CCTV systems.
30.3% forecast for video camera servers. Together, these markets will be worth some EUR 151.1 million by 2008. These are summarized in Table 1.
Traditionally, motion analysis of humans has used markers attached to appropriate parts of a human body to highlight the movement of these points and how they relate to each motion sequence. These were extensively used in sports to enhance the performance of athletes.
Passive reflective markers are placed on the subjects (in this case, humans) at specific anatomical landmarks. As the subjects walk through a lab, the three-dimensional location of each marker is detected by multiple infrared cameras. A biomechanics model is applied to the marker series to calculate the three-dimensional motion of each body segment. The processed data generates a graphical representation of each joint in all three planes and is expressed in terms of a gait cycle.
SUMMARY OF INVENTION
Accordingly, the present invention provides a system for identifying a behavioral intention of a human being based on an effective motion analysis. The system includes an image acquisition component, wherein the image acquisition component acquires a plurality of object images in sequence; an activity enrollment component, which includes a background and foreground detection component, an object detection component, an object partitioning component, a key point extraction component, a gait feature extraction component, an activity registration component and an activity storage means; and an activity detection component, which includes a background and foreground detection component, an object detection component, an object partitioning component, a key point extraction component, a gait feature extraction component, an activity matching component and an activity storage means; characterized in that the gait feature extraction components compute features related to the movement of an upper limb of the human in relation to the movement of a lower limb in a gait cycle, wherein the gait cycle consists of a stance phase and a swing phase.

Furthermore, the present invention also provides a method for identifying a behavioral intention of a human being based on an effective motion analysis. The method includes acquiring a plurality of object images in sequence; enrolling data in a background and foreground detection component, an object detection component, an object partitioning component, a key point extraction component, a gait feature extraction component, an activity registration component and an activity storage means; and detecting and matching features using a background and foreground detection component, an object detection component, an object partitioning component, a key point extraction component, a gait feature extraction component, an activity matching component and an activity storage means; characterized in that the method further includes calculating features related to the movement of an upper limb of the human in relation to the movement of a lower limb in a gait cycle using the gait feature extraction components, wherein the gait cycle consists of a stance phase and a swing phase.
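For orientation, here is a minimal, hypothetical sketch in Python of how the enrollment path described above could be wired together. The class names, the stage interface and the stand-in stage are illustrative assumptions, not the patent's implementation; the detection path would reuse the same front end with the registration call swapped for the matching component described later.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Sequence

# A stage consumes the previous stage's output and produces its own, e.g.
# images -> foreground masks -> object -> body parts -> key points -> features.
Stage = Callable[[object], object]

@dataclass
class ActivityDatabase:
    """Stands in for the activity storage means (56/67)."""
    records: Dict[str, List[List[float]]] = field(default_factory=dict)

    def register(self, label: str, features: List[float]) -> None:
        self.records.setdefault(label, []).append(features)

@dataclass
class GaitFrontEnd:
    """The five stages shared by enrollment (50-54) and detection (60-64)."""
    stages: Sequence[Stage]

    def features(self, image_sequence: object) -> List[float]:
        data = image_sequence
        for stage in self.stages:
            data = stage(data)
        return data

def enroll(front_end: GaitFrontEnd, db: ActivityDatabase,
           images: object, label: str) -> None:
    """Activity registration (55): bind the computed gait features to a label."""
    db.register(label, front_end.features(images))

# Toy usage with a single stand-in stage; real stages would do the vision work.
db = ActivityDatabase()
front_end = GaitFrontEnd(stages=[lambda imgs: [1.0] * 6])  # pretend: six gait features
enroll(front_end, db, images=[], label="walking")
print(db.records)  # {'walking': [[1.0, 1.0, 1.0, 1.0, 1.0, 1.0]]}
```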
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will be fully understood from the detailed description given herein below and the accompanying drawings, which are given by way of illustration only and thus are not limitative of the present invention, wherein:
Fig. 1 is a block diagram representation of a system and a method for identifying behavioral intention of a human being based on an effective motion analysis, according to the preferred embodiment of the present invention;
Fig. 2 shows a representation of an upper limb motion in a stance phase of a gait cycle;
Fig. 3 shows a representation of an upper limb motion in a swing phase of a gait cycle;
Fig. 4 is a comparison table between upper limb movement and lower limb movement in a typical gait cycle;
Fig. 10 is a diagram illustrating the computation of a distance of an upper limb movement in a swing phase;
Fig. 11 is a diagram illustrating the computation of a distance of a lower limb movement in a swing phase.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
From the detected object of interest, an object partitioning component (52) divides the object of interest into four main parts, which are the head, torso, arms and legs. From the arms and legs, a key point extraction component (53) computes important points on these two parts to be used in a gait feature extraction component (54). The important points may include the corner points, high-curvature points and joining points that are detected from the outline of the arms and legs.
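The patent does not fix a particular key point detector beyond corners, high-curvature points and joining points on the outline. The sketch below is one plausible reading, assuming OpenCV, a binary silhouette mask for an arm or leg region, and an angle-based curvature test; the neighbourhood spacing and threshold are illustrative assumptions.

```python
import cv2
import numpy as np

def high_curvature_points(mask: np.ndarray, step: int = 5,
                          angle_thresh_deg: float = 60.0):
    """Return outline points whose turning angle deviates sharply from straight.

    mask: uint8 binary silhouette (255 = subject), e.g. an arm or leg region.
    step: spacing of the neighbours used to estimate local curvature.
    """
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return []
    outline = max(contours, key=cv2.contourArea).squeeze(1)  # (N, 2) points
    n = len(outline)
    key_points = []
    for i in range(n):
        prev_pt = outline[(i - step) % n].astype(float)
        cur_pt = outline[i].astype(float)
        next_pt = outline[(i + step) % n].astype(float)
        v1, v2 = prev_pt - cur_pt, next_pt - cur_pt
        denom = np.linalg.norm(v1) * np.linalg.norm(v2)
        if denom == 0:
            continue
        cos_angle = np.clip(np.dot(v1, v2) / denom, -1.0, 1.0)
        # A straight outline gives an angle near 180 deg; corners give small angles.
        if np.degrees(np.arccos(cos_angle)) < 180.0 - angle_thresh_deg:
            key_points.append(tuple(cur_pt.astype(int)))
    return key_points
```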
The gait feature extraction component (54) computes the features related to the movement of the arms and legs. The computed gait features from the sequence of images are registered to a particular activity or intention through an activity registration component (55). The gait features and the registered activity or intention are stored in the activity database (56).
Fig. 6 depicts a detailed architecture of the activity detection component, with the assumption that the raw video images (68) from the image acquisition component are available in real time. The video images (68) are fed into the same components as described for the activity enrollment component (refer to Fig. 5), except that the activity registration component (55) is replaced with an activity matching component (65). In the activity matching component (65), the computed gait features from the sequence of video images (68) are compared with the registered gait features in the activity database (67). A matching process will be conducted to match the detected gait features against the registered activity or intention, and vice versa.
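One simple way to realise this matching process is a nearest-neighbour comparison between the live gait feature vector and each registered vector. The Euclidean metric, the dictionary layout and the acceptance threshold below are assumptions for illustration, not the patent's stated method; the threshold lets the matcher reject observations that resemble no registered activity.

```python
import math

def match_activity(features, database, threshold=1.0):
    """Return the registered activity whose gait features are closest, or None.

    features: list of the six gait distances computed from the live video.
    database: dict mapping activity label -> list of registered feature vectors.
    """
    best_label, best_dist = None, float("inf")
    for label, registered in database.items():
        for ref in registered:
            dist = math.dist(features, ref)  # Euclidean distance
            if dist < best_dist:
                best_label, best_dist = label, dist
    return best_label if best_dist <= threshold else None

# Example: two registered intentions, one live observation.
db = {"walking": [[10.0, 30.0, 12.0, 28.0, 22.0, 58.0]],
      "loitering": [[3.0, 5.0, 2.0, 4.0, 5.0, 9.0]]}
print(match_activity([9.5, 29.0, 12.5, 28.5, 22.0, 57.5], db, threshold=5.0))  # -> "walking"
```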
Table 2 summarizes the six main features computed in the gait feature extraction component.
For feature 5, the distance of arm movement in one gait cycle is computed by adding up the circular distances from feature 1 and feature 3. Basically,

$$\text{Feature 5} = D_{stance}^{arm} + D_{swing}^{arm}$$

where

$D_{stance}^{arm}$ : distance of arm movement in a stance phase

$D_{swing}^{arm}$ : distance of arm movement in a swing phase
For feature 6, the distance of leg movement in one gait cycle is computed by adding the horizontal distances from feature 2 and feature 4. Basically,

$$\text{Feature 6} = D_{stance}^{leg} + D_{swing}^{leg}$$

where

$D_{stance}^{leg}$ : distance of leg movement in a stance phase

$D_{swing}^{leg}$ : distance of leg movement in a swing phase
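Expressed as code, features 5 and 6 simply sum the per-phase distances (features 1 to 4) over one gait cycle; the function and argument names in this sketch are illustrative.

```python
def gait_cycle_distances(d_arm_stance: float, d_leg_stance: float,
                         d_arm_swing: float, d_leg_swing: float) -> tuple:
    """Features 5 and 6: total arm and leg travel over one gait cycle.

    The four arguments are features 1-4, the per-phase distances;
    a gait cycle is one stance phase plus one swing phase.
    """
    feature_5 = d_arm_stance + d_arm_swing  # distance of arm movement in one gait cycle
    feature_6 = d_leg_stance + d_leg_swing  # distance of leg movement in one gait cycle
    return feature_5, feature_6

# Example: the arm travels 12.0 (stance) + 10.0 (swing); the leg 30.0 + 28.0.
print(gait_cycle_distances(12.0, 30.0, 10.0, 28.0))  # -> (22.0, 58.0)
```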
Claims
1. A system for identifying a behavioral intention of a human being based on an effective motion analysis, the system includes:
a. an image acquisition component, wherein the image acquisition component acquires a plurality of object images in sequence;
b. an activity enrollment component, which includes a background and foreground detection component (50), an object detection component (51), an object partitioning component (52), a key point extraction component (53), a gait feature extraction component (54), an activity registration component (55) and an activity storage means (56); and
c. an activity detection component, which includes a background and foreground detection component (60), an object detection component (61), an object partitioning component (62), a key point extraction component (63), a gait feature extraction component (64), an activity matching component (65) and an activity storage means (67);
characterized in that
the gait feature extraction components (54, 64) compute features related to the movement of an upper limb of the human in relation to the movement of a lower limb in a gait cycle, wherein the gait cycle consists of a stance phase and a swing phase.
2. The system as claimed in claim 1, wherein the background and foreground detection components (50, 60) determine the presence of moving objects.
3. The system as claimed in claim 2, wherein the object detection components (51, 61) highlight the moving object as an object of interest.
4. The system as claimed in claim 3, wherein the object partitioning components (52, 62) divide the object of interest into four main parts: 1) head, 2) torso, 3) upper limbs and 4) lower limbs.
5. The system as claimed in claim 1, wherein the key point extraction components (53, 63) compute important points of the four main parts of the divided object from the object partitioning component.
6. The system as claimed in claim 1, wherein the gait feature extraction components (54, 64) extract one or more of the following features:

a. distance of upper limb movement in stance phase;
b. distance of lower limb movement in stance phase;
c. distance of upper limb movement in swing phase;
d. distance of lower limb movement in swing phase;
e. distance of upper limb movement in one gait cycle; or
f. distance of lower limb movement in one gait cycle.
7. The system as claimed in claim 1, wherein a plurality of object images in sequence is acquired by means of a video camera or a still camera.
8. The system as claimed in claim 1, wherein the computation processes are done using a computing means such as a computer.
9. A method for identifying a behavioral intention of a human being based on an effective motion analysis, the method includes:
a. acquiring a plurality of object images in sequence;
b. enrolling data in a background and foreground detection component (50), an object detection component (51), an object partitioning component (52), a key point extraction component (53), a gait feature extraction component (54), an activity registration component (55) and an activity storage means (56); and
c. detecting features and matching features using a background and foreground detection component (60), an object detection component (61), an object partitioning component (62), a key point extraction component (63), a gait feature extraction component (64), an activity matching component (65) and an activity storage means (67)
characterized in that
the method further includes calculating features related to the movement of an upper limb of the human in relation to the movement of a lower limb in a gait cycle using the gait feature extraction components (54, 64), wherein the gait cycle consists of a stance phase and a swing phase.
10. The method as claimed in claim 9, wherein the background and foreground detection components (50, 60) determine the presence of moving objects.
11. The method as claimed in claim 9, wherein the object detection components (51, 61) highlight the moving object as an object of interest.
12. The method as claimed in claim 9, wherein the object partitioning components (52, 62) divide the object of interest into four main parts: 1) head, 2) torso, 3) upper limbs and 4) lower limbs.
13. The method as claimed in claim 9, wherein the key point extraction components (53, 63) compute important points of the four main parts of the divided object from the object partitioning component.
14. The method as claimed in claim 9, wherein the activity enrollment component is performed in offline mode.
15. The method as claimed in claim 9, wherein the gait feature extraction components (54, 64) extract one or more of the following features:

a. distance of upper limb movement in stance phase;
b. distance of lower limb movement in stance phase;
c. distance of upper limb movement in swing phase;
d. distance of lower limb movement in swing phase;
e. distance of upper limb movement in one gait cycle; or
f. distance of lower limb movement in one gait cycle.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN2009801470533A CN102224526A (en) | 2008-09-24 | 2009-09-18 | System and method for identifying human behavioral intentions based on effective motion analysis |
| EP09816484.1A EP2327057A4 (en) | 2008-09-24 | 2009-09-18 | A system and a method for identifying human behavioural intention based on an effective motion analysis |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| MYPI20083761A MY159289A (en) | 2008-09-24 | 2008-09-24 | A system and a method for identifying human behavioural intention based on an effective motion analysis |
| MYPI20083761 | 2008-09-24 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| WO2010036091A2 true WO2010036091A2 (en) | 2010-04-01 |
| WO2010036091A3 WO2010036091A3 (en) | 2010-06-24 |
Family
ID=42060321
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/MY2009/000153 Ceased WO2010036091A2 (en) | 2008-09-24 | 2009-09-18 | A system and a method for identifying human behavioural intention based on an effective motion analysis |
Country Status (4)
| Country | Link |
|---|---|
| EP (1) | EP2327057A4 (en) |
| CN (1) | CN102224526A (en) |
| MY (1) | MY159289A (en) |
| WO (1) | WO2010036091A2 (en) |
Families Citing this family (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP5686108B2 (en) * | 2012-02-24 | 2015-03-18 | 株式会社ダイフク | Sorting equipment provided with an erroneous work prevention device and an erroneous work prevention device |
| CN102881100B (en) * | 2012-08-24 | 2017-07-07 | 济南纳维信息技术有限公司 | Entity StoreFront anti-thefting monitoring method based on video analysis |
| CN107423730B (en) * | 2017-09-20 | 2024-02-13 | 湖南师范大学 | Human gait behavior active detection and recognition system and method based on semantic folding |
| US10387737B1 (en) * | 2018-02-02 | 2019-08-20 | GM Global Technology Operations LLC | Rider rating systems and methods for shared autonomous vehicles |
| JP7283037B2 (en) * | 2018-07-26 | 2023-05-30 | ソニーグループ株式会社 | Information processing device, information processing method, and program |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6369794B1 (en) * | 1998-09-09 | 2002-04-09 | Matsushita Electric Industrial Co., Ltd. | Operation indication outputting device for giving operation indication according to type of user's action |
| US7330566B2 (en) * | 2003-05-15 | 2008-02-12 | Microsoft Corporation | Video-based gait recognition |
| US7212651B2 (en) * | 2003-06-17 | 2007-05-01 | Mitsubishi Electric Research Laboratories, Inc. | Detecting pedestrians using patterns of motion and appearance in videos |
| JP5028751B2 (en) * | 2005-06-09 | 2012-09-19 | ソニー株式会社 | Action recognition device |
2008
- 2008-09-24 MY MYPI20083761A patent/MY159289A/en unknown

2009
- 2009-09-18 CN CN2009801470533A patent/CN102224526A/en active Pending
- 2009-09-18 WO PCT/MY2009/000153 patent/WO2010036091A2/en not_active Ceased
- 2009-09-18 EP EP09816484.1A patent/EP2327057A4/en not_active Withdrawn
Cited By (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20110092248A1 (en) * | 2009-10-21 | 2011-04-21 | Xerox Corporation | Portable security system built into cell phones |
| US8744522B2 (en) * | 2009-10-21 | 2014-06-03 | Xerox Corporation | Portable security system built into cell phones |
| CN102119877A (en) * | 2010-12-15 | 2011-07-13 | 河北工业大学 | Method for creating expert knowledge base for automatically training lower artificial limbs |
| WO2013122675A3 (en) * | 2011-12-16 | 2013-11-28 | The Research Foundation For The State University Of New York | Methods of recognizing activity in video |
| CN103886588A (en) * | 2014-02-26 | 2014-06-25 | 浙江大学 | Feature extraction method of three-dimensional human body posture projection |
| CN103886588B (en) * | 2014-02-26 | 2016-08-17 | 浙江大学 | A kind of feature extracting method of 3 D human body attitude projection |
| US10783362B2 (en) | 2017-11-03 | 2020-09-22 | Alibaba Group Holding Limited | Method and apparatus for recognizing illegal behavior in unattended scenario |
| US10990813B2 (en) | 2017-11-03 | 2021-04-27 | Advanced New Technologies Co., Ltd. | Method and apparatus for recognizing illegal behavior in unattended scenario |
| CN112464734A (en) * | 2020-11-04 | 2021-03-09 | 昆明理工大学 | Vision-based quadruped animal walking motion characteristic automatic identification method |
| CN112464734B (en) * | 2020-11-04 | 2023-09-15 | 昆明理工大学 | Automatic identification method for walking motion characteristics of quadruped based on vision |
Also Published As
| Publication number | Publication date |
|---|---|
| EP2327057A4 (en) | 2017-11-22 |
| EP2327057A2 (en) | 2011-06-01 |
| WO2010036091A3 (en) | 2010-06-24 |
| CN102224526A (en) | 2011-10-19 |
| MY159289A (en) | 2016-12-30 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2010036091A2 (en) | A system and a method for identifying human behavioural intention based on an effective motion analysis | |
| Bouchrika et al. | On using gait in forensic biometrics | |
| US8639020B1 (en) | Method and system for modeling subjects from a depth map | |
| CN114067358A (en) | Human body posture recognition method and system based on key point detection technology | |
| US9235753B2 (en) | Extraction of skeletons from 3D maps | |
| Mahfouf et al. | Investigating the use of motion-based features from optical flow for gait recognition | |
| Mehrizi et al. | A Deep Neural Network-based method for estimation of 3D lifting motions | |
| CN108875507B (en) | Pedestrian tracking method, apparatus, system, and computer-readable storage medium | |
| WO2012023766A2 (en) | Security camera tracking and monitoring system and method using thermal image coordinates | |
| CN115376034A (en) | Motion video acquisition and editing method and device based on human body three-dimensional posture space-time correlation action recognition | |
| Dhulekar et al. | Motion estimation for human activity surveillance | |
| CN113196283A (en) | Attitude estimation using radio frequency signals | |
| El-Sallam et al. | A low cost 3D markerless system for the reconstruction of athletic techniques | |
| Liu et al. | Analysis of human walking posture using a wearable camera | |
| Zhao et al. | LiDAR-based human pose estimation with MotionBERT | |
| CN111062295A (en) | Area positioning method and device, and storage medium | |
| Bouchrika et al. | Recognizing people in non-intersecting camera views | |
| Decker et al. | An alternative approach to normalization and evaluation for gait patterns: Procrustes analysis applied to the cyclograms of sprinters and middle-distance runners | |
| Huang et al. | An Algorithm for Standing Long Jump Distance Measurement Based on Improved YOLOv11 and Lightweight Pose Estimation | |
| Goffredo et al. | Performance analysis for gait in camera networks | |
| CN119722751B (en) | Multi-camera multi-target tracking method and device | |
| Hachaj et al. | How Repetitive are Karate Kicks Performed by Skilled Practitioners? | |
| Krzeszowski et al. | The application of multiview human body tracking on the example of hurdle clearance | |
| Lok et al. | Model-based human motion analysis in monocular video | |
| Bouchrika et al. | Markerless extraction of gait features using haar-like template for view-invariant biometrics |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | WWE | Wipo information: entry into national phase | Ref document number: 200980147053.3; Country of ref document: CN |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 09816484; Country of ref document: EP; Kind code of ref document: A2 |
| | WWE | Wipo information: entry into national phase | Ref document number: 1972/DELNP/2011; Country of ref document: IN; Ref document number: 2009816484; Country of ref document: EP |
| | NENP | Non-entry into the national phase | Ref country code: DE |