WO2018170713A1 - Method and device for controlling a robot car based on gesture recognition
- Publication number
- WO2018170713A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- palm
- gesture
- determining
- preset time
- starting position
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Definitions
- The invention relates to the field of image information processing and human-computer interaction control, and in particular to a method and device for controlling a car based on gesture recognition.
- Gesture image recognition technology can be roughly divided into static gesture recognition and dynamic gesture recognition.
- Static gesture recognition learns a mapping from the image feature space to the hand-shape space. Specifically, a set of gestures is first predefined and a unique feature description is extracted for each gesture; the image feature space is then mapped directly to the gesture space to obtain an estimate of the gesture, that is, the gesture or gesture state is identified from image features (points, lines, angles, texture regions, and so on).
- This method requires learning and training on data templates representing the various gestures.
- Commonly extracted gesture features include skin-color features, Haar features, HOG features, shape features, and the like.
- Dynamic gesture recognition is mainly based on motion-detection algorithms: a predefined dynamic gesture is detected from a feature description of the motion region.
- Research on dynamic gesture detection is relatively mature, but under interference factors such as illumination and background, the recognition rate drops sharply.
- The main object of the present invention is to provide a car control method based on gesture recognition, which aims to solve the problem of reduced recognition rate caused by interference from ambient light, background, and the like in existing dynamic gesture detection.
- To this end, the present invention provides a car control method based on gesture recognition, the control method comprising the following steps:
- determining a starting position and an ending position of the palm, identifying the motion trajectory of the palm from these positions, and determining a gesture signal of the palm according to that trajectory;
- retrieving a gesture control command corresponding to the gesture signal; and
- controlling the motion of the car according to the gesture control command.
- determining the starting position of the palm includes:
- the center point of the region where the palm is located within the preset time is identified as the starting position of the palm.
- determining the end position of the palm includes:
- the center point of the region where the palm is located in the second preset time is identified as the end position of the palm.
- the judgment of palm shake is made by determining the change in the palm's displacement according to the gray value of the palm.
- identifying the motion trajectory of the palm from the starting position and the ending position of the palm specifically includes:
- a coordinate system is established based on the starting position of the palm, and the motion trajectory of the palm is recognized by determining the coordinates of the starting position and the ending position.
- the present invention further provides a car control device based on gesture recognition, the device comprising:
- a judging module configured to identify the motion trajectory of the palm from the starting position and the ending position of the palm, and to determine the gesture signal of the palm according to that trajectory;
- a command generating module configured to retrieve the gesture control command corresponding to the gesture signal;
- a control module configured to control the movement of the car according to the gesture control command.
- the determining module is specifically configured to determine the position of the palm according to the joint point of the palm, and to determine whether the palm's jitter range stays within a preset value within a preset time; when the jitter range is within the preset value, the gesture is recognized as valid, and the center point of the region where the palm is located within the preset time is identified as the starting position of the palm.
- the determining module is further configured to delay a first preset time after the initial position of the palm is recognized, and, once the first preset time has elapsed, to determine within a second preset time whether the first jitter range of the palm is within a first preset value; when the first jitter range of the palm is within the first preset value, the center point of the region where the palm is located within the second preset time is identified as the end position of the palm.
- the judgment of palm shake is made by determining the change in the palm's displacement according to the gray value of the palm.
- the determining module is specifically configured to establish a coordinate system based on a starting position of the palm, and identify a motion trajectory of the palm by determining coordinates of the starting position and the ending position.
- The invention recognizes the start and end positions of the palm, determines the palm's gesture signal from these two positions, retrieves the gesture control command corresponding to that gesture signal, and finally
- controls the movement of the car according to the command. It thereby recognizes palm gestures in a simple and accurate manner to control the car's movement, and solves the problem of reduced recognition rate caused by interference from ambient light, background, and the like in existing dynamic gesture detection.
- In addition, by judging the jitter range based on the gray value, continuous gesture recognition can be realized within a certain distance range.
- FIG. 1 is a schematic flow chart of a first embodiment of a method for controlling a car based on gesture recognition according to the present invention
- FIG. 2 is a schematic diagram of functional modules of a first embodiment of a vehicle control device based on gesture recognition according to the present invention.
- FIG. 1 is a flow chart showing a method of controlling a car based on gesture recognition according to a first embodiment of the present invention.
- the gesture recognition based car control method of the embodiment of the present invention includes the following steps:
- Step S10: determine the starting position and ending position of the palm, identify the palm's movement trajectory from these positions, and determine the gesture signal of the palm according to that trajectory.
- The present invention recognizes a gesture with an image sensor such as the Kinect. Unlike conventional methods that analyze the full palm image for gesture recognition, the present invention only needs to recognize the start and end positions of the palm; from these two positions the directional gesture signal of the palm can be determined.
- Specifically, the starting point of the palm is recognized from the palm image and taken as the origin of a plane rectangular coordinate system; the position of the ending point in this coordinate system is then determined,
- and, combined with the origin, yields the current trajectory of the palm, from which the basic gesture is derived: palm moving left, right, up, or down, or clenched into a fist.
- Step S20: retrieve the gesture control command corresponding to the gesture signal.
- The car controller retrieves from memory the gesture control command corresponding to the gesture signal. For example, the five gesture signals (palm moving left, right, up, or down, or clenching into a fist) correspond to the car's five motions (forward, backward, left turn, right turn, and stop), and the mapping is easily obtained with a memory lookup table.
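The lookup-table step described above can be sketched as a plain dictionary. The gesture names and command strings below are illustrative assumptions, not identifiers taken from the patent:

```python
# Hypothetical gesture-signal -> control-command lookup table, mirroring
# the five gesture / five motion mapping described in the text.
GESTURE_TO_COMMAND = {
    "left": "turn_left",
    "right": "turn_right",
    "up": "forward",
    "down": "backward",
    "fist": "stop",
}

def command_for_gesture(gesture: str) -> str:
    """Retrieve the control command for a recognized gesture signal."""
    return GESTURE_TO_COMMAND[gesture]
```

A dictionary lookup plays the role of the "memory lookup table" the text mentions; any unrecognized gesture would raise a `KeyError`, which a real controller would need to handle.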
- Step S30: control the car's motion according to the gesture control command.
- After obtaining the gesture control command from the gesture signal, the car controller drives the car to perform the corresponding action according to the command.
- Specifically, the two wheels of the car are each driven by their own motor, and the car can be made to perform the corresponding action by controlling the direction of rotation of the two motors.
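The differential-drive idea above can be sketched as a mapping from a command to the direction of each wheel motor. The command names and the sign convention (1 = forward, -1 = reverse, 0 = stopped) are assumptions for illustration:

```python
# Hypothetical mapping from a control command to the rotation direction
# of the (left, right) wheel motors of a two-motor differential drive.
MOTOR_DIRECTIONS = {
    "forward": (1, 1),
    "backward": (-1, -1),
    "turn_left": (-1, 1),   # left wheel reverses, right wheel advances
    "turn_right": (1, -1),  # right wheel reverses, left wheel advances
    "stop": (0, 0),
}

def motor_directions(command: str) -> tuple:
    """Return the (left, right) motor directions for a control command."""
    return MOTOR_DIRECTIONS[command]
```

Turning by spinning the two wheels in opposite directions is one common differential-drive choice; a gentler design could instead stop one wheel while the other advances.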
- The gesture command thus controls the movement of the car: palm gestures are recognized in a simple and accurate way to steer the car, solving the problem of reduced recognition rate caused by interference from ambient lighting, background, and the like in existing dynamic gesture detection.
- Moreover, by judging the jitter range based on the gray value, the present invention can realize continuous gesture recognition at distances of 2 to 20 meters.
- the second embodiment of the gesture recognition-based car control method is based on the first embodiment of the above-described gesture recognition-based car control method of the present invention.
- In this embodiment, determining the starting position of the palm specifically includes: determining the palm position from the palm joint point, and determining whether the palm's jitter range stays within a preset value within a preset time; when it does, the gesture is recognized as valid, and the center point of the palm's jitter range is identified as the starting position of the palm.
- Determining the position of the palm in this embodiment specifically means identifying the joint point of the palm (which can be done with a ready-made library function) and deriving the palm position (a single point) from that joint point.
- Before the starting position is accepted, a validity check, that is, de-jitter processing, is performed: it is determined whether the palm's jitter range stays within a preset value within a preset time; if so, the gesture is recognized as valid.
- For example, the jitter range must not exceed a preset value of 5 cm within a preset time of 0.5 seconds; if it does not, the gesture is recognized as valid and the center point of the region where the palm is located within the preset time is identified as the palm's starting position.
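The de-jitter check can be sketched as follows. Positions are assumed to be (x, y) pairs in centimetres sampled over the 0.5 s preset window; the sampling interface itself is an assumption, not part of the patent:

```python
# Sketch of the de-jitter check: a gesture start is valid only if all
# palm samples captured within the preset time stay within the preset
# jitter range (5 cm in the text's example). Returns the center of the
# sampled region as the starting position, or None if the palm shook
# too much.
def detect_start(samples, preset_value_cm=5.0):
    """samples: list of (x, y) palm positions over the preset window."""
    xs = [p[0] for p in samples]
    ys = [p[1] for p in samples]
    if max(xs) - min(xs) <= preset_value_cm and max(ys) - min(ys) <= preset_value_cm:
        # Center point of the region the palm occupied in the window.
        return (sum(xs) / len(xs), sum(ys) / len(ys))
    return None
```

Taking the mean of the samples as the "center point of the region" is one reasonable reading of the text; the midpoint of the bounding box would be another.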
- The third embodiment of the gesture-recognition-based car control method builds on the first embodiment of the method described above.
- Determining the end position of the palm includes: after the initial position of the palm is recognized, delaying a first preset time; once the first preset time has elapsed, determining within a second preset time whether the first jitter range of the palm is within a first preset value; and, when it is, identifying the center point of the region where the palm is located within the second preset time as the end position of the palm.
- The third embodiment differs from the second in that, after the initial position of the palm is recognized, a first preset time must elapse during which no recognition processing is performed. After the first preset time, it is determined over a second preset time whether the first jitter range of the palm is within the first preset value; if so, the gesture is recognized as valid. For example, after the initial position is recognized, a first preset time of 0.5 seconds is allowed to pass; then, within a second preset time of 0.5 seconds, the first jitter range of the palm is checked against a first preset
- value of 5 cm; if the jitter does not exceed it, the center point of the region where the palm is located within the 0.5-second second preset time is identified as the end position of the palm.
- The purpose of the first preset delay is that, once the starting position of the palm has been recognized, the subsequent gesture is still in continuous motion, so whether the gesture has ended cannot be judged immediately; a suitable delay,
- such as 0.5 seconds, is chosen according to the gesture speed of different users. After 0.5 seconds the palm's jitter can be judged, so the end position of the palm can be determined accurately.
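The two-window timing can be sketched as a pure function over timestamped samples, with timestamps in seconds relative to the moment the start position was recognized. The sample representation is an assumption:

```python
# Sketch of the end-position timing: samples inside the first preset
# time (the delay) are ignored; only samples in the second preset
# window [t1, t1 + t2) are then de-jittered like the start position.
def end_window_samples(timed_samples, t1=0.5, t2=0.5):
    """timed_samples: list of (t, (x, y)) pairs, t in seconds since
    the start position was recognized. Returns only the positions
    falling in the second preset window."""
    return [p for (t, p) in timed_samples if t1 <= t < t1 + t2]
```

The positions this function keeps would then be fed to the same jitter check used for the start position to yield the end position.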
- The fourth embodiment of the gesture-recognition-based car control method builds on the second or third embodiment of the method described above.
- In this embodiment, the judgment of palm shake is made by determining the change in the palm's displacement according to the gray value of the palm.
- Depth information about an object can be acquired with the Kinect infrared sensor. When the user's preset motion allows the region where the palm is located to be recognized, the gray value parameter of the image in that region, that is, the depth information, can be further read; this gives the distance between the palm and the Kinect sensor.
- From the depth information of the user's palm position, the approximate distance to the Kinect sensor can be placed in distance intervals such as 2-3 meters, 3-4 meters, ..., 19-20 meters; from that distance, the on-screen movement that corresponds to the palm moving a preset real-world distance, such as 5 cm, can be converted.
- The preset gray value parameter is, in other words, the depth information parameter.
- The depth image of the gesture can be separated out by means of the depth information of the region where the user's palm is located.
- Combining this depth information with the palm's jitter range on screen, the actual jitter range of the palm can be identified. For example, after the palm (a single point) is located via the palm joint, the palm is judged from its depth information to be 10 meters from the lens; if its on-screen movement stays within 0.1 cm over 0.5 s, the actual palm shake can be judged to be within 5 cm, and since this is within the preset value the gesture is recognized as valid.
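The distance-dependent conversion can be sketched with a simple pinhole-style assumption: apparent on-screen size scales inversely with distance. The calibration point (5 cm real movement appears as 0.1 cm on screen at 10 m) follows the example in the text; the exact camera optics are an assumption:

```python
# Convert the real-world jitter threshold into an on-screen threshold
# for a palm at a given depth, assuming apparent size ~ 1 / distance.
def screen_jitter_threshold(distance_m, real_threshold_cm=5.0,
                            calib_screen_cm=0.1, calib_distance_m=10.0):
    """Return the on-screen movement (in screen centimetres) that
    corresponds to real_threshold_cm of actual palm movement at
    distance_m, using the text's 10 m / 0.1 cm example as calibration."""
    k = calib_screen_cm * calib_distance_m  # screen_cm * metres, constant
    return k / distance_m * (real_threshold_cm / 5.0)
```

With this sketch, a palm at 5 m would be allowed twice the on-screen movement of one at 10 m before its real-world shake exceeds 5 cm.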
- the fifth embodiment of the gesture recognition based car control method is based on the first embodiment of the above-described gesture recognition based car control method according to the present invention.
- In the fifth embodiment, identifying the palm's motion trajectory from the starting and ending positions of the palm
- specifically includes: establishing a coordinate system based on the starting position of the palm, and identifying the motion trajectory by determining the coordinates of the starting and ending positions.
- Specifically, a coordinate system is established with the positive y-axis pointing down and the positive x-axis pointing right, and the differences between the abscissas and the ordinates of the end point and the starting point are computed (end point minus starting point); the palm's trajectory is recognized from these differences. If the ordinate difference is positive and its absolute value is greater than that of the abscissa difference, the palm is recognized as moving up;
- if the ordinate difference is negative and its absolute value is greater than that of the abscissa difference, the palm is recognized as moving down;
- if the abscissa difference is positive and its absolute value is greater than that of the ordinate difference, the palm is recognized as moving right;
- if the abscissa difference is negative and its absolute value is greater than that of the ordinate difference, the palm is recognized as moving left.
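The four threshold rules above translate directly into a function over the start and end coordinates. The sign convention (positive ordinate difference read as "up") follows the text as written; variable names are illustrative:

```python
# Classify the palm trajectory from the coordinate differences between
# end point and start point, per the four rules in the text.
def classify_trajectory(start, end):
    dx = end[0] - start[0]  # abscissa difference (end minus start)
    dy = end[1] - start[1]  # ordinate difference (end minus start)
    if abs(dy) > abs(dx):
        return "up" if dy > 0 else "down"   # sign convention per the text
    if abs(dx) > abs(dy):
        return "right" if dx > 0 else "left"
    return "undetermined"  # equal magnitudes: the text leaves this open
```

The text does not say what happens when the two differences have equal magnitude, so the sketch returns a neutral value in that case.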
- The motion trajectory of the palm is recognized by determining the coordinates of the starting and ending positions. For more accurate identification, once the starting position of the palm is determined, a delay must be observed:
- the end position is recognized only after the first preset time of 0.5 s.
- If the gesture has not finished by the end of the first preset time, the palm's jitter range will not yet be within the first preset value, so the end position cannot be detected; it is detected only in some second preset time after the first preset time.
- Conversely, if the palm's jitter range is already within the first preset value immediately at the end of the first preset time, the end position is recognized at once; this is how a clenching (fist) action is detected. That is, whether the palm gesture is a clench is judged by whether the end position of the palm is recognized immediately at the end of the first preset time.
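The clench test can be sketched as a scan over successive second-preset-time windows after the delay. The per-window bookkeeping (a boolean per window marking whether the palm stayed within the first preset value) is an assumption used for illustration:

```python
# Sketch of the clench rule: the end position is searched window by
# window after the first preset delay. Finding it in the very first
# window means the palm never travelled, which the text reads as a
# fist/clench; finding it later means a directional gesture.
def detect_gesture_kind(windows_stable):
    """windows_stable: list of booleans, one per second-preset-time
    window after the delay, True where the palm's jitter stayed
    within the first preset value."""
    for i, stable in enumerate(windows_stable):
        if stable:
            return "clench" if i == 0 else "directional"
    return "no_gesture"  # palm never settled: no end position found
```

A "directional" result would then be refined into left/right/up/down from the start and end coordinates, as described above.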
- The invention also provides a car control device based on gesture recognition.
- FIG. 2 is a schematic diagram of functional modules of a first embodiment of a vehicle control device based on gesture recognition according to the present invention.
- the gesture recognition based car control device includes:
- determining module 1, configured to determine a starting position and an ending position of the palm;
- judging module 2, configured to identify the motion trajectory of the palm from the starting and ending positions of the palm, and to determine the gesture signal of the palm according to that trajectory;
- command generating module 3, configured to retrieve the gesture control command corresponding to the gesture signal;
- control module 4, configured to control the movement of the car according to the gesture control command.
- The present invention recognizes a gesture with an image sensor such as the Kinect. Unlike conventional methods that analyze the full palm image for gesture recognition, the present invention only needs to recognize the start and end positions of the palm; from these two positions the directional gesture signal of the palm can be determined.
- Specifically, the starting point of the palm is recognized from the palm image and taken as the origin of a plane rectangular coordinate system; the position of the ending point in this coordinate system is then determined,
- and, combined with the origin, yields the current trajectory of the palm, from which the basic gesture is derived: palm moving left, right, up, or down, or clenched into a fist.
- The car controller retrieves from memory the gesture control command corresponding to the gesture signal. For example, the five gesture signals (palm moving left, right, up, or down, or clenching into a fist) correspond to the car's five motions (forward, backward, left turn, right turn, and stop), and the mapping is easily obtained with a memory lookup table.
- After obtaining the gesture control command from the gesture signal, the car controller drives the car to perform the corresponding action according to the command.
- Specifically, the two wheels of the car are each driven by their own motor, and the car can be made to perform the corresponding action by controlling the direction of rotation of the two motors.
- The gesture command thus controls the movement of the car: palm gestures are recognized in a simple and accurate way to steer the car, solving the problem of reduced recognition rate caused by interference from ambient lighting, background, and the like in existing dynamic gesture detection.
- Moreover, by judging the jitter range based on the gray value, the present invention can realize continuous gesture recognition at distances of 2 to 20 meters.
- the second embodiment of the gesture recognition-based car control device is based on the first embodiment of the above-described gesture recognition-based car control device according to the present invention.
- The determining module is specifically configured to determine the position of the palm from the palm joint point, and to determine whether the palm's jitter range stays within a preset value within a preset time; when the jitter range is within the preset value, the gesture is recognized as valid, and the center point of the region where the palm is located within the preset time is identified as the starting position of the palm.
- Determining module 1 identifies the joint point of the palm (this can be done with a ready-made library function) and determines the position of the palm (a single point) from that joint point. Before the starting position of the palm is accepted, a validity check, that is, de-jitter processing, is performed: it is determined whether the palm's jitter range stays within a preset value within a preset time; if so, the gesture is recognized as valid.
- For example, the jitter range must not exceed a preset value of 5 cm within a preset time of 0.5 seconds; if it does not, the gesture is recognized as valid and the center point of the region where the palm is located within the preset time is identified as the palm's starting position.
- the third embodiment of the gesture recognition-based car control device is based on the first embodiment of the above-described gesture recognition-based car control device according to the present invention.
- The determining module is specifically configured to delay a first preset time after the initial position of the palm is recognized; to determine, within a second preset time after the first preset time has elapsed, whether the first jitter range of the palm is within a first preset value; and, when it is, to identify the center point of the region where the palm is located within the second preset time as the end position of the palm.
- The third embodiment differs from the second in that, after the initial position of the palm is recognized, a first preset time must elapse during which no recognition processing is performed. After the first preset time, it is determined over a second preset time whether the first jitter range of the palm is within the first preset value; if so, the gesture is recognized as valid. For example, after the initial position is recognized, a first preset time of 0.5 seconds is allowed to pass; then, within a second preset time of 0.5 seconds, the first jitter range of the palm is checked against a first preset
- value of 5 cm; if the jitter does not exceed it, the center point of the region where the palm is located within the 0.5-second second preset time is identified as the end position of the palm.
- The purpose of the first preset delay is that, once the starting position of the palm has been recognized, the subsequent gesture is still in continuous motion, so whether the gesture has ended cannot be judged immediately; a suitable delay,
- such as 0.5 seconds, is chosen according to the gesture speed of different users. After 0.5 seconds the palm's jitter can be judged, so the end position of the palm can be determined accurately.
- the fourth embodiment of the gesture recognition-based car control device is based on the second or third embodiment of the above-described gesture recognition-based car control device of the present invention.
- In this embodiment, the judgment of palm shake is made by determining the change in the palm's displacement according to the gray value of the palm.
- Depth information about an object can be acquired with the Kinect infrared sensor. When the user's preset motion allows the region where the palm is located to be recognized, the gray value parameter of the image in that region, that is, the depth information, can be further read; this gives the distance between the palm and the Kinect sensor.
- From the depth information of the user's palm position, the approximate distance to the Kinect sensor can be placed in distance intervals such as 2-3 meters, 3-4 meters, ..., 19-20 meters; from that distance, the on-screen movement that corresponds to the palm moving a preset real-world distance, such as 5 cm, can be converted.
- The preset gray value parameter is, in other words, the depth information parameter.
- The depth image of the gesture can be separated out by means of the depth information of the region where the user's palm is located.
- Combining this depth information with the palm's jitter range on screen, the actual jitter range of the palm can be identified. For example, after the palm (a single point) is located via the palm joint, the palm is judged from its depth information to be 10 meters from the lens; if its on-screen movement stays within 0.1 cm over 0.5 s, the actual palm shake can be judged to be within 5 cm, and since this is within the preset value the gesture is recognized as valid.
- the fifth embodiment of the gesture recognition-based car control device is based on the first embodiment of the above-described gesture recognition-based car control device according to the present invention.
- In this embodiment, identifying the palm's motion trajectory from the starting and ending positions of the palm specifically includes:
- a coordinate system is established based on the starting position of the palm, and the motion trajectory of the palm is recognized by determining the coordinates of the starting position and the ending position.
- Specifically, a coordinate system is established with the positive y-axis pointing down and the positive x-axis pointing right, and the differences between the abscissas and the ordinates of the end point and the starting point are computed (end point minus starting point); the palm's trajectory is recognized from these differences. If the ordinate difference is positive and its absolute value is greater than that of the abscissa difference, the palm is recognized as moving up;
- if the ordinate difference is negative and its absolute value is greater than that of the abscissa difference, the palm is recognized as moving down;
- if the abscissa difference is positive and its absolute value is greater than that of the ordinate difference, the palm is recognized as moving right;
- if the abscissa difference is negative and its absolute value is greater than that of the ordinate difference, the palm is recognized as moving left.
- The motion trajectory of the palm is recognized by determining the coordinates of the starting and ending positions. For more accurate identification, once the starting position of the palm is determined, a delay must be observed:
- the end position is recognized only after the first preset time of 0.5 s.
- If the gesture has not finished by the end of the first preset time, the palm's jitter range will not yet be within the first preset value, so the end position cannot be detected; it is detected only in some second preset time after the first preset time.
- Conversely, if the palm's jitter range is already within the first preset value immediately at the end of the first preset time, the end position is recognized at once; this is how a clenching (fist) action is detected. That is, whether the palm gesture is a clench is judged by whether the end position of the palm is recognized immediately at the end of the first preset time.
- The modules of the device can be deployed as follows:
- The car control system detects the image information of the user's palm through the Kinect sensor, can further read the image gray value information at the palm's position, and from this recognizes the starting and ending positions of the palm; determining module 1 is therefore placed in the Kinect sensor. After the Kinect sensor recognizes the starting and ending positions of the palm, it sends the position information to a terminal device such as a PC (the Kinect sensor may be wired to the terminal device), and the terminal device judges the palm gesture from the starting and ending positions.
- Judging module 2 is then placed in the terminal device; alternatively, the Kinect sensor can itself analyze the starting and ending positions, determine the gesture signal directly, and send it to the terminal device, in which case judging module 2 is also placed in the Kinect sensor.
- The device retrieves the gesture control command corresponding to the gesture signal, so command generation module 3 is placed in the terminal device; finally, the terminal device sends the gesture control command to the controller on the car by wireless transmission such as Bluetooth, and the car controller operates the car according to the command, so control module 4 is placed on the car.
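The module layout above can be tied together in a small end-to-end sketch. All class, method, and variable names here are illustrative assumptions; the `sent` list merely stands in for the Bluetooth link to the car's controller:

```python
# End-to-end sketch of the described pipeline: positions from the
# sensor side (module 1), gesture judgment (module 2), command lookup
# (module 3), and transmission to the car (module 4).
class GestureCarPipeline:
    def __init__(self, gesture_to_command):
        self.gesture_to_command = gesture_to_command
        self.sent = []  # stand-in for the wireless link to the car

    def on_positions(self, start, end, classify):
        """Called with a recognized start/end position pair and a
        trajectory classifier; returns the command sent to the car."""
        gesture = classify(start, end)              # module 2
        command = self.gesture_to_command[gesture]  # module 3
        self.sent.append(command)                   # module 4 transport
        return command
```

Whether modules 1 and 2 run on the sensor side or the terminal side, as the text discusses, does not change this data flow; only the process boundary moves.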
Landscapes
- User Interface Of Digital Computer (AREA)
- Image Analysis (AREA)
Abstract
The invention relates to a robot car control method based on gesture recognition, comprising the steps of: determining a starting position and an ending position of a palm, recognizing the palm's movement trajectory from the starting and ending positions, and determining a gesture signal of the palm according to that trajectory (S10); retrieving a gesture control command corresponding to the gesture signal (S20); and controlling the movement of a robot car according to the gesture control command (S30). With this method, a palm gesture is recognized simply and accurately to control the movement of a robot car, solving the low-recognition-rate problem easily caused by interference factors such as lighting and background in existing dynamic gesture detection.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2017/077425 WO2018170713A1 (fr) | 2017-03-21 | 2017-03-21 | Procédé et dispositif de commande de voiture robot basés sur la reconnaissance de geste |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018170713A1 true WO2018170713A1 (fr) | 2018-09-27 |
Family
ID=63584080
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2017/077425 Ceased WO2018170713A1 (fr) | 2017-03-21 | 2017-03-21 | Procédé et dispositif de commande de voiture robot basés sur la reconnaissance de geste |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2018170713A1 (fr) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102193626A (zh) * | 2010-03-15 | 2011-09-21 | 欧姆龙株式会社 | 手势识认装置、手势识认装置的控制方法、以及控制程序 |
| CN103390168A (zh) * | 2013-07-18 | 2013-11-13 | 重庆邮电大学 | 基于Kinect深度信息的智能轮椅动态手势识别方法 |
| JP2014085963A (ja) * | 2012-10-25 | 2014-05-12 | Nec Personal Computers Ltd | 情報処理装置、情報処理方法、及びプログラム |
| CN103869974A (zh) * | 2012-12-18 | 2014-06-18 | 现代自动车株式会社 | 用于手势有效部分检测的系统和方法 |
| CN103941866A (zh) * | 2014-04-08 | 2014-07-23 | 河海大学常州校区 | 一种基于Kinect深度图像的三维手势识别方法 |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112926454A (zh) * | 2021-02-26 | 2021-06-08 | 重庆长安汽车股份有限公司 | 一种动态手势识别方法 |
| CN112926454B (zh) * | 2021-02-26 | 2023-01-06 | 重庆长安汽车股份有限公司 | 一种动态手势识别方法 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| AU2014297039B2 (en) | Auto-cleaning system, cleaning robot and method of controlling the cleaning robot | |
| WO2019024337A1 (fr) | Procédé de reconnaissance de geste dans l'air basé sur l'induction photoélectrique, et appareil de commande associé | |
| WO2016104909A1 (fr) | Commande d'un véhicule | |
| WO2021194254A1 (fr) | Dispositif électronique pour afficher une image à l'aide d'un afficheur côté système de surveillance par caméra (cms) monté dans un véhicule et son procédé de fonctionnement | |
| CN106933343B (zh) | 用于识别虚拟现实头戴装置中的手势的设备和方法 | |
| WO2018088806A1 (fr) | Appareil de traitement d'image et procédé de traitement d'image | |
| US11105141B2 (en) | Control device and program | |
| WO2017008224A1 (fr) | Procédé de détection de distance à un objet mobile, dispositif et aéronef | |
| WO2016192438A1 (fr) | Procédé d'activation de système d'interaction de détection de mouvement, et procédé et système d'interaction de détection de mouvement | |
| JP2018005881A (ja) | 顔検出追跡方法、ロボットヘッドの回動制御方法、及びロボット | |
| WO2020046038A1 (fr) | Robot et procédé de commande associé | |
| WO2016085212A1 (fr) | Dispositif électronique et procédé de commande d'affichage | |
| WO2015100911A1 (fr) | Dispositif médical, et procédé et appareil pour régler la résolution d'une image numérique associée | |
| JP2017510875A (ja) | ジェスチャー装置、その動作方法及びこれを備えた車両 | |
| WO2020145688A1 (fr) | Dispositif électronique et procédé de commande associé | |
| WO2021031334A1 (fr) | Système de climatisation, et procédé de commande de climatiseur et appareil de commande de climatiseur associés | |
| WO2015106503A1 (fr) | Carte mère incorporée pour système de commande de parc de stationnement, et système de commande de parc de stationnement | |
| WO2022035054A1 (fr) | Robot et son procédé de commande | |
| WO2018170713A1 (fr) | Procédé et dispositif de commande de voiture robot basés sur la reconnaissance de geste | |
| CN103472907A (zh) | 操作区的决定方法与系统 | |
| WO2017067289A1 (fr) | Procédé de reconnaissance d'empreintes digitales, appareil et terminal mobile | |
| WO2020204355A1 (fr) | Dispositif électronique et son procédé de commande | |
| WO2019114565A1 (fr) | Procédé et dispositif d'ajustement de climatiseur et support de stockage | |
| WO2015167081A1 (fr) | Procédé et dispositif permettant de détecter une partie d'un corps humain | |
| WO2019242086A1 (fr) | Procédé, système, dispositif et support de stockage interactifs pour un modèle tridimensionnel |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17901760; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 17901760; Country of ref document: EP; Kind code of ref document: A1 |