WO2012156159A1 - Evaluation method for a sequence of temporally successive depth images - Google Patents
Evaluation method for a sequence of temporally successive depth images
- Publication number
- WO2012156159A1 (application PCT/EP2012/056764)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- person
- determined
- evaluation method
- point
- offset vector
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/64—Three-dimensional objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
Definitions
- the present invention relates to an evaluation method for a sequence of temporally successive depth images, each showing at least one person from the front.
- the present invention further relates to a computer program comprising machine code that can be executed directly by a computer and whose execution by the computer causes the computer to carry out such an evaluation method.
- the present invention further relates to a computer that is programmed so that it performs such an evaluation method during operation.
- the present invention further relates to an evaluation device, wherein
- the evaluation device has a camera device for detecting a sequence of temporally successive depth images
- the evaluation device comprises a computer which is connected to the camera device in terms of data technology for receiving and evaluating the images captured by the camera device.
- Gesture control is known as such.
- An overview of the state of the art can be found in the article "Gesture Recognition: A Survey" by Sushmita Mitra and Tinku Acharya, IEEE Transactions on Systems, Man, and Cybernetics, Part C, Applications and Reviews, Vol. 37, No. 3, May 2007, pages 311 to 324.
- Gesture control is realized either without contact (usually by image analysis of camera images) or with the aid of special clothing or markers.
- In the latter case, special gloves or controllers must be worn on the hand or held in the hand.
- The product "Kinect" from Microsoft has also recently been introduced to the market. It provides a sensor plus evaluation software with which a non-contact control can be realized inexpensively. However, the evaluation methods used there initially require a special calibration gesture from the operator. The object of the present invention is to provide means by which a non-contact control by appropriate hand and arm movements is possible in a simple and reliable way. In particular, a prior calibration should not be required.
- the interface extends mirror-symmetrically to a vertical plane containing the offset vector, and each point of the interface is spaced from the base point by at least the length of the offset vector,
- each individual arm region is determined by those points in three-dimensional space which have, to at least one other point of the same individual arm region, a distance smaller than a minimum distance and, to all other points of the common arm region, a distance greater than a maximum distance,
- the minimum distance is approximately equal to a grid dimension of the three-dimensional space and the maximum distance is greater than the grid dimension, and
- the further evaluations of the depth images take place by means of the individual arm regions; in particular, a hand position can be determined for the respective individual arm region based on a statistical evaluation of the respective individual arm region.
- Furthermore, a dimension of that part of the respective individual arm region which lies within a hand region surrounding the respective hand position can be determined, and an action can be activated if the determined dimension exceeds a minimum dimension during a predetermined number of successive depth images.
- a starting point is determined which, in the horizontal direction, lies between the base point and the intersection point. Then, an intersection point of a connecting line determined by the starting point and the hand position of the respective individual arm region with a predetermined working plane extending in front of the person can be determined. Alternatively, it is possible for an intersection of the longest of the main axes of the respective individual arm region with a predetermined working plane extending in front of the person to be determined.
- The image analysis usually gives better results when the depth images are suitably filtered.
- For filtering, it is particularly appropriate that disturbance-reducing morphological operations are performed as part of the determination of the common arm region and/or as part of the determination of the individual arm regions.
- the interface that (in effect) separates the person's torso from their arms may be determined as needed.
- Preferably, the interface is free of inflection points.
- In particular, the interface may be a plane or a surface curved around the base point. It is possible for an evaluation of the pixels of the depth images (including the associated depth information) to take place directly.
- the area of person-like contour is imaged as a three-dimensional point cloud in the three-dimensional space, and that the evaluations of the area of person-like contour take place at least partially in three-dimensional space.
- a regression plane to the person's chest is determined based on the area of person-like contour, and the offset vector is determined such that it is oriented orthogonally to a horizontal line running within the regression plane.
- Alternatively, at least the direction of the offset vector can be fixed in advance. This approach is easier to implement and in many cases leads to sufficiently good results.
- Likewise, the altitude of the base point can be fixed in advance.
- the base point is determined by the center of gravity of the area of person-like contour.
- the evaluation method according to the invention can be optimized in that the depth images and/or the results determined on the basis of the depth images are time-low-pass filtered.
- the object is further achieved by a computer program having the features of claim 19. According to the invention, the processing of the machine code by the computer causes the computer to carry out an evaluation method according to the invention.
- the object is further achieved by a computer having the features of claim 20.
- The computer is programmed such that it carries out an evaluation method according to the invention during operation.
- The object is further achieved by an evaluation device having the features of claim 21.
- an evaluation device of the type mentioned above is equipped with a computer according to the invention.
- A sequence of temporally consecutive images B is captured by means of a camera device 1, for example a CCD camera.
- For the images B, a pattern - for example, a sinusoidally modulated fringe pattern - can be projected by means of a projection device 2 into an image region 3 detected by the camera device 1.
- Due to the pattern projected onto the person 4, the images B contain depth information. For each pixel, it can therefore be determined at what distance the corresponding point of the detected object (for example, the person 4) lies from the camera device 1.
- Such images B are generally referred to as depth images and are widely used in the prior art for measuring three-dimensional objects, for example during material testing.
- the image area 3 can be scanned by means of a laser and the transit time of the laser beam can be determined for each pixel.
- the depth images are fed to a computer 5, which is connected in terms of data technology at least with the camera device 1 - possibly also with the projection device 2.
- the computer 5 receives the depth images, prepares them and evaluates them.
- the computer 5 is programmed with a computer program 6.
- the computer program 6 can be supplied to the computer 5, for example via a data carrier 7, on which the computer program 6 is stored in machine-readable form, generally in an exclusively machine-readable form.
- a USB memory stick is shown in FIGS. 1 and 2 as data carrier 7.
- the computer program 6 comprises machine code 8, which can be processed directly by the computer 5.
- the execution of the machine code 8 by the computer 5 causes the computer 5 to carry out an evaluation method which will be explained in more detail below in conjunction with FIG. 3 and the further figures.
- the computer 5 accepts a sequence of images B in a step S1.
- In a step S2, the computer 5 performs, if necessary, a pre-evaluation of the images B.
- For example, the computer 5 can determine the depth information in the images B in a manner known per se and assign it to the pixels of the images B.
- The resulting images are hereinafter referred to as depth images and, to distinguish them from the originally captured images B, are provided with the reference symbol B'.
- In a step S3, the computer 5 determines in the depth images B' in each case at least one area 9 of person-like contour.
- the calculations required to determine the area 9 of a person-like contour are known to those skilled in the art.
- For example, the area 9 of person-like contour can be determined by detecting a consistent movement in a plurality of depth images B'. In particular, in this case, a comparison of several temporally successive depth images B' takes place in the context of step S3.
- The computer 5 assigns the respectively determined area 9 of person-like contour to the person 4.
- It is known a priori (or assumed) that only a single person 4 resides in the image area 3 of the camera device 1.
- In this case, a single area 9 of person-like contour is determined per depth image B'.
- Otherwise, the computer 5 determines - assuming corresponding capability of the computer program 6 - several areas 9 of person-like contour and assigns each of them to one of the persons 4.
- In the following, only the case is considered in which a single area 9 of person-like contour is determined in each depth image B' and this area 9 of person-like contour is assigned to the (single) person 4.
- Step S3 is generally executed by the computer 5 across depth images. In order to determine the area 9 of person-like contour for a particular depth image B', the computer 5 evaluates not only that particular depth image B'; rather, at least the immediately preceding and/or the immediately subsequent depth image B' is considered as well.
- The further steps S4 to S9 of FIG. 3, however, each relate to one of the depth images B'. In the context of steps S4 to S9, the depth images B' can therefore be considered independently of one another. Of course, the steps S4 to S9 can nevertheless be carried out for several depth images B', in particular for all depth images B'.
- In a step S5, the computer 5 determines a base point 10, 10' of the area 9 of person-like contour in three-dimensional space.
- For example, the computer 5 - see FIG. 4 - can determine the mean value of the location coordinates of the area 9 of person-like contour in three-dimensional space (i.e., the center of gravity 10 of the area 9 of person-like contour) and define it as the base point.
- The corresponding base point is shown in FIG. 2 and provided with the reference numeral 10.
- Alternatively, the computer 5 - see FIG. 5 - can determine in a step S26 the mean values of the location coordinates of the area 9 of person-like contour.
- The computer 5 can then define the base point 10' by using the mean values determined in step S26 as the horizontal coordinates and a value approximately at the breast height of the person 4 as the vertical coordinate.
- For this purpose, the computer 5 can determine the maximum height of the area 9 of person-like contour, i.e., the peak height of the person 4.
- In step S28, the computer 5 can use a suitable percentage - for example 70% to 85% - of this value as the vertical coordinate.
- A base point determined in this way is also shown in FIG. 2 and provided with the reference numeral 10'.
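- For illustration, both variants of the base point determination can be sketched as follows (a minimal Python/NumPy sketch; the vertical z-axis convention, the function name, and the 0.8 height fraction are assumptions, not part of the disclosure):

```python
import numpy as np

def base_point_variants(points, height_fraction=0.8):
    """Sketch of the two base-point variants (FIGS. 4 and 5).

    `points` is an (N, 3) array of the area 9 of person-like contour;
    axis 2 (z) is assumed to be vertical.
    """
    # Variant 1 (FIG. 4): center of gravity of the point cloud.
    centroid = points.mean(axis=0)

    # Variant 2 (FIG. 5, steps S26/S28): horizontal coordinates from the
    # mean values, vertical coordinate as a percentage (70%-85%) of the
    # peak height of the person.
    peak_height = points[:, 2].max()
    base = np.array([centroid[0], centroid[1],
                     height_fraction * peak_height])
    return centroid, base
```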
- In a step S6, the computer 5 determines an offset vector 11, 11'.
- The offset vector 11, 11' starts from the base point 10, 10'. It is directed horizontally and has a component that points forward with respect to the person 4.
- The length of the offset vector 11, 11' is determined such that the offset vector 11, 11', starting from the base point 10, 10', ends in front of the chest of the person 4, but within arm's reach of the person 4.
- FIG. 7 shows-purely by way of example-the regression plane 12 and the corresponding offset vector, which is provided there with the reference numeral 11.
- the determination of the regression plane 12 is known to those skilled in the art.
- Based on the regression plane 12, the computer 5 - see FIG. 6 - further determines, in a step S32, a horizontal line lying within the regression plane 12.
- In this case, the computer 5 determines the offset vector 11 such that it extends horizontally and is directed orthogonally to said horizontal line.
- For this purpose, the computer 5 can, in step S33, form the vector product of the horizontal line and the vertical direction.
- Alternatively, the direction of the offset vector can be fixed in advance, so that it is always directed normal to a (vertical) working plane 13 - for example, orthogonal to a vertically erected display device defining the working plane 13.
- This offset vector is provided in FIG. 7 with the reference numeral 11'.
- the working plane 13 lies in front of the person 4.
- the manner in which the computer 5 determines the offset vector 11, 11' is independent of the manner in which the computer 5 determines the base point 10, 10'.
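- The regression-plane variant of the offset vector determination can be sketched as follows (Python/NumPy; the 0.4 m arm-reach length, the vertical-axis convention, and the sign handling are assumptions):

```python
import numpy as np

def offset_vector(chest_points, length=0.4, up=np.array([0.0, 0.0, 1.0])):
    """Sketch of the offset vector 11 from the regression plane 12."""
    # Regression plane: the right singular vector belonging to the
    # smallest singular value of the centered points is the plane normal.
    centered = chest_points - chest_points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[2]

    # Step S32: horizontal line within the regression plane - orthogonal
    # to both the plane normal and the vertical direction.
    horizontal = np.cross(normal, up)
    horizontal /= np.linalg.norm(horizontal)

    # Step S33: vector product of horizontal line and vertical direction
    # yields a horizontal vector orthogonal to the horizontal line; its
    # sign must be chosen so that it points forward with respect to the
    # person 4 (a convention not fixed in this sketch).
    v = np.cross(horizontal, up)
    v /= np.linalg.norm(v)
    return length * v
```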
- In a step S7, the computer 5 determines an interface 14 of the three-dimensional space, which separates a part of the area 9 of person-like contour - the common arm region - from the base point 10, 10'.
- the interface 14 contains the offset vector 11, 11 '(starting from the base point 10, 10').
- The interface 14 extends mirror-symmetrically to a vertical plane containing the offset vector 11, 11' (starting from the base point 10, 10').
- The location pointed to by the offset vector 11, 11' is that location of the interface 14 which has the smallest distance from the base point 10, 10'. All other locations of the interface 14 have a distance from the base point 10, 10' which is at least as large as - usually even greater than - the length of the offset vector 11, 11'.
- Preferably, the interface 14 is free of inflection points. In particular, the interface 14 can be a plane or a surface curved around the base point 10, 10'.
- For example, the interface 14 can be defined according to FIGS. 1 and 2 as a sphere around a sphere center 15, the sphere center 15 lying behind the person 4.
- Alternatively, the interface 14 can be defined as a cylinder jacket around a vertically running axis. It is also possible that the curvature in the horizontal direction and the curvature in the vertical direction differ from each other. In this case, the curvature in the horizontal direction is in general stronger than the curvature in the vertical direction, that is, the radius of curvature in the horizontal direction is smaller than the radius of curvature in the vertical direction.
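- A minimal sketch of the spherical variant of the interface 14 (Python/NumPy; the names and the way the radius is derived from the tip of the offset vector are assumptions consistent with the description above):

```python
import numpy as np

def common_arm_region(points, base_point, offset, sphere_center):
    """Separate the common arm region from the rest of the area 9 of
    person-like contour using a spherical interface 14 (FIGS. 1 and 2).

    The sphere center 15 lies behind the person 4; the radius is chosen
    so that the tip of the offset vector is the interface location
    closest to the base point.
    """
    tip = base_point + offset
    radius = np.linalg.norm(tip - sphere_center)
    # Points outside the sphere lie in front of the interface 14 and
    # therefore belong to the common arm region.
    dist = np.linalg.norm(points - sphere_center, axis=1)
    return points[dist > radius]
```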
- In a step S8, the computer 5 determines individual arm areas 16, 17 within the common arm area.
- Each individual arm area 16, 17 is determined by those points in three-dimensional space which satisfy the following conditions:
- Each point of a particular individual arm area 16, 17 (for example, the point 18 in FIG. 1) has, to at least one other point of the same individual arm area 16, 17 (for example, the point 19), a distance which is smaller than a minimum distance.
- The same point (in the example, the point 18) has, to all other points of the common arm region (for example, to the point 20), a distance which is greater than a maximum distance.
- the determination of the individual arm regions 16, 17 is known to those skilled in the art as such.
- the three-dimensional space is usually divided into a grid.
- the minimum distance in this case is approximately equal to the grid dimension. In particular, the minimum distance should be at most about twice the grid dimension.
- the maximum distance is greater than the grid dimension and also greater than the minimum distance.
- The ratio of maximum distance to minimum distance typically lies between 1.2 and 2.0, for example at 1.4 to 1.6.
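- The determination of the individual arm areas corresponds to a single-linkage clustering with the minimum distance as connectivity threshold; a minimal sketch (Python with SciPy; the names are illustrative) could look as follows. The maximum-distance condition is satisfied implicitly whenever the individual arm areas are separated by more than the maximum distance:

```python
import numpy as np
from scipy.spatial import cKDTree

def individual_arm_regions(points, min_distance):
    """Split the common arm region into individual arm areas: a point
    belongs to an area if it is closer than `min_distance` to at least
    one other point of the same area (flood fill over neighbors).
    """
    tree = cKDTree(points)
    labels = -np.ones(len(points), dtype=int)  # -1 = not yet assigned
    region = 0
    for seed in range(len(points)):
        if labels[seed] != -1:
            continue
        stack = [seed]
        labels[seed] = region
        while stack:
            idx = stack.pop()
            for nb in tree.query_ball_point(points[idx], r=min_distance):
                if labels[nb] == -1:
                    labels[nb] = region
                    stack.append(nb)
        region += 1
    return [points[labels == r] for r in range(region)]
```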
- It is possible for step S8 to be followed by a step S36 and, individually or in combination, by steps S37 and S38.
- In step S36, the respective number of points contained in the respective individual arm area 16, 17 is determined for all individual arm areas 16, 17.
- In step S37, only those individual arm areas 16, 17 whose number of points exceeds a minimum number of points are defined as valid individual arm areas 16, 17. All other individual arm areas 16, 17 are discarded.
- In step S38, only those individual arm areas 16, 17 which have the largest and the second largest number of points are defined as valid individual arm areas 16, 17. All other individual arm areas 16, 17 are discarded.
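- A sketch of this validity check (plain Python; the minimum number of 50 points is an assumed value):

```python
def valid_arm_regions(regions, min_points=50):
    """Steps S36-S38: keep only plausible individual arm areas."""
    # Steps S36/S37: discard areas with too few points.
    regions = [r for r in regions if len(r) >= min_points]
    # Step S38: keep only the two largest areas (at most one per arm).
    regions.sort(key=len, reverse=True)
    return regions[:2]
```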
- In a step S9, the computer 5 uses the individual arm areas 16, 17 to carry out further evaluations of the depth images B'. Possible evaluations are explained in more detail below in conjunction with FIGS. 9 and 10.
- In a step S10, the computer 5 checks whether it should terminate the evaluation method according to the invention. If this is not the case, the computer 5 proceeds to a step S11, in which it accepts another image B from the camera device 1. The computer 5 then proceeds again to step S2.
- Based on a statistical evaluation of the respective individual arm area 16, 17, the computer 5 first determines the main axes of the respective individual arm area 16, 17. This type of evaluation is known per se to experts.
- In a step S42, the computer 5 selects the longest of the determined main axes, denoted in FIG. 7 for the individual arm area 17 by the reference numeral 21.
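- One common realization of such a statistical evaluation is a principal component analysis of the point coordinates (an assumption; the patent text does not fix the method). A minimal Python/NumPy sketch:

```python
import numpy as np

def longest_main_axis(region):
    """Return the center of gravity of an individual arm area and the
    direction of its longest main axis (eigenvector of the covariance
    matrix belonging to the largest eigenvalue).
    """
    centroid = region.mean(axis=0)
    centered = region - centroid
    cov = centered.T @ centered / len(region)
    eigvals, eigvecs = np.linalg.eigh(cov)
    return centroid, eigvecs[:, np.argmax(eigvals)]
```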
- In a step S43, the computer 5 can determine a straight line based on the center of gravity of the respective individual arm area 16, 17 and the longest main axis 21 of the respective individual arm area 16, 17.
- In a step S44, the computer 5 can determine an intersection 22 of said straight line with the working plane 13. The intersection 22 corresponds to a position of the working plane 13 to which the person 4 (approximately) points.
- Alternatively, in a step S45, the computer 5 can determine a hand position 23 based on the center of gravity of the respective individual arm area 16, 17 and the longest main axis 21 of the respective individual arm area 16, 17 in conjunction with the offset vector 11, 11' (the latter is required to determine the direction, starting from the center of gravity of the respective individual arm area 16, 17).
- In a step S46, a connecting line 24 can be determined that connects the base point 10, 10' and the respective hand position 23 with one another.
- Also in step S46, the point of intersection 25 of the connecting line 24 with the working plane 13 can be determined.
- This intersection point 25 corresponds to a position of the working plane 13 to which the person 4 (approximately) points.
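- Both pointing variants reduce to intersecting a straight line with the working plane 13. A minimal sketch (Python/NumPy; the point-normal plane form is an assumption):

```python
import numpy as np

def line_plane_intersection(p0, direction, plane_point, plane_normal):
    """Intersect the pointing line (e.g., center of gravity + longest
    main axis 21, or base point 10 -> hand position 23) with the working
    plane 13. Returns None if the line is parallel to the plane.
    """
    denom = float(np.dot(plane_normal, direction))
    if abs(denom) < 1e-9:
        return None
    t = float(np.dot(plane_normal, plane_point - p0)) / denom
    return p0 + t * direction
```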
- As an alternative or in addition to steps S44 and S46, further steps S47 to S54 can be provided downstream of step S45.
- In a step S47, the computer 5 determines a hand area 26, i.e., that part of the corresponding individual arm area 16, 17 which surrounds the hand position 23. For example, a ball with a predetermined radius can be defined around the determined hand position 23. In this case, only those points of the respective individual arm area 16, 17 which lie within the respective hand area 26 are further utilized. For example, the computer 5 can determine, in a step S48, a (maximum) dimension of the hand area 26.
- It is possible for the computer 5 to perform the corresponding determination in three-dimensional space.
- Alternatively, the computer 5 can, in a step S49, project the points into an evaluation plane 27 (see FIG. 7) - in particular by means of a parallel projection, alternatively possibly by means of a perspective projection - and perform the corresponding evaluation there.
- The evaluation plane 27 can, for example, extend orthogonally to the offset vector 11, 11' or coincide with the working plane 13.
- In a step S50, the computer 5 compares the determined dimension with a minimum dimension. If the determined dimension is smaller than the predetermined minimum dimension (in plain language: the person 4 has not spread his hand), the computer 5 resets a counter in a step S51. Otherwise, the computer 5 increments the counter in a step S52.
- In a step S53, the computer 5 checks the value of the counter.
- If the value of the counter exceeds a predetermined limit (for example, the value 3, the value 5, the value 10, or another suitable value) - that is, if the determined dimension was greater than the minimum dimension during a number of consecutive depth images B' determined by the limit (in plain language: the person 4 keeps the hand spread) - the computer 5 activates an action in a step S54; for example, it interprets the counter exceeding the limit as a "click command" of a virtual mouse.
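- A minimal sketch of steps S47 to S54 (Python/NumPy; the ball radius, the minimum dimension, and the limit of 5 images are assumed values, and the axis-aligned extent is one simple way to measure the hand dimension):

```python
import numpy as np

def hand_dimension(arm_points, hand_position, radius=0.12):
    """Steps S47/S48: points of the individual arm area within a ball
    around the hand position 23 form the hand area 26; return its
    (maximum) axis-aligned extent.
    """
    mask = np.linalg.norm(arm_points - hand_position, axis=1) < radius
    hand = arm_points[mask]
    if len(hand) < 2:
        return 0.0
    return float(np.ptp(hand, axis=0).max())

class ClickDetector:
    """Steps S50-S54: activate an action once the hand dimension exceeds
    the minimum dimension for `limit` consecutive depth images B'.
    """
    def __init__(self, min_dimension=0.15, limit=5):
        self.min_dimension = min_dimension
        self.limit = limit
        self.counter = 0

    def update(self, dimension):
        if dimension < self.min_dimension:
            self.counter = 0      # step S51: hand not spread
        else:
            self.counter += 1     # step S52
        return self.counter > self.limit  # steps S53/S54: "click"
```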
- steps S61 and S62 can be arranged downstream of step S43.
- In step S61, based on the center of gravity of the respective individual arm area 16, 17, the point of intersection 28 of the longest main axis 21 with a base plane 29 is determined (see FIG. 7).
- the base plane 29 contains the base point 10, 10' and extends orthogonally to the offset vector 11, 11'.
- In step S62, depending on whether the point of intersection 28 lies to the left or to the right of the base point 10, 10', the respective individual arm area 16, 17 is assigned to the left or the right arm of the person 4. In this way, for example, a "click" with the left hand in the context of step S54 can trigger a different action than a "click" with the right hand. If the corresponding assignment of the individual arm areas 16, 17 to the corresponding arms of the person 4 takes place, steps S63 and S64 can further be present according to FIG. 10 - as an alternative to steps S44 or S46 of FIG. 9.
- In step S63, the computer 5 determines a starting point 30 in the base plane 29 on the basis of the base point 10, 10' and the point of intersection 28.
- The starting point 30 is determined in such a way that, seen in the horizontal direction, it lies within the base plane 29 between the base point 10, 10' and the intersection point 28. An offset of the starting point 30 relative to the base point 10, 10' is preferably determined such that the starting point 30 lies approximately in the region of the corresponding shoulder joint of the person 4.
- In step S64 - similarly to step S46 of FIG. 9 - a connecting line 31 can be determined, connecting the starting point 30 and the respective hand position 23 with each other. Then - also in step S64 - the point of intersection 32 of the connecting line 31 with the working plane 13 can be determined.
- This intersection point 32 corresponds to a position of the working plane 13 to which the person 4 (approximately) points.
- If steps S63 and S64 are present, steps S44 and S46 of FIG. 9 are preferably not implemented.
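- A minimal sketch of steps S62 and S63 (Python/NumPy; the right-pointing axis convention and the interpolation fraction are assumptions):

```python
import numpy as np

def assign_arm_side(intersection_28, base_point, right_axis):
    """Step S62: assign the individual arm area to the left or right arm,
    depending on which side of the base point 10, 10' the intersection 28
    lies. `right_axis` is a unit vector pointing to the person's right
    within the base plane 29 (sign convention assumed).
    """
    side = float(np.dot(intersection_28 - base_point, right_axis))
    return "right" if side > 0 else "left"

def starting_point(base_point, intersection_28, fraction=0.5):
    """Step S63: a starting point 30 between the base point 10, 10' and
    the intersection 28, placed so that it lies roughly at the shoulder
    joint; the fraction is an assumed value.
    """
    return base_point + fraction * (intersection_28 - base_point)
```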
- The above-described evaluation method and its embodiments can be further improved in different ways.
- In particular, noise-reducing approaches, which make the evaluation more robust overall, have proven advantageous.
- For example, the depth images B' and/or the results determined - directly or indirectly - on the basis of the depth images B' (for example, the individual arm areas 16, 17) can be time-low-pass filtered for this purpose.
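- A simple exponential moving average is one possible temporal low-pass filter (a sketch; the smoothing factor is an assumed value):

```python
class TemporalLowPass:
    """Exponential moving average over successive depth images B' or
    results derived from them (e.g., hand positions)."""
    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.state = None

    def update(self, value):
        if self.state is None:
            self.state = value
        else:
            self.state = self.alpha * value + (1.0 - self.alpha) * self.state
        return self.state
```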
- Likewise, disturbance-reducing morphological operations can be carried out within the depth images B' as part of determining the common arm region and/or as part of determining the individual arm areas 16, 17.
- Corresponding morphological operations are well known to those skilled in the art.
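- For example (a sketch assuming the regions are represented as binary occupancy masks on the grid; SciPy's ndimage module is used):

```python
import numpy as np
from scipy import ndimage

# Illustrative voxel mask of the common arm region on the grid.
mask = np.zeros((64, 64, 64), dtype=bool)

# Opening (erosion followed by dilation) removes isolated noise points;
# closing (dilation followed by erosion) fills small holes.
cleaned = ndimage.binary_opening(mask)
filled = ndimage.binary_closing(cleaned)
```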
- The inverse procedure is also possible: first the second of the two steps is executed, then the first step is carried out.
- The corresponding operations are also applicable to the individual arm areas 16, 17. Other morphological operations, likewise known to those skilled in the art, are possible as well.
- Preferably, the depth images B' are converted into three-dimensional space. Thus - possibly before the determination of the area 9 of person-like contour, possibly also after its determination - the respective depth image B' or the area 9 of person-like contour is mapped as a three-dimensional point cloud in three-dimensional space.
- This mapping is readily possible with known position and orientation of the camera device 1 and known imaging parameters of the camera device 1.
- The subsequent evaluation can in this case take place, as required, in the depth images B' or directly in three-dimensional space.
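- Assuming a calibrated pinhole camera model (focal lengths and principal point as the known imaging parameters), the conversion can be sketched as follows:

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Map a depth image B' to a three-dimensional point cloud in the
    camera coordinate system; `depth` holds the distance per pixel,
    (fx, fy) the focal lengths and (cx, cy) the principal point.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)
```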
- Preferably, at least the activation of actions is subject to hysteresis.
- the present invention has many advantages. For example, in contrast to known evaluation methods, no special devices or markings on the person 4 are required. Furthermore, no training phase of the computer 5 is required.
- the evaluation method according to the invention therefore enables a simple, yet reliable and accurate evaluation of the area 9 of a person-like contour.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Analysis (AREA)
Abstract
A sequence of depth images (B') each shows at least one person (4) from the front. In the depth images (B'), at least one area (9) of person-like contour is determined in each case and assigned to the person. A base point (10, 10') of the area (9) of person-like contour is determined in space, and an offset vector (11, 11') is determined. The offset vector (11, 11'), starting from the base point (10, 10'), has a component directed forward with respect to the person (4). An interface (14) of the space separates a part of the area (9) of person-like contour - the common arm region - from the base point (10, 10'). The interface (14) contains the offset vector (11, 11') and extends mirror-symmetrically to a vertical plane containing the offset vector (11, 11'). Each point of the interface (14) is spaced from the base point (10, 10') by at least the length of the offset vector (11, 11'). Individual arm regions (16, 17) are determined within the common arm region. These are each determined by points (18) in three-dimensional space which have, to at least one other point (19) of the same individual arm region (16, 17), a distance smaller than a minimum distance and, to all other points (20) of the common arm region, a distance greater than a maximum distance. The minimum distance is approximately equal to a grid dimension of the space. The maximum distance is greater than the grid dimension. Further evaluations of the depth images are carried out by means of the individual arm regions (16, 17).
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE102011075877A DE102011075877A1 (de) | 2011-05-16 | 2011-05-16 | Auswertungsverfahren für eine Folge von zeitlich aufeinander folgenden Tiefenbildern |
| DE102011075877.1 | 2011-05-16 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2012156159A1 (fr) | 2012-11-22 |
Family
ID=45976392
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2012/056764 Ceased WO2012156159A1 (fr) | 2012-04-13 | Evaluation method for a sequence of temporally successive depth images |
Country Status (2)
| Country | Link |
|---|---|
| DE (1) | DE102011075877A1 (fr) |
| WO (1) | WO2012156159A1 (fr) |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE102013000080B4 (de) * | 2013-01-08 | 2015-08-27 | Audi Ag | Aktivierung einer Kraftfahrzeugfunktion mittels eines optischen Sensors |
| CN104765440B (zh) * | 2014-01-02 | 2017-08-11 | 株式会社理光 | 手检测方法和设备 |
2011
- 2011-05-16 DE DE102011075877A patent/DE102011075877A1/de not_active Withdrawn
2012
- 2012-04-13 WO PCT/EP2012/056764 patent/WO2012156159A1/fr not_active Ceased
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2000030023A1 (fr) * | 1998-11-17 | 2000-05-25 | Holoplex, Inc. | Vision stereoscopique destinee a la reconnaissance de gestes |
| WO2002007839A2 (fr) * | 2000-07-24 | 2002-01-31 | Jestertek, Inc. | Systeme de controle d'images video |
| WO2009035705A1 (fr) * | 2007-09-14 | 2009-03-19 | Reactrix Systems, Inc. | Traitement d'interactions d'utilisateur basées sur des gestes |
Non-Patent Citations (4)
| Title |
|---|
| "Arm gesture recognition for human-computer interaction", 1 June 2010, UNIVERSITÉ DE LIÉGE, Liége, article JULIEN GHAYE: "Arm gesture recognition for human-computer interaction", XP055034954 * |
| SUSHMITA MITRA; TINKU ACHARYA: "Gesture Recognition: A Survey", IEEE TRANSACTIONS ON SYSTEMS, MAN, AND CYBERNETICS, PART C, APPLICATIONS AND REVIEWS, vol. 37, no. 3, May 2007 (2007-05-01), pages 311 - 324, XP011176904, DOI: doi:10.1109/TSMCC.2007.893280 |
| TAYLOR J L ET AL: "Pointing", BEHAVIOURAL BRAIN RESEARCH, ELSEVIER, AMSTERDAM, NL, vol. 29, no. 1-2, 1 July 1988 (1988-07-01), pages 1 - 5, XP024333880, ISSN: 0166-4328, [retrieved on 19880701], DOI: 10.1016/0166-4328(88)90046-0 * |
| THOMAS B. MOESLUND ET AL: "A Natural Interface to a Virtual Environment through Computer Vision-estimated Pointing Gestures", 4TH INTERNATIONAL WORKSHOP ON GESTURE AND SIGN LANGUAGE BASED HUMAN-COMPUTER INTERACTION, 18 April 2001 (2001-04-18), London, XP055034737, Retrieved from the Internet <URL:http://www.cvmt.dk/~tbm/Publications/gw2001.pdf> [retrieved on 20120806] * |
Also Published As
| Publication number | Publication date |
|---|---|
| DE102011075877A1 (de) | 2012-11-22 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12715091; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 12715091; Country of ref document: EP; Kind code of ref document: A1 |