
WO2014117895A1 - Method and device for controlling a workshop apparatus - Google Patents

Method and device for controlling a workshop apparatus

Info

Publication number
WO2014117895A1
WO2014117895A1 (PCT/EP2013/076211)
Authority
WO
WIPO (PCT)
Prior art keywords
workshop
user
body part
stored
pattern
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2013/076211
Other languages
German (de)
English (en)
Inventor
Andreas Korthauer
Christoph Noack
Hans-Udo Lebherz
Laura Heinrich-Litan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of WO2014117895A1
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25FCOMBINATION OR MULTI-PURPOSE TOOLS NOT OTHERWISE PROVIDED FOR; DETAILS OR COMPONENTS OF PORTABLE POWER-DRIVEN TOOLS NOT PARTICULARLY RELATED TO THE OPERATIONS PERFORMED AND NOT OTHERWISE PROVIDED FOR
    • B25F5/00Details or components of portable power-driven tools not particularly related to the operations performed and not otherwise provided for
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Definitions

  • The invention relates to a method and a device for controlling a workshop device.
  • For the diagnosis or the support of repairs of vehicles or vehicle systems, a "workshop trolley" is used, among other things, when the workshop employee needs sufficient space for the repair of the vehicle; otherwise, mainly stationary diagnostic systems are used.
  • US 2009/0210110 A1 discloses the control of a workshop device via a touch screen.
  • An object of the invention is to improve the control of a workshop device, especially in cases where no direct contact between the workshop device and the user is possible, and to make it more user-friendly.
  • A method according to the invention for controlling a workshop device is characterized in that it comprises the steps of: contactlessly detecting at least one position of at least one body part of a user of the workshop device; comparing the detected position with stored position patterns; determining the stored position pattern having the greatest match with the detected position; and executing an action linked to the position pattern thus determined.
  • The position of the at least one body part can also be detected repeatedly in order to detect a movement of the body part in space (dynamic gesture).
  • The workshop apparatus can then be controlled not only by predetermined positions of the body part (static gestures) but also by movements of one or more body parts (dynamic gestures). The possibilities of controlling the workshop device are extended in this way.
  • The method preferably also includes determining the likelihood that the user willfully performed the movement pattern most closely matching the sensed movement pattern, in order to avoid accidental misoperation of the workshop apparatus. This can be done, for example, by requiring a predetermined minimum degree of match.
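The threshold test described above can be sketched in Python as follows. This is a hypothetical illustration, not the patent's implementation: the similarity measure, the 0.85 threshold, and all names are invented for the example, and the detected and stored trajectories are assumed to have equal length.

```python
import math

def match_gesture(detected, stored_patterns, threshold=0.85):
    """Return the action of the best-matching stored pattern, or None.

    `detected` and each stored trajectory are equal-length lists of
    (x, y) positions sampled over time; similarity is defined here as
    1 / (1 + mean point distance). All names and the threshold value
    are illustrative choices, not taken from the patent.
    """
    def similarity(a, b):
        dist = sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)
        return 1.0 / (1.0 + dist)

    best_name, best_score = None, 0.0
    for name, pattern in stored_patterns.items():
        score = similarity(detected, pattern["trajectory"])
        if score > best_score:
            best_name, best_score = name, score

    # Reject matches below the predetermined minimum degree of match,
    # so an accidental movement does not trigger an action.
    if best_score < threshold:
        return None  # the caller may ask the user to repeat the gesture
    return stored_patterns[best_name]["action"]
```

A rejected match (return value `None`) corresponds to the case where the user is asked to repeat his movement.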
  • A control device for a workshop device, which is intended in particular for use in a motor vehicle workshop, has a memory device in which a number of movement patterns of at least one body part are stored, each movement pattern being linked with at least one action of the workshop device; a receiving device which is designed to contactlessly detect the position of at least one body part of a user of the workshop device in space; and an evaluation device which is designed to carry out a method for controlling a workshop device according to one of the preceding claims.
  • a control device determines the position and movement of at least one body part (body gestures) of the user and derives therefrom control commands, which are entered into the workshop device.
  • Feedback and responses from the workshop equipment may be transmitted via an optical or acoustic output device, such as a screen, signal lamps and/or a loudspeaker.
  • Gesture control has, among other things, the following advantages: since no direct contact between the user and the workshop diagnostic system is required, the user can make inputs even at a larger physical distance from the system, so that the walk from the vehicle to the workshop diagnostic system and back can be saved. Repair tools do not have to be put down during operation. The controller must therefore be designed to be fault-tolerant of such influences; for example, the user's hand must also be recognized when it is holding a repair tool, or the control must be performed by another body part, e.g. the head. As a result, the repair time and the associated costs can be reduced.
  • The method includes the additional step of indicating, with an optical and/or acoustic acknowledgment signal, that the stored motion pattern having the greatest match with the detected motion pattern has been successfully determined.
  • the workshop apparatus may for this purpose have an additional output device, which is in particular designed to output an acoustic and / or optical confirmation signal when a movement pattern stored in the memory device has been detected.
  • The user-friendliness is increased because the user is immediately notified when his gesture has been recognized, so that he can do without an unnecessary repetition of the gesture.
  • the user may be asked to repeat his movement if it could not be assigned with sufficient certainty to a previously stored gesture.
  • The method additionally includes identifying the user. As a further function, after an identification of the user, control of the workshop device can be restricted to appropriately authorized users.
  • the body part whose movement is detected is a finger, a hand, an arm and / or a head of the user.
  • The head can be used for gestures even if the hands are occupied with workshop work.
  • the head and especially the face are particularly well suited to identify the user.
  • The stored motion patterns include motion patterns associated with at least one of the actions Next/Next Step, Previous/Previous Step, Repeat, Yes, No, Abort/End. In conjunction with a suitable menu structure, such movement patterns allow a convenient and flexible operation of the workshop device.
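The action set above can be pictured as a simple gesture-to-action dispatch table. The following Python sketch is purely illustrative: the gesture names, the handler logic, and the menu structure are invented for the example and are not specified by the patent.

```python
# Hypothetical mapping from recognized gestures to the menu actions
# named in the description (Next, Previous, Repeat, Yes, No, Abort).
GESTURE_ACTIONS = {
    "swipe_left_to_right": "next_step",
    "swipe_right_to_left": "previous_step",
    "circle":              "repeat",
    "thumb_up":            "yes",
    "thumb_down":          "no",
    "crossed_hands":       "abort",
}

def dispatch(gesture, menu):
    """Apply the action linked to `gesture` to a simple menu cursor."""
    action = GESTURE_ACTIONS.get(gesture)
    if action == "next_step":
        menu["step"] += 1
    elif action == "previous_step":
        menu["step"] = max(0, menu["step"] - 1)
    elif action == "abort":
        menu["active"] = False
    return menu
```

Unknown gestures simply leave the menu state unchanged, which matches the idea that only stored patterns are linked to actions.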
  • the method additionally includes recording or storing a (new) movement pattern and linking it with an action.
  • The workshop device can thus be adapted well to the individual needs of the user, so that the user-friendliness can be further increased.
  • The control device has an additional input device, in particular a touch screen and/or a voice input device, in order to provide a further input option in addition to the input via gestures according to the invention, so that the user can select the most suitable input option in each individual case.
  • the user-friendliness of the workshop device is thereby further increased.
  • the recording device may comprise at least one optical sensor (camera), in particular a stereo camera, at least one ultrasonic sensor, at least one infrared sensor or at least one radar sensor or any combination of such sensors.
  • each sensor is designed to detect at least one body part of the user without contact.
  • a camera in particular a stereo camera, allows a simple optical detection of the movement of the body part of the user.
  • An ultrasonic sensor, an infrared sensor and/or a radar sensor are advantageous in particular in poor light conditions, which impede good optical detection.
  • An optical detection of the user can be performed, for example, with a camera system with great depth of field, such as a wide-angle lens ("fisheye"), in order to avoid the need for focusing.
  • Alternatively, vario-zoom autofocus cameras known from security technology can be used.
  • The camera can be equipped with a motion control to track the user as he moves across the workstation, so that the user is always in the camera's field of vision and can give gesture commands to the workshop equipment.
  • the camera can be arranged above the workplace (workshop space) so that it always has the entire workplace in view.
  • The camera for recording the gesture can be used both on the workshop trolley (in particular in engine and chassis diagnosis) and on a "tablet PC" (in particular in control unit diagnosis).
  • The recording device may also comprise a lighting device, an ultrasound transmitter, an infrared transmitter and/or a radar transmitter, which are each designed to irradiate at least one body part of the user with light, ultrasound, infrared or radar radiation, in order to enable a good non-contact detection of the user or his body part with a corresponding sensor.
  • a device according to the invention can also be used for estimating the attention of the user by determining the viewing direction of the user to a display device of the workshop diagnostic device.
  • the display and visualization of temporary and / or particularly important information can thus be improved. For example, important temporary information may be displayed until the user turns his or her gaze to the monitor and therefore informational acquisition by the user is likely.
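The gaze-direction check described above could, under strong simplifying assumptions, look like the following sketch. The 2-D geometry, the 15° viewing cone, and all names are illustrative choices, not taken from the patent.

```python
import math

def is_looking_at_display(head_pos, gaze_dir, display_pos, max_angle_deg=15.0):
    """Rough attention estimate: is the gaze vector within a small cone
    of the direction from the user's head to the display?

    All geometry is 2-D for simplicity; the 15-degree cone is an
    illustrative value, not specified by the patent.
    """
    to_display = (display_pos[0] - head_pos[0], display_pos[1] - head_pos[1])
    dot = gaze_dir[0] * to_display[0] + gaze_dir[1] * to_display[1]
    norm = math.hypot(*gaze_dir) * math.hypot(*to_display)
    # Clamp to [-1, 1] to guard against floating-point rounding in acos.
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= max_angle_deg
```

Important temporary information could then be kept on screen until this check returns true, i.e. until information uptake by the user becomes likely.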
  • Fig. 1 shows schematically the structure of a control device according to the invention.
  • Figures 2a and 2b show a first and a second gesture.
  • Figure 3 shows a third gesture.
  • FIG. 1 shows schematically the structure of a control device according to the invention
  • the control device 1 has a receiving device 6 with a camera 6a and a lighting device 7, which is designed to take pictures of the user 2 and in particular of one of his body parts 2a, 2b.
  • Camera 6a may in particular be designed as a stereo camera to record three-dimensional images of the user 2 or one of his body parts 2a, 2b.
  • Alternatively or additionally, the recording device 6 may also comprise an infrared, ultrasonic or radar sensor.
  • the recording device 6 is coupled to an evaluation device 8 in such a way that it transmits the image data taken by it to the evaluation device 8 during operation.
  • the evaluation device 8 identifies in the image data transmitted by the recording device 6 the user 2 and in particular the positions of at least one selected body part 2a, 2b of the user 2 and tracks the movement of the at least one selected body part 2a, 2b over time.
  • The movement of the selected body part 2a, 2b is compared with movement patterns (gestures) stored in a gesture memory 4 of the control device 1. That movement pattern (gesture) is determined which has the best match with the movement of the at least one body part 2a, 2b of the user 2 recorded by the receiving device 6.
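Before a recorded movement can be compared with the gesture memory, it typically has to be brought into a comparable form. The patent does not specify this preprocessing, so the following normalization (fixed-length resampling plus unit-box scaling) is only one plausible sketch with invented names.

```python
def normalize_trajectory(points, n_samples=16):
    """Resample a recorded movement to a fixed number of points and
    scale it into a unit bounding box, so that gestures performed at
    different speeds and sizes can be compared against stored patterns.

    A simplified preprocessing step for illustration only.
    """
    # Resample by linear interpolation along the point index.
    resampled = []
    for i in range(n_samples):
        t = i * (len(points) - 1) / (n_samples - 1)
        j = int(t)
        frac = t - j
        if j + 1 < len(points):
            x = points[j][0] * (1 - frac) + points[j + 1][0] * frac
            y = points[j][1] * (1 - frac) + points[j + 1][1] * frac
        else:
            x, y = points[-1]
        resampled.append((x, y))

    # Scale into a unit bounding box (guarding against zero extent).
    xs = [p[0] for p in resampled]
    ys = [p[1] for p in resampled]
    x0, y0 = min(xs), min(ys)
    w = max(xs) - x0 or 1.0
    h = max(ys) - y0 or 1.0
    return [((x - x0) / w, (y - y0) / h) for x, y in resampled]
```

Two trajectories normalized this way have equal length, which is what a point-by-point distance comparison against the stored patterns requires.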
  • At least one action of the workshop device 14 is associated with each of the gestures stored in the gesture memory 4.
  • an acknowledgment signal can be sent to the user 2 via a corresponding output device 10 which informs the user 2 that his movement has been recognized as a stored gesture.
  • the acknowledgment signal may be an optical acknowledgment signal, in particular a light signal or a display on a screen, or an acoustic acknowledgment signal.
  • The control device 1 additionally has an input device 12, for example a keyboard, a touch screen, a mouse or a voice input device, which allows the user to input commands not only by gestures but also in a conventional manner. Users are thus enabled to select the most appropriate type of command input in each situation, so that the ease of use of the workshop device 14 is further increased.
  • The open palm of a hand 16 of the user 2 faces the receiving device 6 and the hand 16 is moved from left to right (Fig. 2a) or from right to left (Fig. 2b).
  • The next (Fig. 2a) or the previous step (Fig. 2b) can be selected in a method or screen menu.
  • The open hand 16 of the user 2 is moved in a circular manner, for example to trigger a repetition of the measurement. If necessary, a distinction can be made here between a clockwise movement and a counterclockwise movement.
  • The hand 16 of the user 2 is clenched into a fist, with the thumb pointing upwards (Fig. 4a) or downwards (Fig. 4b).
  • Such gestures may confirm (Fig. 4a) or abort (Fig. 4b) an action.
  • Further gestures, such as crossed hands 16 for aborting the action, are possible.
  • The control device 1 can also be programmable, so that gestures of the user 2 can be recorded with the receiving device 6 and assigned to arbitrary actions of the workshop device 14 via a corresponding input on the input device 12.
  • The control device can be adapted in this way, in particular, to the individual needs of the user 2, to allow a particularly convenient operation of the workshop device 14.
  • The control device 1 may also include the function of recognizing and identifying the user 2 on the basis of his head 2a, in particular on the basis of his face, or on the basis of at least one of his gestures, and of allowing control of the workshop device 14 only after the user 2 has been successfully identified and has the appropriate authorization. In this way, safety in the workshop is increased, because operation of the workshop device 14 by unqualified personnel is reliably prevented.
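The identification-before-control behaviour can be sketched as a simple authorization gate. The user IDs, the identification mechanism, and all names below are hypothetical; the patent only specifies that control is enabled after successful identification and authorization.

```python
# Hypothetical set of operators allowed to control the workshop device.
AUTHORIZED_USERS = {"mechanic_01", "mechanic_02"}

class WorkshopController:
    """Executes gesture commands only for identified, authorized users."""

    def __init__(self):
        self.identified_user = None

    def identify(self, user_id):
        """Called by the face/gesture identification stage."""
        self.identified_user = user_id

    def execute(self, action):
        # Unidentified or unauthorized users are blocked, so operation
        # by unqualified personnel is reliably prevented.
        if self.identified_user not in AUTHORIZED_USERS:
            return "refused"
        return f"executing {action}"
```

A real system would pair this gate with the acknowledgment signals described earlier, e.g. refusing with an acoustic warning.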

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Method for controlling a workshop apparatus (14), comprising the following steps: contactless determination of at least one position of at least one body part (2a, 2b, 16, 18) of a user (2) of the workshop apparatus; comparison of the detected position with stored position patterns; determination of the stored position pattern having the greatest match with the detected position; and execution of an action linked to the position pattern thus determined.
PCT/EP2013/076211 2013-01-29 2013-12-11 Procédé et dispositif pour commander un appareil d'atelier Ceased WO2014117895A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE201310201359 DE102013201359A1 (de) 2013-01-29 2013-01-29 Verfahren und Vorrichtung zum Steuern eines Werkstattgeräts
DE102013201359.0 2013-01-29

Publications (1)

Publication Number Publication Date
WO2014117895A1 true WO2014117895A1 (fr) 2014-08-07

Family

ID=49759300

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2013/076211 Ceased WO2014117895A1 (fr) 2013-01-29 2013-12-11 Procédé et dispositif pour commander un appareil d'atelier

Country Status (2)

Country Link
DE (1) DE102013201359A1 (fr)
WO (1) WO2014117895A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3037917A1 (fr) * 2014-12-24 2016-06-29 Nokia Technologies OY Surveillance
CN110650821A (zh) * 2017-03-30 2020-01-03 罗伯特·博世有限公司 工具机器

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9575560B2 (en) 2014-06-03 2017-02-21 Google Inc. Radar-based gesture-recognition through a wearable device
US9921660B2 (en) 2014-08-07 2018-03-20 Google Llc Radar-based gesture recognition
US9811164B2 (en) 2014-08-07 2017-11-07 Google Inc. Radar-based gesture sensing and data transmission
US9588625B2 (en) 2014-08-15 2017-03-07 Google Inc. Interactive textiles
US10268321B2 (en) 2014-08-15 2019-04-23 Google Llc Interactive textiles within hard objects
US9778749B2 (en) 2014-08-22 2017-10-03 Google Inc. Occluded gesture recognition
US11169988B2 (en) 2014-08-22 2021-11-09 Google Llc Radar recognition-aided search
US9600080B2 (en) * 2014-10-02 2017-03-21 Google Inc. Non-line-of-sight radar-based gesture recognition
US10016162B1 (en) 2015-03-23 2018-07-10 Google Llc In-ear health monitoring
US9983747B2 (en) 2015-03-26 2018-05-29 Google Llc Two-layer interactive textiles
WO2016176574A1 (fr) 2015-04-30 2016-11-03 Google Inc. Reconnaissance de gestes fondée sur un radar à champ large
KR102229658B1 (ko) 2015-04-30 2021-03-17 구글 엘엘씨 타입-애그노스틱 rf 신호 표현들
CN107430444B (zh) 2015-04-30 2020-03-03 谷歌有限责任公司 用于手势跟踪和识别的基于rf的微运动跟踪
US9693592B2 (en) 2015-05-27 2017-07-04 Google Inc. Attaching electronic components to interactive textiles
US10088908B1 (en) 2015-05-27 2018-10-02 Google Llc Gesture detection and interactions
DE102015212028A1 (de) * 2015-06-29 2016-12-29 Robert Bosch Gmbh Bedienvorrichtung für eine Handwerkzeugmaschine
US10817065B1 (en) 2015-10-06 2020-10-27 Google Llc Gesture recognition using multiple antenna
EP3371855A1 (fr) 2015-11-04 2018-09-12 Google LLC Connecteurs pour connecter des éléments électroniques incorporés dans des vêtements à des dispositifs externes
WO2017192167A1 (fr) 2016-05-03 2017-11-09 Google Llc Connexion d'un composant électronique à un textile interactif
WO2017200570A1 (fr) 2016-05-16 2017-11-23 Google Llc Objet interactif à modules électroniques multiples
US10579150B2 (en) 2016-12-05 2020-03-03 Google Llc Concurrent detection of absolute distance and relative movement for sensing action gestures

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100231509A1 (en) * 2009-03-12 2010-09-16 Marc Boillot Sterile Networked Interface for Medical Systems
US20120075463A1 (en) * 2010-09-23 2012-03-29 Sony Computer Entertainment Inc. User interface system and method using thermal imaging
WO2012099584A1 (fr) * 2011-01-19 2012-07-26 Hewlett-Packard Development Company, L.P. Procédé et système de commande multimode et gestuelle
WO2012104772A1 (fr) * 2011-02-04 2012-08-09 Koninklijke Philips Electronics N.V. Système pouvant être commandé par des gestes et utilisant la proprioception pour créer un cadre de référence absolu
US20130024819A1 (en) * 2011-07-18 2013-01-24 Fuji Xerox Co., Ltd. Systems and methods for gesture-based creation of interactive hotspots in a real world environment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8239087B2 (en) 2008-02-14 2012-08-07 Steering Solutions Ip Holding Corporation Method of operating a vehicle accessory

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100231509A1 (en) * 2009-03-12 2010-09-16 Marc Boillot Sterile Networked Interface for Medical Systems
US20120075463A1 (en) * 2010-09-23 2012-03-29 Sony Computer Entertainment Inc. User interface system and method using thermal imaging
WO2012099584A1 (fr) * 2011-01-19 2012-07-26 Hewlett-Packard Development Company, L.P. Procédé et système de commande multimode et gestuelle
WO2012104772A1 (fr) * 2011-02-04 2012-08-09 Koninklijke Philips Electronics N.V. Système pouvant être commandé par des gestes et utilisant la proprioception pour créer un cadre de référence absolu
US20130024819A1 (en) * 2011-07-18 2013-01-24 Fuji Xerox Co., Ltd. Systems and methods for gesture-based creation of interactive hotspots in a real world environment

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3037917A1 (fr) * 2014-12-24 2016-06-29 Nokia Technologies OY Surveillance
WO2016102768A1 (fr) * 2014-12-24 2016-06-30 Nokia Technologies Oy Surveillance
EP3800532A1 (fr) * 2014-12-24 2021-04-07 Nokia Technologies Oy Surveillance
US11429189B2 (en) 2014-12-24 2022-08-30 Nokia Technologies Oy Monitoring
CN110650821A (zh) * 2017-03-30 2020-01-03 罗伯特·博世有限公司 工具机器
CN110650821B (zh) * 2017-03-30 2022-11-11 罗伯特·博世有限公司 工具机器

Also Published As

Publication number Publication date
DE102013201359A1 (de) 2014-07-31

Similar Documents

Publication Publication Date Title
WO2014117895A1 (fr) Procédé et dispositif pour commander un appareil d'atelier
EP2048557B1 (fr) Capteur optoélectronique et dispositif mobile ainsi que son procédé de configuration
EP3486915B1 (fr) Procédé de commande de fonctionnement d'un dispositif technique médicale, appareil de commande, système de commande et dispositif technique médicale
DE102017120614B4 (de) Robotersystem mit einem Programmierhandgerät, das mit einer Robotersteuerung kommuniziert
DE102008034237B4 (de) Positionierungssystem für die transkranielle Magnetstimulation
EP3929675B1 (fr) Système de surveillance et de commande pour un poste de travail de production et procédé de fabrication d'un produit ou d'un sous-produit
EP3383598B1 (fr) Système manipulateur et procédé d'identification de dispositifs de commande
EP3025223A1 (fr) Procédé et dispositif de commande à distance d'une fonction d'un véhicule automobile
EP2577414B1 (fr) Procédé et système de commande pour programmer ou prescrire des mouvements ou processus d'un robot industriel
DE102017108194A1 (de) Verfahren zum Betrieb eines sich selbsttätig fortbewegenden Fahrzeugs
EP3366434B1 (fr) Procédé de vérification d'une fonction d'un véhicule et/ou d'au moins un dispositif de commande
EP3562730B1 (fr) Procédé pour garer automatiquement un véhicule
DE102018220693B4 (de) Steuerungssystem und Verfahren zum Steuern einer Funktion eines Fahrzeugs, sowie Fahrzeug mit einem solchen
DE10215885A1 (de) Automatische Prozesskontrolle
DE102018202995A1 (de) Verfahren und Vorrichtungen zum automatischen Prüfen wenigstens einer Funktion eines elektronischen Geräts
DE102013000066A1 (de) Zoomen und Verschieben eines Bildinhalts einer Anzeigeeinrichtung
EP4283061A1 (fr) Système d'inspection de tuyau de canal et procédé de commande d'un système d'inspection de tuyau de canal
DE102019206606A1 (de) Verfahren zur berührungslosen Interaktion mit einem Modul, Computerprogrammprodukt, Modul sowie Kraftfahrzeug
DE19918072A1 (de) Bedienverfahren und Bedienvorrichtung für einen bildschirmgesteuerten Prozeß
EP1487616B1 (fr) Commande de processus automatique
EP3302895A1 (fr) Procédé pour déterminer un point de trajectoire
DE102013219511B4 (de) Verfahren und Vorrichtung zum Betreiben einer Eingabevorrichtung eines Bediensystems, Kraftfahrzeug, Computerprogramm, Computer-Programmprodukt
DE102016221861B4 (de) Einrichtung und Verfahren zur Einwirkung auf Gegenstände
DE112019007321B4 (de) Kopplungsanzeigevorrichtung, kopplungsanzeigesystem und kopplungsanzeigeverfahren
US9410980B2 (en) Work monitoring system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13802979

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 13802979

Country of ref document: EP

Kind code of ref document: A1