
EP3298475A1 - Operating system and method for operating an operating system for a motor vehicle - Google Patents

Operating system and method for operating an operating system for a motor vehicle

Info

Publication number
EP3298475A1
EP3298475A1 EP16722047.4A EP16722047A EP3298475A1
Authority
EP
European Patent Office
Prior art keywords
operating system
detection
control
gesture
operating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP16722047.4A
Other languages
German (de)
English (en)
Inventor
Paul Sprickmann Kerkerinck
Onofrio DI FRANCO
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Audi AG
Original Assignee
Audi AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Audi AG filed Critical Audi AG
Publication of EP3298475A1 publication Critical patent/EP3298475A1/fr
Legal status: Withdrawn (current)

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/65Instruments specially adapted for specific vehicle types or users, e.g. for left- or right-hand drive
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3664Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/165Management of the audio stream, e.g. setting of volume, audio stream path
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/59Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/107Static hand or arm
    • G06V40/113Recognition of static hand signs
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/146Instrument input by gesture
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/146Instrument input by gesture
    • B60K2360/14643D-gesture
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K35/22Display screens
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/26Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using acoustic output

Definitions

  • the invention relates to an operating system for a motor vehicle.
  • the operating system comprises a detection device which is designed to detect at least a body part of a user when it is arranged in a detection region of the detection device.
  • the operating system comprises a control device, which is designed to control a signaling device of the operating system.
  • Operating systems that are controlled by gestures of a user within a detection area are known from the prior art.
  • at least one system boundary of the detection area can be signaled or displayed to the user, for example by a light cone or a light beam.
  • DE 10 2013 009 567 A1 discloses a method for operating a gesture recognition device in a motor vehicle.
  • the gesture recognition device performs gesture recognition only if a gesticulating hand is located in a predetermined partial volume of the vehicle interior. Further, the operator is assisted in locating the partial volume by a light source that irradiates the partial volume with a light beam: the light beam strikes the operator's hand when it is arranged in the partial volume.
  • US 2014/0361989 A1 describes a method for operating functions in a vehicle using gestures executed in three-dimensional space. In one method step, it is determined whether or not a first gesture performed in the three-dimensional space is detected by an image-based detection process. In another step, it is determined whether or not the first gesture is a gesture associated with activating control of a function. When the corresponding gestures are detected, a function associated with the gesture is executed.
  • DE 10 2013 012 466 A1 discloses an operating system for one or more vehicle-mounted devices.
  • the operating system comprises a sensor device for detecting image data of a hand located in the detection range of the sensor device and for transmitting the acquired image data to an evaluation device.
  • the operating system includes a storage device in which hand gestures and the actions associated with them are stored, an evaluation device that evaluates the image data, and a controller that executes the action associated with a hand gesture.
  • the object of the present invention is to support a user particularly well in the operation of a gesture-controlled operating system and thereby to increase the reliability of the gesture operation.
  • This object is achieved by an operating system for a motor vehicle and a method for operating an operating system according to the independent patent claims.
  • Advantageous embodiments of the invention can be found in the dependent claims.
  • this object is achieved by an operating system for a motor vehicle.
  • the operating system comprises a detection device which is designed to detect at least a body part of a user when it is arranged in a detection region of the detection device.
  • by detection area is meant a spatial area in the environment of the detection device that can be captured by the detection device.
  • the detection device may comprise a camera, in particular a 2D camera and / or a 3D camera.
  • the detection range of the camera can be determined, for example, by the viewing angle of the camera.
  • the operating system has a control device which is designed to control a signaling device of the operating system.
  • the operating system is characterized in that the detection device is designed to check whether the body part detected in the detection area is located in an operating space, which forms a portion of the detection area.
  • the operating space is thus arranged within the detection range of the detection device.
  • by operating space is meant, for example, a subspace within the spatial area detectable by the detection device.
  • the detection device can thus be located in a predetermined environment and capture within it a predetermined spatial area (the so-called detection area), in which in turn an operating space defined by the detection device is arranged as a subspace of that spatial area.
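The nesting of detection area and operating space described above can be illustrated with a brief sketch (not part of the patent; the axis-aligned box representation, the `Box` helper, and all coordinate values are illustrative assumptions):

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned cuboid given by min/max corners (illustrative metric coordinates)."""
    min_corner: tuple
    max_corner: tuple

    def contains(self, point):
        # A point lies inside the box if it is within bounds on every axis.
        return all(lo <= p <= hi
                   for lo, p, hi in zip(self.min_corner, point, self.max_corner))

# Hypothetical dimensions: the operating space is a subspace of the detection area.
detection_area = Box((-0.5, -0.5, 0.0), (0.5, 0.5, 1.0))
operating_space = Box((-0.2, -0.2, 0.2), (0.2, 0.2, 0.6))

hand = (0.1, 0.0, 0.3)
print(detection_area.contains(hand))   # True
print(operating_space.contains(hand))  # True
```

In practice the operating space need not be a cuboid; any subvolume with a fast membership test would play the same role.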
  • the operating system is also distinguished by the fact that the control device is designed to control the signaling device in such a way that feedback is output by means of the latter outside the operating space when the detection device detects that the body part is located within the operating space.
  • the control device can control the signaling device accordingly, which can then output a feedback.
  • the data transmission can be wireless or wired.
  • the signaling device may comprise a display device and / or an acoustic output device.
  • the display device may have, for example, one or more lights and / or a display, so that optical signals can be displayed by means of the display device.
  • the acoustic output device may, for example, have one or more loudspeakers, so that acoustic signals can be output by means of the acoustic output device.
  • a feedback can be, for example, the illumination of the luminaire in the form of an optical signal and / or an acoustic signal from a loudspeaker.
  • the object is also achieved by a method for operating an operating system for a motor vehicle.
  • In the method for operating an operating system for a motor vehicle, first a body part of a user is detected by means of a detection device when it is arranged in a detection area of the detection device. Furthermore, it is checked whether the body part detected in the detection area is located in an operating space, which forms a partial area of the detection area. Subsequently, a signaling device of the operating system is controlled by means of a control device such that feedback is output by means of the signaling device outside the operating space if the detection device detects that the body part is located within the operating space.
  • the detection device is further adapted to detect a gesture in the operating space and to check whether the detected gesture matches at least one predetermined gesture. Thus not only the presence of a body part in the operating space is detected; the gesture itself can, for example, be evaluated on the basis of video data in the case of a camera. This results in the advantage that the detection device not only outputs direct feedback when detecting a body part but is at the same time designed to evaluate gestures.
  • the control device is designed to control the signaling device such that a confirmation signal is output by means of the signaling device if the detected gesture coincides with the at least one predetermined gesture.
  • the confirmation signal can be output as an optical and/or acoustic signal by means of the signaling device. If, as already mentioned, the user's hand is detected in the operating space, the light turns yellow. If it is subsequently detected that the user is making a predetermined gesture with his hand, the color of the light can change from yellow to green as a confirmation signal. The user thus knows that his currently executed gesture has been recognized as an operator gesture.
  • by a predetermined gesture is meant, for example, a gesture stored in a memory device of the detection device. The predetermined gestures can thus be stored in a memory device of the detection device, and the video data captured by the camera can be compared with the stored video data. If the recorded video data match the stored video data, the gesture is recognized as a predetermined gesture. This results in the advantage that the user of the operating system receives direct feedback in a particularly simple manner when operating the operating system.
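The comparison of captured data with stored gestures can be sketched, purely for illustration, as nearest-template matching over sampled hand trajectories (the trajectory encoding, the threshold, and the gesture name are assumptions, not the patent's actual method):

```python
import math

def trajectory_distance(a, b):
    """Mean point-to-point distance between two equally sampled 2-D trajectories."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / min(len(a), len(b))

def classify_gesture(captured, templates, threshold=0.15):
    """Return the name of the closest stored gesture, or None (misoperation)."""
    best_name, best_d = None, float("inf")
    for name, template in templates.items():
        d = trajectory_distance(captured, template)
        if d < best_d:
            best_name, best_d = name, d
    return best_name if best_d <= threshold else None

# Hypothetical stored gesture: a horizontal right swipe, sampled at three points.
templates = {"swipe_right": [(0.0, 0.5), (0.5, 0.5), (1.0, 0.5)]}

print(classify_gesture([(0.0, 0.52), (0.5, 0.49), (1.0, 0.5)], templates))  # swipe_right
print(classify_gesture([(0.0, 0.0), (0.0, 0.5), (0.0, 1.0)], templates))    # None
```

A production system would compare richer features than raw points, but the yes/no outcome driving the confirmation or warning signal has this shape.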
  • the control device is adapted to control the signaling device such that by means of the signaling device, a warning message is issued if the detected gesture deviates from the at least one predetermined gesture.
  • the warning signal can be output as an optical and/or acoustic signal by means of the signaling device. If, for example, after the user has gesticulated with his hand, no predetermined gesture is detected, the light of the signaling device can change from yellow to red, indicating a faulty operation, i.e. an unspecified gesture, instead of from yellow to green, as for a predetermined gesture.
  • the video data captured by the camera can be compared with stored video data. If the captured video data do not match the stored video data, the gesture is detected as a misoperation. This has the advantage that a user can react directly if he has operated the operating system incorrectly.
  • the signaling device may also comprise two display devices in the form of two lights, which may be arranged side by side.
  • the first light can, for example, display optical feedback when the body part is detected in the operating space; that is, the light would shine yellow.
  • the second light may output, for example, an optical confirmation signal and/or an optical warning signal. If the user reaches, for example, into the operating space, this is indicated by the first light, which lights up yellow at this moment and for as long as the user's hand is in the operating space. If the user subsequently makes a gesture and the gesture is recognized as a predetermined gesture, the second light turns green.
  • the signaling device may include, for example, a lamp and a speaker, so that the optical signal of the lamp is amplified by an acoustic signal.
  • the signaling device may preferably have a display with an illuminated frame.
  • a translucent frame with a light source in the form of a light guide can be arranged around an outer contour of the display. This allows the user to track his actual operations on the display.
  • by means of the illuminable frame, the user can, as already described in the example of the lamp, follow his detected operating action, i.e. whether he is in the operating space and whether his gesture was performed correctly or incorrectly.
  • the control device is configured to control the signaling device in such a way that a detection signal is output by means of the signaling device if the detection device detects that the body part is indeed within the detection area but outside the operating space.
  • the detection signal can be output as an optical and / or acoustic signal by means of the signaling device.
  • the user is shown by means of the signaling device that, while he is in the detection area of the detection device, he is not yet in the operating space should he wish to operate the operating system. This results in the advantage that a user can easily check the position of his hand when operating the operating system.
  • the stability and reliability of the gesture recognition can be increased thereby, since the detection device can already detect and track a body part of the user, for example an arm, a hand or a finger, outside the operating space and can thereby detect the entry of the body part into the operating space more reliably.
  • the control device is designed to change the extent and/or arrangement of the operating space in dependence on at least one predetermined criterion.
  • a predetermined criterion can be, for example, a stored user profile, e.g. one stored on the vehicle key.
  • the stored user profile may include, for example, the size of the driver and / or a preferred seating position.
  • the extent of the operating space can be adapted to the seat position of the user in the motor vehicle.
  • the operating space is adapted to the arm length of the user so that he can reach into it with his arm extended. If the user changes and the seat is shifted from the first seat position away from the operating space into a second seat position, the user might no longer be able to reach the operating space in the second seat position.
  • the extent of the operating space can then be adapted, in this case increased, as a function of the seat position. That is, the extent of the operating space may increase the farther the seat is moved away, for example, from the steering wheel. This results in the advantage that the operating space can be adapted individually for each user of the operating system. Together with the support of the driver during operation, this results in a comfortable operating system for every user.
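As an illustration of the seat-position criterion, a hypothetical linear model could grow the depth of the operating space with the seat offset (the function name, the gain, and the values are assumptions):

```python
def operating_space_depth(base_depth, seat_offset, gain=1.0):
    """Hypothetical linear model (metres): the farther the seat is moved
    back from its reference position, the deeper the operating space."""
    return base_depth + gain * seat_offset

print(operating_space_depth(0.5, 0.0))   # 0.5  (reference seat position)
print(operating_space_depth(0.5, 0.25))  # 0.75 (seat moved 25 cm back)
```

A stored user profile (driver height, preferred seat position) would supply `seat_offset` without the user having to recalibrate.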
  • Figure 1 is a schematic representation of the operating system in an interior of a motor vehicle.
  • Fig. 2 is a schematic representation of a detection device and the detection area captured by it, with an operating space arranged therein.
  • an operating system 14 is shown schematically in an interior 12 of a motor vehicle 10.
  • the motor vehicle 10 may be a motor vehicle, in particular a passenger car.
  • the mode of operation of the operating system 14 will now be explained below in conjunction with FIG. 2, which schematically represents a detection device 16 of the operating system from FIG. 1.
  • the operating system 14 comprises the detection device 16, which is designed to detect a body part 18 of a user and a gesture of the body part 18, indicated in FIG. 1 by the double arrow.
  • the detection device 16 may comprise, for example, a camera.
  • the camera can be arranged, for example, on a ceiling of the motor vehicle 10 directly above a center console in the interior 12 of the motor vehicle 10.
  • the camera spans a detection area 24, in which an object and / or a body part 18 can be detected.
  • by detection area 24 is meant a spatial area in the vicinity of the detection device 16, i.e. in the interior 12 of the motor vehicle 10, which can be captured by the detection device 16.
  • within the detection area 24, an operating space 26 is provided.
  • the operating space 26 is thus a subspace of the detection area 24.
  • the detection device 16 is located in the interior 12 of the motor vehicle 10 and captures therein a predetermined spatial area (the detection area 24), in which in turn an operating space 26 defined by the detection device 16 is arranged as a subspace of the detection area 24.
  • the operating space 26 can be arranged within the detection area 24, as shown in FIG. 2.
  • the operating system 14 further comprises a control device 20 and a signaling device 22. Depending on where and what the detection device 16 detects within the detection region 24, the control device 20 controls the signaling device 22 differently.
  • the detection device 16, the control device 20 and the signaling device 22 are coupled or connected wirelessly or wired accordingly.
  • the detection device 16, for example the camera, is designed in a manner known per se to detect an object and/or a body part 18 of a user, for example a hand.
  • Gestures for example the gesturing of a hand, can also be detected by the detection device 16.
  • the detection device 16 can detect, for example, by image data and / or video data, ie a sequence of image data, gestures and / or a body part 18.
  • the control device 20 can control the signaling device 22 differently, depending on where and what is detected within the detection region 24 by the detection device 16.
  • the signaling device 22 may comprise, for example, a display device and / or an acoustic output device.
  • the display device may have, for example, one or more lights and/or a display, arranged for example in an instrument cluster of the motor vehicle 10, so that optical signals can be displayed by means of the display device.
  • the acoustic output device can, for example, have one or more loudspeakers, so that acoustic signals can be output by means of the acoustic output device.
  • if the signaling device 22 comprises, for example, a display device, then the display device is preferably arranged in a field of view of the user.
  • the control device 20 distinguishes between four different display modes. Depending on where and what the detection device 16 detects within the detection area 24, the signaling device 22 is controlled differently by means of the control device 20. If the detection device 16 detects a body part 18 of the user in the detection area 24 but outside the operating space 26, the signaling device 22 is operated in a first display mode. The signaling device 22 is operated in a second display mode if the detection device 16 detects the body part 18 of the user in the operating space 26. If the detection device 16 detects a predetermined gesture of the user in the operating space 26, the signaling device 22 is operated in a third display mode. For this purpose, the detection device 16 compares, for example, stored video data with recorded video data.
  • if the recorded video data match the stored video data, the gesture made by the user is a predetermined gesture. If the detection device 16 detects in the operating space 26 a gesture deviating from a predetermined operating gesture, i.e. a misoperation by the user, the signaling device 22 is operated in a fourth display mode. A deviating gesture or incorrect operation is detected if the captured video data deviate from the stored video data.
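The four display modes can be summarized as a small mode-selection function (an illustrative sketch; the enum names, the colors in the comments, and the `gesture` encoding are assumptions layered on the description above):

```python
from enum import Enum, auto

class DisplayMode(Enum):
    OFF = auto()        # body part outside the detection area: no signal
    DETECTED = auto()   # first mode: in detection area, outside operating space (yellow)
    IN_SPACE = auto()   # second mode: in operating space, no gesture yet (orange)
    CONFIRMED = auto()  # third mode: predetermined gesture recognized (green)
    WARNING = auto()    # fourth mode: gesture deviates from every stored gesture (red)

def select_mode(in_detection_area, in_operating_space, gesture):
    """Map a detection result to a display mode; `gesture` is None,
    "match", or "mismatch" (assumed encoding of the comparison result)."""
    if not in_detection_area:
        return DisplayMode.OFF
    if not in_operating_space:
        return DisplayMode.DETECTED
    if gesture == "match":
        return DisplayMode.CONFIRMED
    if gesture == "mismatch":
        return DisplayMode.WARNING
    return DisplayMode.IN_SPACE

print(select_mode(True, False, None).name)     # DETECTED
print(select_mode(True, True, "match").name)   # CONFIRMED
```

The control device 20 would evaluate such a function on every detection cycle and drive the signaling device 22 accordingly.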
  • the control device 20 controls the signal device 22 in such a way that a detection signal is output by means of the signal device 22.
  • the user is thus shown that his hand is indeed in the detection area 24 of the detection device 16 but not yet in the operating space 26, should he wish to operate the operating system 14.
  • if the signaling device 22 is, for example, a light, the light can light up yellow as a detection signal.
  • the control device 20 activates the signaling device 22 in such a way that a feedback message is output by means of the signaling device 22 when the hand is in the operating space 26.
  • as optical feedback, the light can light up orange.
  • the control device 20 activates the signal device 22 such that an acknowledgment signal is output by means of the signal device 22.
  • this is the case if a gesture is detected in the operating space 26 and it coincides with at least one predetermined gesture.
  • the control device 20 controls the signaling device 22 such that a warning message is output by means of the signaling device 22.
  • the light of the signaling device 22 can then change from yellow to red in the event of a faulty operation, i.e. a non-predetermined gesture, instead of from yellow to green as for a predetermined gesture.
  • the signaling device 22 may also include three lights, which may be arranged side by side in the field of view of the user.
  • the first lamp may output the detection signal in the first display mode.
  • the second luminaire may, for example, output the visual feedback in the second display mode.
  • a confirmation signal or a warning signal can be output by the third light. If the user reaches, for example, into the detection area 24 with his hand, this is indicated by the first light (first display mode), which lights up yellow at this moment and for as long as the user's hand is in the detection area 24.
  • if the user then reaches into the operating space 26 with his hand, this is indicated by the second light (second display mode), which lights up orange at this moment and for as long as the user's hand is in the operating space 26.
  • if the user then executes a predetermined gesture, this is indicated by the third light (third display mode), which lights up green, for example. If the user executes a faulty operation instead of the predetermined gesture, this is also signaled to him by the third light, in that the color of the light changes from green to red (fourth display mode). If the user's hand leaves the operating space 26 and the detection area 24, all three lights go out one after the other.
  • the signal device 22 may comprise, for example, a loudspeaker.
  • the feedback and / or the confirmation signal and / or the warning message and / or the detection signal can be output as an acoustic signal.
  • the adaptation of the operating space 26 is indicated schematically in Fig. 2 by the arrows around the operating space 26.
  • the operating space 26 can only be "extended" to the extent that the system limits of the detection area (indicated by the dashed lines) permit. The extent of the operating space 26 cannot be increased beyond the limits of the detection area 24.
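The limitation described above (expansion only within the system limits of the detection area) amounts to a per-axis clamp, sketched here with hypothetical corner coordinates:

```python
def clamp_box(inner_min, inner_max, outer_min, outer_max):
    """Shrink an expanded operating space so it never exceeds the
    detection-area limits (per-axis clamp; corners are 3-tuples)."""
    lo = tuple(max(i, o) for i, o in zip(inner_min, outer_min))
    hi = tuple(min(i, o) for i, o in zip(inner_max, outer_max))
    return lo, hi

# Requested expansion reaches past the detection area on the z axis:
lo, hi = clamp_box((-0.3, -0.3, 0.1), (0.3, 0.3, 1.2),
                   (-0.5, -0.5, 0.0), (0.5, 0.5, 1.0))
print(lo, hi)  # (-0.3, -0.3, 0.1) (0.3, 0.3, 1.0)
```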
  • a gesture-based control of the operating system 14 is considerably simplified.
  • a user is given a visual and / or audible feedback as to whether, for example, he has positioned his hand correctly in order to be able to carry out a gesture operation at all.
  • the user is also given a visual and / or audible feedback as to whether he has just made a gesture with his hand that has been recognized by the operating system 14 or not.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Transportation (AREA)
  • Chemical & Material Sciences (AREA)
  • Mechanical Engineering (AREA)
  • Combustion & Propulsion (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Automation & Control Theory (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to an operating system (14) for a motor vehicle (10). The operating system (14) comprises a detection device (16) designed to detect at least a body part (18) of a user when it is located in a detection area (24) of the detection device (16). The operating system (14) also comprises a control device (20) designed to control a signaling device (22) of the operating system (14). The operating system (14) is characterized in that the detection device (16) is designed to check whether the body part (18) detected in the detection area (24) is located in an operating space (26) forming a portion of the detection area (24). The control device (20) is further designed to control the signaling device (22) such that it outputs feedback outside the operating space (26) when the detection device (16) detects that the body part (18) is located in the operating space (26). The invention also relates to a method for operating an operating system (14) for a motor vehicle (10).
EP16722047.4A 2015-05-21 2016-04-15 Système de commande et procédé pour faire fonctionner un système de commande pour un véhicule à moteur Withdrawn EP3298475A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102015006613.7A DE102015006613A1 (de) 2015-05-21 2015-05-21 Operating system and method for operating an operating system for a motor vehicle
PCT/EP2016/000617 WO2016184539A1 (fr) 2015-05-21 2016-04-15 Operating system and method for operating an operating system for a motor vehicle

Publications (1)

Publication Number Publication Date
EP3298475A1 true EP3298475A1 (fr) 2018-03-28

Family

ID=55967203

Family Applications (1)

Application Number Title Priority Date Filing Date
EP16722047.4A Withdrawn EP3298475A1 (fr) 2015-05-21 2016-04-15 Système de commande et procédé pour faire fonctionner un système de commande pour un véhicule à moteur

Country Status (5)

Country Link
US (1) US10599226B2 (fr)
EP (1) EP3298475A1 (fr)
CN (1) CN107454948A (fr)
DE (1) DE102015006613A1 (fr)
WO (1) WO2016184539A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017201236B4 (de) * 2017-01-26 2023-09-07 Volkswagen Aktiengesellschaft Verfahren zum Betreiben eines Bediensystems, Bediensystem und Fahrzeug mit einem Bediensystem
US11314976B2 (en) 2019-03-15 2022-04-26 Lg Electronics Inc. Vehicle control device
DE102024101664A1 (de) * 2024-01-22 2024-11-21 Dr. Ing. H.C. F. Porsche Aktiengesellschaft Computerimplementiertes Verfahren zum Konfigurieren mindestens eines Fahrzeugs
US12326970B1 (en) * 2024-04-16 2025-06-10 DISTANCE TECHNOLOGIES Oy Refining head pose tracking based on seat position and orientation

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3260331A1 (fr) * 2015-02-20 2017-12-27 Clarion Co., Ltd. Dispositif de traitement d'informations

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6266061B1 (en) * 1997-01-22 2001-07-24 Kabushiki Kaisha Toshiba User interface apparatus and operation range presenting method
DE10039432C1 (de) 2000-08-11 2001-12-06 Siemens Ag Bedieneinrichtung
US8745541B2 (en) * 2003-03-25 2014-06-03 Microsoft Corporation Architecture for controlling a computer using hand gestures
US9311528B2 (en) * 2007-01-03 2016-04-12 Apple Inc. Gesture learning
US8726194B2 (en) * 2007-07-27 2014-05-13 Qualcomm Incorporated Item selection using enhanced control
JP4318056B1 (ja) * 2008-06-03 2009-08-19 島根県 画像認識装置および操作判定方法
US8547327B2 (en) * 2009-10-07 2013-10-01 Qualcomm Incorporated Proximity object tracker
JP5617581B2 (ja) * 2010-12-08 2014-11-05 オムロン株式会社 ジェスチャ認識装置、ジェスチャ認識方法、制御プログラム、および、記録媒体
DE102012000263A1 (de) 2012-01-10 2013-07-11 Daimler Ag Verfahren und Vorrichtung zum Bedienen von Funktionen in einem Fahrzeug unter Verwendung von im dreidimensionalen Raum ausgeführten Gesten sowie betreffendes Computerprogrammprodukt
DE102012216193B4 (de) * 2012-09-12 2020-07-30 Continental Automotive Gmbh Verfahren und Vorrichtung zur Bedienung einer Kraftfahrzeugkomponente mittels Gesten
DE102012021220A1 (de) * 2012-10-27 2014-04-30 Volkswagen Aktiengesellschaft Bedienanordnung für ein Kraftfahrzeug
DE102013000081B4 (de) * 2013-01-08 2018-11-15 Audi Ag Bedienschnittstelle zum berührungslosen Auswählen einer Gerätefunktion
US20140267004A1 (en) * 2013-03-13 2014-09-18 Lsi Corporation User Adjustable Gesture Space
DE102013009567B4 (de) 2013-06-07 2015-06-18 Audi Ag Verfahren zum Betreiben einer Gestenerkennungseinrichtung sowie Kraftfahrzeug mit räumlich beschränkter Gestenerkennung
CN103303224B (zh) 2013-06-18 2015-04-15 桂林电子科技大学 车载设备手势控制系统及其使用方法
DE102013012466B4 (de) 2013-07-26 2019-11-07 Audi Ag Bediensystem und Verfahren zum Bedienen einer fahrzeugseitigen Vorrichtung
US9785243B2 (en) * 2014-01-30 2017-10-10 Honeywell International Inc. System and method for providing an ergonomic three-dimensional, gesture based, multimodal interface for use in flight deck applications
US9939912B2 (en) * 2014-03-05 2018-04-10 Denso Corporation Detection device and gesture input device

Also Published As

Publication number Publication date
WO2016184539A1 (fr) 2016-11-24
US20180150141A1 (en) 2018-05-31
DE102015006613A1 (de) 2016-11-24
CN107454948A (zh) 2017-12-08
US10599226B2 (en) 2020-03-24

Similar Documents

Publication Publication Date Title
EP2673156B1 (fr) Méthode, dispositif et un produit programme d'ordinateur pour contrôler une unité fonctionnelle d'un véhicule
DE102017122866A1 (de) Parkunterstützungsvorrichtung
WO2014206543A1 (fr) Procédé et dispositif de commande à distance d'une fonction d'un véhicule
DE102011054848B4 (de) Steuer- und Überwachungseinrichtung für Fahrzeuge
DE102013213064A1 (de) Verfahren und Vorrichtung zum autonomen Einparken eines Fahrzeugs mit externer Überwachung
EP1830244A2 (fr) Procédé et dispositif destinés à l'utilisation d'au moins deux composants fonctionnels d'un système, en particulier d'un véhicule
WO2014040930A1 (fr) Procédé et dispositif de commande d'un composant d'automobile au moyen de gestes
EP3298475A1 (fr) Système de commande et procédé pour faire fonctionner un système de commande pour un véhicule à moteur
EP3094533A2 (fr) Fonctionnement d'un véhicule selon le souhait d'un occupant du véhicule
DE102018205753A1 (de) Verfahren, Vorrichtung und Fortbewegungsmittel für ein automatisiertes Anfahren eines Fortbewegungsmittels an einer Lichtsignalanlage
DE102017204916A1 (de) Verfahren zum Durchführen eines automatischen Fahrvorgangs eines Kraftfahrzeugs unter Verwendung einer Fernbedienung
DE102019206696A1 (de) Verfahren zur geführten Fahrzeugübergabe bei automatisiertem Valet Parken
WO2016091736A1 (fr) Dispositif de détection pour connaître un geste et/ou une direction du regard d'un occupant d'un véhicule par commande synchrone d'unités d'éclairage
DE102017103391A1 (de) Verfahren zur Verbesserung der Benutzerfreundlichkeit eines Fahrzeugs
DE10307477A1 (de) Außenspiegelsteuerung
DE102005023697A1 (de) Einrichtung zur Steuerung der Innenbeleuchtung eines Kraftfahrzeugs
DE102013202072A1 (de) Begrüßungsszenarien von Kraftfahrzeugen
DE102022126772A1 (de) Anwenderschnittstelle, Fortbewegungsmittel und Verfahren zur Interaktion mit einem Fortbewegungsmittel
DE102016011016A1 (de) Verfahren zum Betrieb eines Assistenzsystems
EP4143074B1 (fr) Procédé de fonctionnement d'un véhicule automobile, et véhicule automobile
DE102014202650A1 (de) Verfahren und Vorrichtung zum Bedienen der Mechanik einer motorisch positionsveränderlichen Anzeigeeinheit
DE102018208402A1 (de) Funktionsüberprüfungssystem
DE102013212011A1 (de) Verfahren zur Lichtsteuerung eines Fahrzeugs
DE10126238A1 (de) Vorrichtung und Verfahren zur Einstellung mindestens eines Rückspiegels
DE102018114515A1 (de) Fahrerassistenzsystem für ein Kraftfahrzeug und Verfahren zu dessen Betrieb

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20171221

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

RIN1 Information on inventor provided before grant (corrected)

Inventor name: DI FRANCO, ONOFRIO

Inventor name: SPRICKMANN KERKERINCK, PAUL

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

17Q First examination report despatched

Effective date: 20190308

18W Application withdrawn

Effective date: 20190326