
WO2017032549A1 - Method for triggering at least one processing step by assigning an information element to a device - Google Patents


Info

Publication number
WO2017032549A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
information element
display
gesture
displays
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2016/068154
Other languages
German (de)
English (en)
Inventor
Bernd Weymann
Karl-Heinz Knobl
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Continental Automotive GmbH
Original Assignee
Continental Automotive GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Continental Automotive GmbH filed Critical Continental Automotive GmbH
Publication of WO2017032549A1 publication Critical patent/WO2017032549A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K35/22Display screens
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/29Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/143Touch sensitive instrument input devices
    • B60K2360/1438Touch screens
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/146Instrument input by gesture
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/18Information management
    • B60K2360/182Distributing information between displays
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/18Information management
    • B60K2360/184Displaying the same information on different displays

Definitions

  • the invention relates to a method and apparatus for processing information elements, in particular to a method of triggering at least one processing step to be applied to an information element in a system with multiple displays arranged at different spatial positions, for example in a motor vehicle cockpit.
  • an information element refers to an executable computer program or to content associated with an executable computer program.
  • content associated with an executable computer program includes one or more text, image, audio, or video elements, singly or in combination.
  • text items are messages or status information that can be captured and processed by a user or computer.
  • a processing step to be applied to an information element includes, among other things, the extraction of information and/or computer-executable program instructions from the information element, the forwarding of the extracted information and/or program instructions to another processing step or to another information-processing device, or a display of information associated with the information element after execution of preceding processing steps.
  • the term display is used here both for a screen for displaying information and for an information-processing device with a screen.
  • an information-processing device with a screen can itself execute processing steps applied to an information element or to parts thereof.
  • a pure display can represent a processing function to be applied to parts of an information element, the actual processing taking place in a computer connected to the display.
  • in motor vehicles, information and controls are increasingly presented on displays. These displays are typically arranged in the driver's field of vision in such a way that they neither restrict the driver's field of vision nor force the driver's gaze away from the road.
  • the arrangement of the representation of information about different functions of a motor vehicle is often individually adjustable by the user.
  • FIG. 1 shows an exemplary representation of a prior-art user interface, configurable by a user, on a screen 100.
  • on the screen, a plurality of windows 102, 104, 106 are arranged, representing the status of different programs or functions running in parallel.
  • Window 102 is associated with a radio function and displays information about a tuned radio station.
  • the information in window 102 includes the name of the radio station 1021, the frequency at which the radio station broadcasts 1022, a station logo 1023, and a QR code, through which further information can be accessed using a smartphone or other suitable networked device.
  • Window 104 is associated with a navigation system of a motor vehicle and displays in an electronic compass 1041 the current orientation of the vehicle, as well as a destination address 1042.
  • Window 106 is associated with a weather application and displays weather data 1061 and temperature data 1062 to a selected location 1063. In addition, current date 1064 and current time 1065 are displayed in window 106.
  • a user wishes to move the window 106 to another location on the screen 100.
  • Arrow 108 in FIG. 1 indicates a displacement direction for window 106.
  • a user moves the window 106 from the upper right quadrant to the lower left quadrant.
  • the original position of the window 106 is indicated by a faint, gray rendering of the window 106.
  • the shift occurs, for example, by touching the window 106 on a touch screen, e.g. with a finger, and moving the touch point across the screen to the desired new position without lifting the finger off the screen.
  • once the desired new position is reached, the user can lift his finger off the screen and the window shift is complete.
  • an approximate positioning of a window to be moved is sufficient, with final positioning being automatic with respect to other windows.
  • Figure 2 shows the screen 100 after the shift.
  • the positions of the windows 102 and 104 are unchanged.
  • Window 106 is now at the new position in the lower left quadrant; the original position of the window 106 is now empty.
  • the information displayed in the windows is essentially unchanged.
  • a similar function for shifting information windows of computer programs from a display arranged at a first location to a display arranged at a second location is known from EP 2 018 992. There, moving an information window from one display to another merely continues the same presentation on the other display.
  • the invention defined in the claims simplifies the operation of functions provided in a motor vehicle having a plurality of displays and reduces the complexity of the operating steps required for their operation.
  • the inventive method enables a user to assign information elements to devices and/or functions for processing in a simple and intuitive way, whereby distraction from the task of driving is prevented or at least reduced.
  • the inventive method for triggering at least one processing step to be applied to an information element, going beyond the mere representation of information associated with the information element on a display, is carried out in a system comprising a plurality of displays and at least one computer, coupled to the displays, for executing the at least one processing step.
  • the method comprises the presentation of a first information element or a hint representing the presence of a first information element.
  • Such an indication can be made visually, acoustically or haptically.
  • an optical indication is, for example, a light signal or a symbol appearing on a screen in the user's field of vision.
  • the type of light signal, e.g. its color, steady or flashing light, or a combination thereof, can provide information about the information element.
  • An indication by a symbol on a screen can give information about the information element by the design of the symbol.
  • An audible indication may be given by one or more user-audible beeps, or by voice information, such as synthesized voice output.
  • a haptic indication can be made, for example, by vibration of a control element touched by the user.
  • the control element does not have to be connected with the processing of the information element. For example, in a motor vehicle, a part of the steering wheel encompassed by a driver's hand can vibrate, or the seat surface or seat back.
  • the method further comprises detecting a gesture executed by the user with respect to the presented first information item or hint.
  • the gesture may be performed on a touch-sensitive surface, such as on a touchpad or a touch-sensitive screen.
  • the gesture can also be performed freely in space, for example by hand, arm or head movements or the like.
  • the detection of gestures executed freely in space can be done, for example, by means of one or more cameras with subsequent image processing.
  • an object to be moved by the user in space may be provided, for example a ring or a bracelet.
  • the object can be designed so that it itself detects changes of its position in space and thereby recognizes gestures.
  • a gesture detected by the object can then be transmitted to a control unit by means of wireless transmission, for example by radio or light signal, and used in the sense of the method according to the invention.
  • a detection of gestures by means of a camera is not absolutely necessary.
  • an optional combination of both detection methods may improve the accuracy of detection.
  • Gestures can be, for example, pointing, gripping or swiping gestures.
  • the method further comprises executing one or more processing steps concerning the first information element, wherein the one or more processing steps to be performed depend on the detected gesture and on the kind or type of the information element.
  • the execution of the one or more processing steps may include assigning the information element or outputting a result of the processing steps to a display, wherein a computer connected to the display executes the processing steps.
  • the display stands for a function or a software program which executes the processing steps relating to the information element.
  • the processing steps applied to the information element go beyond the mere representation of the information element itself on another display.
  • the gesture describes picking up the information element at a first location and moving the information element to a second location, or more generally a displacement of the information element from a first location to a second location.
  • a location preferably corresponds at least approximately to the position of a display.
  • a place can also be a virtual place, which is clearly described by the direction of a gesture, for example.
  • an example of a virtual location is a virtual recycle bin.
  • a move to this virtual recycle bin then causes the deletion of an information element or the abort of a current action related to the information element.
  • the gesture may also be a displacement of an object representing the information element on a display in one of several spatial directions; it is not absolutely necessary to touch the display.
  • the gesture may also be a hand, eye or head movement independent of any object displayed on a display.
  • Such a hand, eye or head movement occurs in close temporal relation with a signaling to a user that a new information element is ready.
  • the close temporal relationship is determined, for example, by a first time period following the signaling.
  • the first period may be preset for certain types of information elements and / or individually adjustable or modifiable by a user.
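The "close temporal relation" between signaling and gesture described above can be sketched as a simple time-window check. The window lengths, element-type names and function below are illustrative assumptions, not taken from the patent:

```python
# Illustrative time windows in seconds per information-element type;
# the patent leaves the concrete values to presets or user settings.
GESTURE_WINDOW = {"sms": 10.0, "traffic": 20.0}
DEFAULT_WINDOW = 5.0

def gesture_accepted(signal_time: float, gesture_time: float,
                     element_type: str) -> bool:
    """Accept a free-space gesture only if it follows the signaling
    of a new information element within the configured first period."""
    window = GESTURE_WINDOW.get(element_type, DEFAULT_WINDOW)
    return 0.0 <= gesture_time - signal_time <= window
```

A gesture arriving before the signal, or after the window has elapsed, is ignored, which helps avoid accidental hand or head movements being interpreted as commands.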
  • the processing steps executed with respect to an information element differ depending on the kind or type of the information element, even if different kinds or types of information elements are assigned to the same screen or the same gestures are used.
  • the at least one processing step to be executed on the information element is linked to the information element.
  • the link may be implemented in different ways.
  • one or more computer readable program instructions may be associated with the information element, for example, in a file header.
  • it is also possible to assign one or more processing steps to certain types of information elements and to store the assignment in the at least one computer arranged in the system, so that only the kind or type of an information element must be determined in order to determine the one or more processing steps to be executed for a specific gesture.
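The stored assignment described above amounts to a lookup table keyed by element type and gesture. The following is a minimal sketch of that idea; the type names, gesture names and step labels are illustrative assumptions, not taken from the patent:

```python
# Hypothetical dispatch table: the assignment of processing steps to
# element types is stored in the system's computer, so at runtime only
# the element type and the detected gesture need to be determined.
DISPATCH = {
    ("sms", "swipe_right"): ["text_to_speech", "start_voice_dialog"],
    ("sms", "swipe_down"): ["discard_notification"],
    ("traffic", "swipe_right"): ["recalculate_route"],
    ("traffic", "swipe_down"): ["discard"],
}

def processing_steps(element_type: str, gesture: str) -> list[str]:
    """Return the processing steps for a detected gesture; the same
    gesture maps to different steps for different element types."""
    return DISPATCH.get((element_type, gesture), [])
```

Note that the same gesture ("swipe_down") triggers different steps for different element types, mirroring the behavior described in the text.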
  • Fig. 1 shows a first exemplary illustration of a prior-art user interface, configurable by a user, on a screen;
  • Fig. 2 shows a second exemplary representation of the prior-art user interface, configurable by a user, on a screen;
  • FIG. 3 shows an exemplary schematic representation of an arrangement of a plurality of devices functionally linked to each other according to the invention
  • Fig. 4 shows an abstracted schematic representation, illustrating the method, of the arrangement of functionally linked devices of Fig. 3.
  • FIG. 3 shows an exemplary schematic representation of an arrangement of a plurality of devices functionally linked together according to the invention, for example in a motor vehicle cockpit.
  • the motor vehicle cockpit 300 has an instrument panel 302 in front of which a steering wheel 304 is arranged.
  • an instrument cluster 306 with a first display is arranged on the instrument panel 302.
  • to the right of the driver's position, indicated by the steering wheel 304, is a center console 308, on which a second display 310 is arranged.
  • a windshield 312 is shown, on which a projection area of a head-up display 314 is indicated.
  • in the lower area of the center console, a virtual recycle bin 316 is indicated.
  • the combination instrument 306 with the first display, the second display 310, the head-up display 314 and the virtual wastebasket 316 represent devices which, in addition to being able to display information, can perform or control certain functions.
  • in FIG. 4, the devices described in FIG. 3 are arranged along a horizontal or a vertical axis in order to clarify the functional principle of the method according to the invention.
  • the axes run parallel to corresponding gestures that trigger processing steps to be applied to information elements.
  • the exemplary arrangement of the devices in Figure 4 can be regarded as spatially approximating a typical arrangement of corresponding devices in a motor vehicle.
  • the first device, the instrument cluster 306, is located in the left half of the figure on the x-axis (not shown).
  • the second device, the second display 310, is arranged in the right half of the figure on the x-axis.
  • the third device, the head-up display 314, is located in the upper half of the figure on the y-axis (not shown), and the fourth device, the virtual wastebasket 316, is located in the lower half of the figure on the y-axis.
  • the information element to which one or more processing steps are applied is an incoming SMS text message.
  • a notification, for example an icon or a text hint that an SMS has been received, is displayed on the display of the instrument cluster 306.
  • drivers usually cannot read text messages while driving without taking their eyes off the road and possibly endangering themselves or other road users. Reading text messages while driving is therefore prohibited by law in many countries.
  • a driver can activate a read-aloud function by means of a simple gesture, here a swipe gesture directed to the right, which reads out the SMS.
  • Corresponding devices that convert text to speech are known from the prior art and will not be explained in detail here.
  • the swipe gesture from left to right, from the instrument cluster 306 to the second display 310 in the center console 308, causes the text message, whose arrival was signaled on the display of the instrument cluster 306, to be assigned to another device and simultaneously starts the text-to-speech function and its application to the text message.
  • subsequently, a speech dialog program can be started which offers the driver a choice of reactions to the text message, for example dictating a response or establishing a telephone connection to the sender of the SMS.
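The SMS example above can be condensed into a small handler; the function name and action labels below are illustrative assumptions, not from the patent:

```python
def handle_sms(text: str, gesture: str) -> list[str]:
    """Sketch of the SMS example: a swipe to the right triggers
    read-aloud of the message followed by a voice dialog offering
    reactions; a swipe down merely clears the notification."""
    if gesture == "swipe_right":
        return [f"speak:{text}", "offer:dictate_reply", "offer:call_sender"]
    if gesture == "swipe_down":
        return ["clear_notification"]
    return []  # unrecognized gestures trigger nothing
```

The returned action list stands in for the chain of processing steps the system would actually dispatch to the media and communication unit.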
  • the driver can perform a swipe gesture from top to bottom, i.e. from the instrument cluster 306 toward the virtual wastebasket 316, to clear the notification on the display of the instrument cluster 306.
  • the SMS itself may also be deleted, e.g. if the driver does not want to receive messages from a particular sender, or if the content recognizable to the driver is unimportant. This assumes that the sender of a message and/or a content summary are made visible in the notification.
  • Text messages are not limited to SMS. Any text messages can be processed according to the method explained above, for example from social networks, news services, emails, etc.
  • the processing steps executed in response to a gesture and the available gestures may depend on the type of the information element and may be stored in a control unit which performs the gesture recognition, or which first receives the information element or provides its type.
  • for the correct control of the function, the devices exchange corresponding attributes with one another.
  • processing steps can be transmitted in the form of program commands executable by a computer, for example in a scripting language.
  • the information element to which one or more processing steps are applied is a traffic announcement or traffic information, e.g. concerning a currently traveled or planned route.
  • corresponding information is displayed on the display of the instrument cluster 306, or the traffic announcement is received by a radio receiver of a media and communication unit and played back.
  • in motor vehicles, the media and communication unit is usually assigned to the second display 310 arranged in the center console, and the control of the relevant functions is at least displayed on the second display 310. The input of a navigation destination is also usually made via the second display 310 arranged in the center console.
  • a corresponding gesture causes the traffic information to be taken over from the display in the instrument cluster 306 into a navigation system and the route to be recalculated, with an alternative route calculated on the basis of the data transmitted with the information element. If no route is planned, a probable route can be estimated and information about an expected time delay on that route can be output. Accordingly, a swipe gesture from top to bottom, that is toward the virtual wastebasket 316, causes the traffic information to be discarded without recalculation of the route or other actions.
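The traffic-information example can likewise be sketched as a handler that distinguishes the discard gesture from the take-over into navigation; the gesture and action names are illustrative assumptions:

```python
def handle_traffic_info(gesture: str, route_planned: bool) -> str:
    """Traffic-announcement example: one gesture hands the information
    to the navigation system (recalculating the route, or estimating
    the delay on a probable route if none is planned); a swipe toward
    the virtual recycle bin discards the announcement."""
    if gesture == "swipe_down":
        return "discard"
    if gesture == "swipe_right":
        if route_planned:
            return "recalculate_route"
        return "estimate_delay_on_probable_route"
    return "ignore"
```

The `route_planned` flag captures the branch in the text between recalculating an active route and estimating a delay on a merely probable one.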
  • the swipe gestures in the foregoing examples may be accompanied by a corresponding animation of the image or pictogram representing the information element, for example by a displacement of the image or pictogram across the display of the device or, in the case of the swipe gesture from top to bottom, simply by the image or pictogram being pushed out of the display in the swipe direction.
  • the information element to which one or more processing steps are applied is an address or a destination to which route guidance by means of a navigation device is to take place.
  • the input of an address or destination to which route guidance is to take place is usually made via the second display 310 arranged in the center console.
  • as a rule, the second display 310 also offers a selection of the available functions.
  • a swipe gesture from bottom to top, i.e. from the second display 310 to the head-up display 314, causes the destination information to be taken over into a corresponding navigation program and the route to be calculated.
  • the current route guidance information is then displayed in the head-up display 314, possibly supplemented by corresponding voice instructions.
  • the gestures available for an information element may additionally depend on operating parameters of the motor vehicle.
  • while driving, a display of the message text is not provided, because reading texts while driving can lead to dangerous situations.
  • if the motor vehicle is stationary, however, for example with the engine switched off, the set of gestures available for an information element representing a text message can be expanded.
  • a swipe gesture from bottom to top, i.e. from the display in the instrument cluster 306 or from the second display 310 to the head-up display 314, could then cause the text to be displayed in the head-up display.
  • the starting position of a gesture is essentially irrelevant for gesture recognition, as long as the gesture can be clearly recognized. It is therefore not necessary to start a gesture in spatial proximity to a first device or its display and to end it in spatial proximity to a second device or its display in order to effect an assignment of the information element to the second device for processing.
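Because only the direction of a gesture matters, not its absolute starting point, a recognizer can classify a swipe purely from the displacement between its start and end points. A minimal sketch under that assumption, using screen-style coordinates (y grows downward):

```python
def classify_swipe(x0: float, y0: float, x1: float, y1: float) -> str:
    """Classify a swipe by its dominant direction only; the absolute
    start position does not matter, only that the direction is clear."""
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):
        # horizontal component dominates
        return "swipe_right" if dx > 0 else "swipe_left"
    # vertical component dominates
    return "swipe_down" if dy > 0 else "swipe_up"
```

A production recognizer would add a minimum travel distance and reject ambiguous (near-diagonal) gestures, but the direction label is all the dispatch described above needs.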
  • the invention has been described above with reference to a motor vehicle. However, it is also possible to apply the method according to the invention to other systems which can process different information elements in different ways and in which gestures allow a clear assignment of information elements to devices or subsystems.
  • such a system is, for example, a networked computer system with one or more computers or subsystems and corresponding displays.
  • the computers do not have to be identical. For example, a tablet computer can represent a central unit that receives information elements or references to them, but is limited in its ability to process certain information elements.
  • gestures performed on a touch screen of the tablet computer, or detected via a camera built into the tablet computer, then cause the information elements to be assigned to another suitable computer for further processing. It is also conceivable that an otherwise identical gesture causes an assignment to different devices, computers or subsystems depending on a changing orientation of the tablet computer in space.
  • a computer for the execution of processing steps is provided and connected to a plurality of displays.
  • the computer is further connected to at least one means for recognizing gestures carried out by a user.
  • the computer provides the function of a control unit, which applies a detected gesture to an information element and, as a function of the gesture and the information element, triggers the execution of one or more processing steps relevant to the processing of the information by a corresponding program running on the computer.

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a method for triggering at least one processing step to be applied to an information element, starting from a representation of information associated with the information element on a display unit, in a system having a plurality of display units and at least one computer connected to the display units. The method comprises the presentation of a first information element or of a notice representing the presence of a first information element. The method further comprises the detection of a gesture executed by the user in relation to the presented first information element or notice and, as a function of the detected gesture, the execution of one or more processing steps concerning the first information element.
PCT/EP2016/068154 2015-08-24 2016-07-29 Procédé de déclenchement d'au moins une étape de traitement par attribution d'un élément d'information à un appareil Ceased WO2017032549A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102015216108.0A DE102015216108A1 (de) 2015-08-24 2015-08-24 Verfahren zum Auslösen mindestens eines Bearbeitungsschrittes durch Zuweisung eines Informationselements zu einem Gerät
DE102015216108.0 2015-08-24

Publications (1)

Publication Number Publication Date
WO2017032549A1 true WO2017032549A1 (fr) 2017-03-02

Family

ID=56555394

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/068154 Ceased WO2017032549A1 (fr) 2015-08-24 2016-07-29 Procédé de déclenchement d'au moins une étape de traitement par attribution d'un élément d'information à un appareil

Country Status (2)

Country Link
DE (1) DE102015216108A1 (fr)
WO (1) WO2017032549A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110709273A (zh) * 2017-06-07 2020-01-17 奥迪股份公司 用于运行机动车的显示装置的方法、操作装置和机动车

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108237919B (zh) * 2018-01-10 2019-12-24 广州小鹏汽车科技有限公司 一种汽车中控大屏的辅助显示方法和系统
JP7056284B2 (ja) * 2018-03-20 2022-04-19 トヨタ自動車株式会社 車両用表示装置、画面制御方法及びプログラム
IT202200004004A1 (it) * 2022-03-03 2023-09-03 Automobili Lamborghini Spa Metodo di gestione e veicolo

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2018992A1 (fr) 2007-07-27 2009-01-28 Continental Automotive GmbH Cockpit de véhicule automobile
WO2014108153A1 (fr) * 2013-01-08 2014-07-17 Audi Ag Procédé de synchronisation de dispositifs d'affichage d'un véhicule automobile
WO2014146925A1 (fr) * 2013-03-22 2014-09-25 Volkswagen Aktiengesellschaft Système de reproduction d'information pour un véhicule et procédé de mise à disposition d'information pour l'utilisateur d'un véhicule

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10029700B2 (en) * 2012-12-21 2018-07-24 Harman Becker Automotive Systems Gmbh Infotainment system with head-up display for symbol projection

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2018992A1 2007-07-27 2009-01-28 Continental Automotive GmbH Motor vehicle cockpit
WO2014108153A1 * 2013-01-08 2014-07-17 Audi Ag Method for synchronizing display devices of a motor vehicle
WO2014146925A1 * 2013-03-22 2014-09-25 Volkswagen Aktiengesellschaft Information reproduction system for a vehicle and method for providing information to the user of a vehicle

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CARJAM TV: "BMW 7 Series 2016 Hand Gesture Controls Are Cool Commercial HD CARJAM BMW 7er G11 G12 INTERIOR", 28 April 2015 (2015-04-28), pages 1 - 2, XP054976796, Retrieved from the Internet <URL:https://www.youtube.com/watch?v=U4YJTl3k0jA> [retrieved on 20160923] *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110709273A (zh) * 2017-06-07 2020-01-17 Audi AG Method for operating a display device of a motor vehicle, operating device, and motor vehicle
CN110709273B (zh) * 2017-06-07 2023-03-28 Audi AG Method for operating a display device of a motor vehicle, operating device, and motor vehicle

Also Published As

Publication number Publication date
DE102015216108A1 (de) 2017-03-02

Similar Documents

Publication Publication Date Title
EP3373123B1 Method and device for presenting recommended operating actions of a suggestion system and for interacting with the suggestion system
EP2930049B1 User interface and method for adapting a view on a display unit
DE112014000351T5 Context-based vehicle user interface reconfiguration
EP3067244B1 Vehicle with a driving mode that automatically adapts to the situation
DE102014013960A1 Method for operating at least one driver assistance device of a motor vehicle and system having a driver assistance device
EP1853465B1 Method and device for voice control of a device or system in a motor vehicle
WO2007090593A1 Device and method for the interactive output of information and/or for assisting the user of a vehicle
DE102019217730A1 Method for operating an operating system in a vehicle and operating system for a vehicle
EP3063611A1 Device and method for adapting the content of a status bar
EP2802963A1 Method and device for controlling functions in a vehicle using gestures performed in three-dimensional space, and corresponding computer program product
WO2017032549A1 Method for triggering at least one processing step by assigning an information element to a device
WO2015101467A1 Display control module and method for displaying additional information on a display module
EP2941685B1 Operating method and operating system for a vehicle
EP2924551A1 Method and device for providing a graphical user interface in a vehicle
DE102013016196B4 Motor vehicle operation using combined input modalities
DE102012218155A1 Facilitating input on a touch-sensitive display in a vehicle
DE102014209983B4 Method for coupling an operating device to a mobile unit, operating device, and mobile unit
WO2017140569A1 Operating device of a motor vehicle and method for operating an operating device to produce an effect of interchange between a virtual display plane and a hand
DE102013014877A1 Motor vehicle operating device with self-explanatory gesture recognition
EP3108333B1 User interface and method for assisting a user in operating a user interface
DE102013003047A1 Method and system for gaze-direction-dependent control of a functional unit
WO2023174595A1 Means of transport, and device and method for outputting information to a user of a means of transport
DE102019208648B4 Motor vehicle
EP3093182B1 Means of transportation, working machine, user interface, and method for displaying the content of a first display device on a second display device
DE102019131944A1 Method for controlling at least one display unit, motor vehicle, and computer program product

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16745108

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16745108

Country of ref document: EP

Kind code of ref document: A1