
WO2021028274A1 - Système utilisateur et procédé pour faire fonctionner un système utilisateur - Google Patents

Système utilisateur et procédé pour faire fonctionner un système utilisateur

Info

Publication number
WO2021028274A1
WO2021028274A1 PCT/EP2020/071934 EP2020071934W WO2021028274A1 WO 2021028274 A1 WO2021028274 A1 WO 2021028274A1 EP 2020071934 W EP2020071934 W EP 2020071934W WO 2021028274 A1 WO2021028274 A1 WO 2021028274A1
Authority
WO
WIPO (PCT)
Prior art keywords
input unit
operating
shape
activated
operating state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2020/071934
Other languages
German (de)
English (en)
Inventor
Sarah Brauns
Eva Berner
Carsten Temming
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Volkswagen AG
Original Assignee
Volkswagen AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Volkswagen AG filed Critical Volkswagen AG
Publication of WO2021028274A1 publication Critical patent/WO2021028274A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K35/23Head-up displays [HUD]
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/25Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using haptic output
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/28Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/128Axially displaceable input devices for instruments
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/143Touch sensitive instrument input devices
    • B60K2360/1434Touch panels
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16Type of output information
    • B60K2360/177Augmented reality
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/014Force feedback applied to GUI

Definitions

  • the present invention relates to an operating system, in particular for a vehicle, and a method for operating the operating system.
  • AR augmented reality
  • VR virtual reality
  • In augmented reality methods, for example, the perception of reality in a user's environment is supplemented by virtual objects generated by means of a computer, which, for example, output additional information or provide options for interaction.
  • the terms augmented reality and virtual reality are used here essentially synonymously, since no clear distinction is often made between them in common usage.
  • It is expected that AR display devices such as glasses, contact lenses or projection solutions will become widely available, for example for applications in road traffic such as pedestrian navigation.
  • a manually operable control element, in particular a steering wheel or a center console, is no longer absolutely necessary, which means that an alternative interaction path is required, for example using a handheld device.
  • a device for supplementing an instrument panel by means of augmented reality is known. Functions can be activated and certain information displayed based on the direction of view of a driver of a vehicle. Inputs can be recorded, for example, using a touchscreen or voice control.
  • DE 10349673 A1 proposes a device for entering data in a motor vehicle, in which the data entered are displayed in the driver's field of vision.
  • the input takes place, for example, by means of a touch panel for handwritten input.
  • In KR 10-2013-0036934, a head-up display is described in which the display of a cell phone is projected into the driver's field of vision.
  • the present invention is based on the object of providing an operating system and a method for its operation which allow particularly simple and socially acceptable operation.
  • the operating system comprises a control unit which is set up to generate output data for a graphical user interface, an output unit for outputting the output data and an input unit for detecting an actuation action. At least a first and a second operating state can be activated for the user interface.
  • the control unit is set up to generate a shape control signal as a function of the activated first or second operating state of the user interface and to transmit it to the input unit, the input unit having a deformation element which is set up to use the shape control signal to form a first geometric shape of the input unit when the first operating state is activated, and to form a second geometric shape of the input unit when the second operating state is activated.
  • the user advantageously receives clear, in particular also haptically detectable, feedback about the operating state of the user interface by means of the input unit.
  • new modes of operation can be provided, for example in order to be able to operate in different contexts. This is particularly important when used in the area of augmented reality, since the user should enter the experience of a virtually generated or expanded environment as seamlessly as possible, in which interactions with operable elements can follow very different operating modes.
  • certain initial functions, such as scrolling or zooming, can be carried out by means of the input unit without first having to perform a separate operating step through which, for example, a "tool" or a mode of operation would have to be selected.
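  • Purely as an illustrative sketch (not part of the application itself), the mapping described above, from an activated operating state to a shape control signal transmitted to the input unit, could look as follows; the names OperatingState, ShapeControlSignal and on_state_change, as well as the concrete shape parameters, are hypothetical:

```python
from dataclasses import dataclass
from enum import Enum, auto


class OperatingState(Enum):
    DISPLAY = auto()        # first operating state: information is output
    CONFIGURATION = auto()  # second operating state: settings can be made


@dataclass
class ShapeControlSignal:
    """Target geometry the deformation element should assume."""
    deformation_angle_deg: float  # orientation of the deformation element
    raised_area: str              # surface area formed as interaction surface


# hypothetical mapping of each operating state to one geometric shape
SHAPE_BY_STATE = {
    OperatingState.DISPLAY: ShapeControlSignal(0.0, "horizontal_bar"),
    OperatingState.CONFIGURATION: ShapeControlSignal(90.0, "vertical_bar"),
}


def on_state_change(state: OperatingState, transmit) -> None:
    """Generate the shape control signal for the newly activated operating
    state and hand it to the (e.g. wireless) link to the input unit."""
    transmit(SHAPE_BY_STATE[state])
```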
  • the user interface includes, in particular, a graphic representation for a human-machine interface.
  • Technical facilities can be operated by means of control elements, for which purpose buttons or symbols of the display can be used.
  • the user interface can include switching and control elements which represent the operation of a functionality in a manner that can be grasped by a person. For example, the amount of a parameter can be displayed and its setting can be visualized by a setting element.
  • the user interface can also include elements for displaying information and thus enable output that can be interpreted by a human being. In particular, it is designed as a graphical user interface in a manner known per se.
  • the deformation element of the input unit has a position relative to further elements of the input unit, and the relative position can be set in order to form the first or second geometric shape.
  • the shape can advantageously be changed particularly easily.
  • when the geometric shape of the input unit is set by means of the deformation element, in particular the shape of its surface is set. Furthermore, the position of the center of gravity of the input unit can be adjusted.
  • the deformation element can comprise an actuator or be coupled to an actuator.
  • the relative position of the deformation element relates to its arrangement relative to the other elements of the input unit.
  • the position can be adjusted, for example, by moving and / or rotating the deformation element.
  • the deformation element is designed in particular in such a way that its extent changes in at least one direction.
  • the deformation element can be elongated and change its length in the longitudinal direction.
  • the operating state of the user interface can be defined in various ways. For example, it can relate to the availability of certain operable functionalities, so that an operating state is activated when the functionalities are available and can be operated by means of an actuation action.
  • the operating status can relate to the user interface as a whole or to individual elements or objects.
  • the operating status can also be defined by the status of an object within the user interface.
  • This object can be selected, for example, either automatically or by means of a selection operator action.
  • the shape control signal is designed so that the shape of the input unit is formed depending on the activated state of the object.
  • the operating state can also be defined as a function of an active operating mode. For example, operation by scrolling along a certain direction or axis can be possible and the operating mode can be defined as a function of this direction or axis. In a further example, the operation can be carried out by means of a touch gesture, by moving or exerting pressure on a specific area of the input unit.
  • the shape of the input unit can be set in such a way that it is possible to detect which operating mode is currently active, for example to operate a specific object or the user interface.
  • an operating state can be defined in which no interaction with an object on the operating surface can be carried out, for example in the case of a pure information display. Furthermore, an operating state can be defined in which a specific input is requested, for example to confirm, select an option or press a button.
  • the first operating state can be a display state and the second operating state can be a configuration state.
  • the display state can be defined in such a way that, for a specific object of the user interface, a state is activated in which certain information is output.
  • the configuration state can be defined in such a way that this object of the user interface is designed in such a way that settings can be made.
  • the information output in the display state can be selected or otherwise configured.
  • the second operating state can also be a configuration state for the user interface as a whole, whereby, for example, objects can be selected that are included in the user interface, or parameters can be set using the user interface or the objects it comprises.
  • the input unit is set up to generate an operating state change signal when a change in the position of the input unit is detected, for example a rotation of the input unit. If the change in position includes a rotation or pivoting through a certain angle, the angle is compared with a certain threshold value; if the angle exceeds the threshold value, an operating state change signal for changing the operating state is generated.
  • the operating state change signal can be generated when the input unit is turned around a specific axis.
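  • As an illustration only, the comparison of a detected rotation with a threshold value could be sketched as follows; the threshold of 90° and the function name detect_state_change are assumptions, since the application only speaks of "a certain threshold value":

```python
def detect_state_change(gyro_rates_deg_s, dt_s, threshold_deg=90.0):
    """Integrate angular-rate samples about the rotation axis (e.g. from a
    gyroscopic sensor) and decide whether an operating-state-change signal
    should be generated, i.e. whether the rotation exceeds the threshold."""
    angle_deg = sum(rate * dt_s for rate in gyro_rates_deg_s)
    return abs(angle_deg) > threshold_deg
```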
  • the output data can include a graphic object indicating the geometric shape of the input unit and / or the activated operating state.
  • the output includes a graphic object, the geometry of which corresponds at least schematically to a geometry of the shape of the input unit.
  • the operating system is set up in such a way that a control signal is generated upon the detection of an actuation action.
  • This can be transmitted to the control unit and, for example, lead to an interaction with the user interface or an object encompassed by it.
  • an object can be selected and a function assigned to the object can be activated or deactivated.
  • an adjustable parameter can be transmitted to a device for setting.
  • the shape of the input unit can be changed by means of an actuation, in particular by an external force, and depending on the changed shape, the first or second operating state of the input unit can be activated.
  • the position and / or alignment of the deformation element is changed.
  • the operating state can advantageously be set particularly easily.
  • the activation of the operating state can in particular relate to the entire user interface or to an object that it comprises.
  • the graphical representation of the user interface or an object it encompasses can be changed. For example, a scrolling direction is changed, for example by tilting a representation of a scrollable list of list entries.
  • the shape of the input unit can be changed both actively by means of the deformation element and on the basis of the shape control signal, as well as passively by physical action from the outside, in particular by a force acting on the input unit or on the deformation element.
  • the position of the deformation element of the input unit can be changed from the outside by moving and / or rotating.
  • the shape can be changed by exerting pressure on a certain area of the input unit, for example in the case of a flexible area which is designed to be deformable and whose deformation can also be detected.
  • a display or configuration mode is set, for example. That is, by changing the shape of the input unit by means of an external force, the operating state can be changed.
  • the input unit comprises at least one sensor element for detecting a change in shape and / or a force acting on the input unit.
  • the sensor element is particularly suitable for detecting an actuation action by means of different physical detection principles. For example, touching or approaching an area of the surface of the input unit can be detected, for example by means of a capacitive sensor.
  • a position of the input unit can be detected, for example by means of a gyroscopic sensor or an acceleration sensor or a sensor for determining the position.
  • An actuation action that can be detected by means of the input unit can take place in various ways.
  • the actuation action is carried out in such a way that it has a specific relation to an object of the user interface or to a functionality linked to the object.
  • An actuation action can include, for example, touching an area on the surface of the input unit, exerting pressure on the input unit or a specific orientation or positioning of the input unit.
  • The actuation action can also include a change in the position of the touch, for example a swiping gesture executed on the surface of the input unit, or a change in position and/or orientation, for example by moving the input unit along a trajectory, with the speed also being able to be taken into account.
  • a three-dimensional gesture is carried out.
  • The actuation action can include rotating or pivoting, or a specific orientation, for example in order to point to a specific position or in a specific direction.
  • the actuation action can furthermore comprise a combination of several action elements, for example a simultaneous movement along different degrees of freedom or successive movements and further actuations.
  • the input unit can in particular be designed so that a feedback is generated when the actuation action is carried out, for example through a haptically detectable elastic deformation of the input unit and / or an acoustically perceptible click during actuation, for example when a pressure is exerted on the surface of the input unit.
  • the actuation action comprises a movement of the input unit along a specific trajectory, an acceleration, a change in position, a touch or a swiping gesture along a surface of the input unit.
  • the actuation is thereby advantageously particularly simple.
  • the input unit is brought into a certain position during the actuation action, with an alignment and/or position of the input unit being detected, for example in order to perform a pointing gesture.
  • an alignment in a gravitational field can be detected, for example.
  • An actuation action can also relate to a change in position, in particular as a function of time, for example moving the input unit along a trajectory in a two- or three-dimensional space, executing a specific acceleration and / or a rotation or a tilting movement.
  • a specific surface area of the input unit is designed as an interaction surface, wherein a contact by means of an actuation object can be detected in the specific surface area.
  • the actuation object is a hand of a user.
  • the interaction surface can be configured in a manner known per se and in particular comprise a sensor element for detecting a touch or approach, for example by means of capacitive or resistive sensors. For example, the position of a contact and, if necessary, a change in the position as a function of time is recorded.
  • a direction, trajectory and / or speed of the movement can be recorded.
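  • A minimal sketch of how direction, trajectory length and speed of a movement on the interaction surface could be derived from touch samples is given below; the sample format (time in seconds, coordinates in millimetres) is an assumption:

```python
import math


def swipe_from_touches(samples):
    """Derive direction, path length and average speed of a swipe gesture
    from touch samples of the form (t_s, x_mm, y_mm) recorded on the
    interaction surface."""
    (t0, x0, y0), (t1, x1, y1) = samples[0], samples[-1]
    dx, dy = x1 - x0, y1 - y0
    length_mm = math.hypot(dx, dy)
    duration_s = max(t1 - t0, 1e-6)  # avoid division by zero
    return {
        "direction_deg": math.degrees(math.atan2(dy, dx)),
        "length_mm": length_mm,
        "speed_mm_s": length_mm / duration_s,
    }
```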
  • the input unit is designed in particular in such a way that a variable surface area can be activated as an interaction surface.
  • the input unit can have boundary surfaces which can be fully or partially activated as interaction surfaces.
  • the input unit can have a polyhedral shape for this purpose.
  • the interaction surface is formed in particular as a function of the activated operating state, for example together with the change in the shape of the input unit.
  • the specific surface area can be highlighted by means of an optical highlight.
  • the surface area of the interaction surface is emphasized, for example, by means of a brightness or color that is different from the surroundings.
  • the surface area can also be highlighted haptically, for example by changing the shape of the input unit in such a way that the interaction surface can be detected haptically, for example as a raised area of the surface.
  • the input unit is wirelessly coupled to the control unit and designed to be movable by a user.
  • the input unit can be used without a fixed connection to another unit. The input can thereby advantageously take place in a particularly flexible manner.
  • the input unit has dimensions that enable the user to hold it in one hand.
  • the input unit can also have a grip area that makes it easier to hold.
  • the shape of the input unit can be designed in such a way that holding it in a certain orientation relative to the hand of the user is favored, for example in the case of an elongated shape which favors an orientation essentially at an angle to the course of the user's forearm.
  • the input unit can furthermore be designed such that it is not larger than 10 cm in any direction, preferably not larger than 7 cm, more preferably not larger than 5 cm.
  • a receiving unit can be provided for the input unit in which, for example, the input unit can be put down in such a way that unintentional slipping or falling is avoided.
  • a receiving unit can also be used to charge an energy store of the input unit. For example, an inductive method can be used for charging, or an electrically conductive connection can be established between the input unit and the receiving unit.
  • When outputting the output data, the user interface is displayed, which can be done in a manner known per se.
  • a display area can be used for this.
  • means of augmented or virtual reality can be used for the output, with elements of a real environment being enriched with virtually generated elements, for example.
  • the output can comprise a graphic output of image or video data which reproduce the real environment, with additional graphic elements being generated and included in the output.
  • the output can be generated and output in such a way that a user perceives the real environment directly, for example through a vehicle window or glasses, while at the same time the output is projected into his field of vision or into his eye, for example by a head-up display or, in the case of glasses, by a corresponding display or projection unit, in such a way that the virtual objects appear in the environment from the perspective of the user.
  • the output unit is designed such that the output data can be output at a distance from the input unit. A particularly flexible output can thereby advantageously take place.
  • the output does not take place by means of a so-called touchscreen, in which inputs are recorded in the same area in which the output takes place.
  • a touchscreen for output can be combined with an input unit spaced therefrom.
  • the output unit can be designed in such a way that the output takes place with means of augmented or virtual reality.
  • the operating system can furthermore comprise a combination of different output units, for example a combination of a touchscreen with an output unit for augmented reality.
  • the output unit comprises a head-up display or a display unit worn on the body of a user, for example glasses.
  • the method can thereby advantageously be combined particularly easily with known methods of augmented or virtual reality.
  • a direction of view of a user can be recorded, with a proper subset of the operating objects of the user interface being selectable as a function of the recorded direction of view.
  • a subsequent actuation action leads to a selection or actuation of an individual operating object of the subset. This advantageously facilitates a selection between a number of control objects on the user interface.
  • the direction of gaze can be detected in a manner known per se, in particular by following the movement of an eye and / or a head posture.
  • the selection of the subset of operating objects takes place in such a way that only operating objects arranged in a certain sub-area of the user interface are selected, the sub-area being defined, for example, by the viewing direction and a predetermined radius or an environment defined in some other way. In particular, a preselection is thereby carried out.
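  • The preselection based on the viewing direction could be sketched, purely illustratively, as selecting all operating objects within a predetermined radius around the gaze point; the object coordinates and the radius are assumed quantities, not taken from the application:

```python
import math


def preselect_by_gaze(gaze_point, operating_objects, radius):
    """Return the subset of operating objects whose position on the user
    interface lies within `radius` of the detected gaze point; a subsequent
    actuation action then selects one object from this subset."""
    gx, gy = gaze_point
    return [
        obj for obj in operating_objects
        if math.hypot(obj["x"] - gx, obj["y"] - gy) <= radius
    ]
```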
  • the input unit comprises an actuator for generating a vibration.
  • the control unit is set up to generate a feedback signal and to transmit it to the input unit, a vibration being output as feedback for an actuation action or an event of the user interface on the basis of the feedback signal.
  • the input unit vibrates in order to confirm that an actuation action has been detected, in which case the feedback signal is formed with a vibration that is dependent on the type of actuation action detected.
  • a vibration can also be used as a feedback signal to signal that a specific control signal has been generated, for example an actuation or selection of an object on the user interface or a setting of a specific value.
  • the vibration can be formed in a manner known per se.
  • vibrations of different intensity or frequency and possibly with different vibration directions can be output in order to generate different feedback signals.
  • a feedback signal can be generated such that the greater the set value of the parameter, the greater the intensity of the vibration.
  • Conversely, the lower the set value, the weaker the vibration.
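  • A sketch of a feedback signal whose vibration intensity grows with the set parameter value, as described above, might look like this; the amplitude/frequency representation is an assumption:

```python
def feedback_vibration(value, min_value, max_value,
                       max_amplitude=1.0, frequency_hz=150.0):
    """Map a set parameter value to a vibration feedback signal: the greater
    the value within its range, the stronger the vibration."""
    span = max(max_value - min_value, 1e-9)
    amplitude = max_amplitude * (value - min_value) / span
    return {"amplitude": amplitude, "frequency_hz": frequency_hz}
```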
  • the input unit is assigned to a user.
  • the input unit can comprise a memory element on which settings and / or other personal data of a user are stored, or it can comprise identification information by means of which the user can be identified and settings and other data can be recorded.
  • a high degree of personalization can advantageously be achieved and the input unit can be designed as a user's personal device. The user can then operate in the same or a similar manner in different contexts, in different positions and for different operating systems, for example by actuating virtual objects in an analogous manner or arranging virtual objects in a similar manner.
  • the input unit can be designed as a mobile device assigned to a user.
  • the user can carry this with him, for example, when he starts a vehicle or to operate another operating system, for example in the area of home automation or in public spaces. He is then advantageously already familiar with the operation and can carry it out using an individually configured input unit.
  • the operating system allows interaction with a user interface of augmented reality in various ways.
  • a gesture control can be carried out as an actuation action, for example, wherein a gesture can be detected using a touch-sensitive surface of the input unit and / or using sensors to determine the position, speed or acceleration of the input unit in three-dimensional space. While gestures carried out with great amplitude and clarity in public space can attract attention and meet with rejection, the actuation action in the operating system can be carried out discretely by means of small movements and / or using a small-sized input unit.
  • an operation is implemented in which a preselection of objects of the user interface is carried out based on the direction in which the user is looking, which objects are then individually selected or operated by an operating action using the input unit, for example by scrolling, rotating, or swiping or repeated actuation.
  • This makes use of the fact that the selection is particularly simple on the basis of the viewing direction, although a detailed and precise selection or determination of the viewing direction can be difficult. This more precise selection is therefore made by means of the input unit, the preselection based on the viewing direction saving the user a greater number of operating steps here.
  • voice control can be used alternatively or additionally in the operating system, for example to confirm an input or to carry out an actuation, preselection or selection. This avoids the disadvantages of pure voice control, which can only be used to a limited extent in public for reasons of privacy or due to disturbances in the environment.
  • output data for a graphical user interface are generated and output and an operating action is recorded.
  • At least a first and a second operating state can be activated for the user interface.
  • a shape control signal is generated and transmitted to the input unit, a deformation element using the shape control signal to form a first geometric shape of the input unit when the first operating state is activated, and a second geometric shape of the input unit when the second operating state is activated.
  • the method according to the invention is designed to operate the operating system according to the invention described above.
  • the method thus has the same advantages as the operating system according to the invention.
  • Figure 1 shows an embodiment of the control system according to the invention in a vehicle
  • FIGS. 2A and 2B show exemplary embodiments of outputs that can be generated in the method according to the invention
  • FIGS. 3A and 3B show a first exemplary embodiment of a change in shape of the input unit of the operating system
  • Further figures show a second and, in FIGS. 5A to 5E, a third exemplary embodiment of a change in shape of the input unit of the operating system. With reference to FIG. 1, an exemplary embodiment of the operating system according to the invention is explained in a vehicle.
  • the vehicle 1 comprises a control unit 2 to which an output unit 3, in the exemplary embodiment a head-up display 3, and an input unit 4 are coupled.
  • the input unit 4 comprises a sensor 5 and a deformation element 6.
  • a device 7 of the vehicle 1 is also coupled to the control unit 2, a navigation unit 7 in the exemplary embodiment.
  • the input unit 4 is assigned to a specific user and settings personalized for this user are stored for the input unit 4.
  • the storage takes place in a memory element of the input unit 4; alternatively or additionally, the input unit 4 can store an identification by means of which the corresponding settings can be called up when the input unit 4 is used.
  • Examples of outputs that can be generated in the method according to the invention are explained with reference to FIGS. 2A and 2B. This is based on the exemplary embodiment of the operating system according to the invention explained above, which is further specified by the description of the method.
  • the control unit 2 generates an output which is projected into the field of view of the driver of the vehicle 1 by means of the output unit 3, for example the head-up display 3 or augmented reality glasses, so that it appears superimposed on the view of the surroundings of the vehicle 1.
  • the output takes place by means of a different type of output unit 3, in particular a different output unit for virtual reality.
  • the entire display visible to the user can also be generated virtually, in particular in that a view of the real surroundings is also displayed by means of the output unit 3.
  • the output in this case includes images of the surroundings captured by means of a video camera and virtual objects displayed in addition.
  • the real environment of the vehicle 1 is indicated by the lane markings 10, while the output unit 3 additionally projects virtual objects 11 to 15 into the field of view.
  • the output as virtual objects comprises an output of a speed limit 11, which is output as a representation of a traffic sign, a numerically represented current speed 12, a list 13 with list entries 13a, a symbol as a marker 14 and, at the marker 14, an information box 15 with a representation of text.
  • the real environment 20, in particular a lane marking 20, is also indicated.
  • virtual objects 21, 22, 23 projected into the user's field of view are shown, which represent active widgets 21, 23 and inactive widgets 22.
  • a further virtual object 24 is shown, which represents a route display 24. These are arranged vertically one above the other along an arch structure in the right and left areas. Further virtual objects can also be displayed, for example information displays about other runners on the route.
  • the active widgets 21, 23 shown in FIG. 2B are shown enlarged compared to the inactive widgets 22. They can also be emphasized more clearly by means of graphic highlighting that is known per se.
  • text information is also displayed, in particular possible actions that can be carried out by actuating the respective active widget 21, 23.
  • an active widget 21 is provided to control navigation along a running route, which is output by a virtual route display 24.
  • An actuation can, for example, end navigation or trigger another action.
  • virtual objects can be selected for interaction, for example by means of gaze control.
  • a viewing direction of the user is recorded and it is determined which virtual object the gaze is directed at.
  • This object is then selected, for example after the gaze has been continuously directed at the virtual object for a certain time interval.
  • Interactions can then be carried out by means of the input unit 4.
  • virtual objects can be actuated in order to develop or activate an action or functionality, or the display can be changed, for example by selecting and configuring content to be displayed.
  • the input unit 4 can be arranged in different ways; it can, for example, be in the hand of the user or be fixedly attached to a certain position in the vehicle 1.
  • a control signal for a device can be generated and output, for example for a navigation system 7 of the vehicle 1.
  • a preselection can also be made based on the user's viewing direction. The direction of view is determined and the objects arranged within a certain radius around the point of the user interface at which the gaze is directed are preselected; this is in particular a proper subset of the virtual objects of the entire user interface.
  • a specific virtual object can then be selected and / or activated by a further operating action, for example further directing the view to a specific point or by activating the input unit 4.
  • A first example of a change in shape of the input unit of the operating system is explained with reference to FIGS. 3A and 3B. This is based on the examples explained above.
  • the input unit 34 comprises a deformation element 36a, 36b, which is shown in a horizontal configuration 36a and in a vertical configuration 36b.
  • the deformation element can actively switch between these configurations 36a, 36b, but the configuration 36a, 36b can also be changed passively by forces acting on the deformation element 36a, 36b from the outside.
  • the input unit 34 is pentagonal, while the deformation element 36 is elongated and arranged centrally on the surface of the input unit 34.
  • Interaction areas 35a, 35b, 35c are formed on a surface of input unit 34.
  • One of the interaction surfaces 35c is formed on the deformation element 36a and follows its configuration. That is, with a horizontal configuration of the deformation element 36a, the interaction surface 35c is also elongated and aligned horizontally, with a vertical configuration of the deformation element 36b, the interaction surface 35c is also arranged vertically in the same way (not shown).
  • the interaction surfaces 35a, 35b, 35c are designed as touch-sensitive surface areas of the input unit 34.
  • the input unit 34 is formed such that it can be slightly elastically deformed in the region of one of the interaction surfaces 35a by pressure perpendicular to the surface, in particular in the manner of a push button. Haptic and acoustic feedback is generated by the deformation and an audible click.
  • lighting elements are also arranged on the surface of the input unit 34, and feedback about operable areas is displayed using an optical parameter, for example by highlighting them by means of a brightness or color. The user can thus see in which areas an actuation can be carried out.
  • visual feedback can also be output during operation, for example by changing an optical parameter, for example by the lighting following the course of a swiping gesture, an area lighting up after an actuation or alternatively actuatable areas being highlighted when prompted.
  • Different surfaces can also be illuminated in different ways, for example with different colors, brightness or flashing frequency, in order to indicate different input options, such as acceptance or rejection, activation or deactivation.
  • lighting elements can be arranged on the edge of the input unit 34.
  • the input unit 34 is designed with two layers lying one on top of the other and has a gap at the edge, in which actuation takes place through pressure and elastic deformation of the layers in the area of the gap. Its operability can be indicated by lighting in the gap. Furthermore, the lighting can change when it is actuated, in particular to confirm this.
  • touches, the exertion of pressure on a position of the interaction surfaces 35a, 35b, 35c or gestures executed with the input unit 34 are recorded as actuation actions.
  • As a function of the detected actuation actions, a user can interact with virtual objects of the user interface, for example by making a selection or actuation, scrolling, shifting or making an entry.
  • various operating states are assigned to the configurations 36a, 36b.
  • these are defined in such a way that the user interface in the first operating state includes a menu with widgets for information output and fast operation of individual functionalities, while the user interface in the second operating state includes a selection of displayable widgets or operating options for functionalities. That is, in the first operating state, a user can use the display to capture information and, if necessary, act on the operation of functions linked to the widgets; it is therefore a display state. In contrast, in the second operating state the user can select the information and functions to be displayed, for example by choosing from a menu or a list; it is therefore a configuration state.
  • the first and second operating states differ in a direction marked for operation, in particular a direction in which a list can be searched through by scrolling or paging. If the elements of the list can be searched in a horizontal direction, for example by means of a swiping gesture carried out in a horizontal direction, the first operating state is activated and, for example, the deformation element 36a is aligned in a horizontal configuration. In contrast, a vertical configuration 36b can be set if the elements of the list can be searched in a vertical direction, for example by a swiping gesture carried out in the vertical direction.
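  • Purely as an illustration of the two operating states just described, a swipe on the interaction surface could be evaluated only along the direction marked by the deformation element; the axis labels and the step size used here are assumptions:

```python
def handle_swipe(active_axis, dx_mm, dy_mm, index, list_length, step_mm=15.0):
    """Scroll a list only along the marked operating direction: in the first
    operating state (horizontal configuration 36a) the horizontal swipe
    component is used, in the second state (vertical configuration 36b) the
    vertical component; the orthogonal component is ignored."""
    travel = dx_mm if active_axis == "horizontal" else dy_mm
    new_index = index + int(travel / step_mm)
    return max(0, min(list_length - 1, new_index))
```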
  • In the example, a first and a second operating state are provided.
  • a larger number of operating states is possible, for example different angles can be set for the deformation element 36a, 36b and / or different interaction surfaces 35a, 35b, 35c can be formed.
  • In the example, the system automatically switches between the operating states when a signal for switching is detected.
  • a signal can be generated automatically, for example when performing a specific function.
  • the signal can be generated on the basis of a user input, for example by selecting a specific virtual object.
  • the operating state can also be changed in that the deformation element 36a, 36b is displaced by an external force, for example by a user rotating the deformation element 36a, 36b. In this way, the user can switch between a display and a configuration state in particular.
  • the output includes a schematic display of the currently set configuration 36a, 36b of the input unit 34 and / or another display of the currently activated operating mode.
  • the input unit 41 has a triangular shape, in particular the shape of an equilateral triangle.
  • a deformation element 42a, 42b, also designed as an equilateral triangle, is arranged in a central area, the orientation of which can be rotated between a first orientation 42a, in which the tips of the deformation element 42a point to the sides of the triangular shape of the input unit 41, and a second orientation 42b, in which the sides of the deformation element 42b are essentially parallel to the sides of the triangular shape of the input unit 41.
  • This rotation can be carried out actively by the input unit 41 or it can be carried out passively when a force is exerted on the deformation element 42a, 42b from the outside.
  • the input unit 41 has three interaction surfaces 43, 44, 45, which extend essentially over the entire surface of the input unit 41 facing the user, beyond the deformation element 42a, 42b.
  • the interaction surfaces 43, 44, 45 are designed as touch-sensitive surfaces and can also be actuated by exerting pressure perpendicular to the surface of the input unit 41.
  • A third example of a change in shape of the input unit of the operating system is explained with reference to FIGS. 5A to 5E. This is based on the examples explained above.
  • the input unit 51 is designed essentially analogously to the examples explained above. It has the shape of a rounded isosceles triangle, with the apex of the triangle pointing upwards and the laterally downwardly extending sides of the triangle being longer than the horizontally extending sides. It is assumed that a user rotates the input unit 51 from an original position, as shown in FIG. 5A, about an axis running perpendicular to the image surface, for example by an angle of 120 °.
  • FIGS. 5A to 5E show a simultaneous change in the shape of the input unit 51 in each case after a specific time step, the earlier shape 52 of the input unit 51 being indicated at the previous time step.
  • the shape of the input unit 51 changes in such a way that the input unit 51 is rotated by 180 ° at the end point of the movement, shown in FIG. 5E, compared to the initial shape shown in FIG. 5A.
  • the proportions of the side lengths shift so that the previously shorter side assumes the length of the two longer legs of the triangle, while one of the longer sides is shortened and is located at the top at the end of the movement.
  • the apex of the triangle points down towards the end of the movement, shown in Figure 5E.
  • the example shown here can also be designed in such a way that the shape of the input unit 51 changes completely actively so that the states shown in FIGS. 5A to 5E are reached. In this case there is no rotation by a user, but the apparent rotation by 180 ° takes place exclusively by changing the edge lengths of the triangle.
  • the input unit 51 is held in a holder or it is supported in another way.
  • the input units 4, 34, 41, 51 each include sensors for detecting the position and an acceleration of the input unit 4, 34, 41, 51. In this way, it can be detected, for example, when the input unit 4, 34, 41, 51 is rotated in one direction. Furthermore, an alignment of the input unit 4, 34, 41, 51 can be detected, in particular in order to implement a pointing gesture in which a point within the user interface is determined based on the alignment of the input unit 4, 34, 41, 51, for example in order to mark this position or to mark, select, actuate or otherwise operate a virtual object arranged there. Furthermore, gestures can be recorded in which a user moves the input unit 4, 34, 41, 51 in three-dimensional space along a specific trajectory, possibly at a specific speed and/or acceleration.
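  • As a sketch of the pointing gesture mentioned above, the point within the user interface could be determined by intersecting a ray along the detected alignment of the input unit with the (virtual) display plane; the coordinate conventions and function name are assumptions:

```python
import numpy as np


def pointed_position(unit_pos, unit_dir, plane_point, plane_normal):
    """Intersect a ray from the input unit along its detected alignment with
    the plane of the user interface; returns the pointed-at position or None
    if the unit points parallel to or away from the plane."""
    p0 = np.asarray(unit_pos, dtype=float)
    d = np.asarray(unit_dir, dtype=float)
    d = d / np.linalg.norm(d)
    n = np.asarray(plane_normal, dtype=float)
    denom = float(d @ n)
    if abs(denom) < 1e-9:
        return None  # ray runs parallel to the user-interface plane
    t = float((np.asarray(plane_point, dtype=float) - p0) @ n) / denom
    return p0 + t * d if t >= 0.0 else None
```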
  • the input unit 4, 34, 41, 51 is arranged on a handlebar of the vehicle 1. It can be, for example, the handlebar or a comparable control device of a motor vehicle, a bicycle, a drone or a scooter.
  • the input unit 4, 34, 41, 51 is in particular movable and removable. A user can transfer the input unit 4, 34, 41, 51, in particular if it is individually personalized for him, between different vehicles 1 and thus achieve similar operation in different contexts.
  • Reference signs (excerpt): virtual object; list; list entries; input unit (previous outline)

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to an operating system comprising a control unit (2) which is set up to generate output data for a graphical user interface, an output unit (3) for outputting the output data, and an input unit (4; 34) for detecting an actuation action. At least a first and a second operating state can be activated for the user interface. The control unit (2) is set up to generate a shape control signal as a function of the activated first or second operating state of the user interface and to transmit it to the input unit (4; 34), the input unit (4; 34) having a deformation element (6; 36a, 36b) which is set up to use the shape control signal to form a first geometric shape of the input unit (4; 34) when the first operating state is activated, and a second geometric shape of the input unit (4; 34) when the second operating state is activated. The invention also relates to a method for operating the operating system.
PCT/EP2020/071934 2019-08-15 2020-08-04 Système utilisateur et procédé pour faire fonctionner un système utilisateur Ceased WO2021028274A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102019212278.7 2019-08-15
DE102019212278.7A DE102019212278A1 (de) 2019-08-15 2019-08-15 Bediensystem und Verfahren zum Betreiben des Bediensystems

Publications (1)

Publication Number Publication Date
WO2021028274A1 true WO2021028274A1 (fr) 2021-02-18

Family

ID=71950637

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2020/071934 Ceased WO2021028274A1 (fr) 2019-08-15 2020-08-04 Système utilisateur et procédé pour faire fonctionner un système utilisateur

Country Status (2)

Country Link
DE (1) DE102019212278A1 (fr)
WO (1) WO2021028274A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102023136026A1 (de) * 2023-12-20 2025-06-26 Audi Aktiengesellschaft Verfahren zum Steuern einer Mixed-Reality-Funktion und Kraftfahrzeug

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011075390A1 (fr) * 2009-12-18 2011-06-23 Honda Motor Co., Ltd. Bloc morphable pour commande tactile
US20130002570A1 (en) * 2011-06-30 2013-01-03 Lg Electronics Inc. Mobile terminal
WO2013173624A2 (fr) * 2012-05-16 2013-11-21 Tactus Technology, Inc. Interface utilisateur et procédés
EP3217269A1 (fr) * 2016-03-07 2017-09-13 Immersion Corporation Systèmes et procédés pour éléments de surface haptiques

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011075390A1 (fr) * 2009-12-18 2011-06-23 Honda Motor Co., Ltd. Bloc morphable pour commande tactile
US20130002570A1 (en) * 2011-06-30 2013-01-03 Lg Electronics Inc. Mobile terminal
WO2013173624A2 (fr) * 2012-05-16 2013-11-21 Tactus Technology, Inc. Interface utilisateur et procédés
EP3217269A1 (fr) * 2016-03-07 2017-09-13 Immersion Corporation Systèmes et procédés pour éléments de surface haptiques

Also Published As

Publication number Publication date
DE102019212278A1 (de) 2021-02-18

Similar Documents

Publication Publication Date Title
EP1998996B1 (fr) Serveur interactif et procédé permettant de faire fonctionner le serveur interactif
EP2338106B1 (fr) Système d'affichage et de commande multifonctionnel et procédé de réglage d'un tel système avec une représentation de commande graphique optimisée
DE102016219845B4 (de) Berühreingabeeinrichtung und Fahrzeug, welches die Berühreingabeeinrichtung beinhaltet
EP3507681B1 (fr) Procédé d'interaction avec des contenus d'image qui sont représentés sur un dispositif d'affichage dans un véhicule
DE102014116292A1 (de) System zur Informationsübertragung in einem Kraftfahrzeug
EP3642695B1 (fr) Procédé de fonctionnement d'un dispositif d'affichage ainsi qu'un véhicule automobile
EP3508968A1 (fr) Procédé de fonctionnement d'une interface homme-machine ainsi qu'interface homme-machine
EP3807119B1 (fr) Procédé pour faire fontionner un dispositif d'affichage et d'entree, dispositif d'affichage et d'entree et véhicule à moteur
EP2883738B1 (fr) Procédé et système de commande de fonctions d'un véhicule automobile
EP3508967A1 (fr) Procédé de fonctionnement d'une interface homme-machine ainsi qu'interface homme-machine
DE102017121342A1 (de) Anzeigesteuerungsvorrichtung, anzeigesteuerungssystem und anzeigesteuerungsverfahren
WO2014108147A1 (fr) Zoom et déplacement d'un contenu d'image d'un dispositif d'affichage
EP3573854A1 (fr) Procédé permettant de faire fonctionner un système de commande, système de commande et véhicule comprenant un système de commande
WO2021028274A1 (fr) Système utilisateur et procédé pour faire fonctionner un système utilisateur
EP3966064B1 (fr) Procédé pour faire fonctionner un système de commande utilisateur dans un véhicule et système de commande utilisateur dans un véhicule
DE102013211046B4 (de) Verfahren und Vorrichtung zum Gewinnen eines Stellsignals aus einer Bediengeste
DE102013021814A1 (de) Bedienvorrichtung mit Eyetracker
DE102015201722A1 (de) Verfahren zum Betreiben einer Eingabevorrichtung, Eingabevorrichtung
EP3025214B1 (fr) Procédé de fonctionnement d'un dispositif d'entrée et dispositif d'entrée
WO2024046612A1 (fr) Commande d'une fonction à bord d'un véhicule à moteur
DE102011121585B4 (de) Kraftfahrzeug
DE102019131944A1 (de) Verfahren zur Steuerung zumindest einer Anzeigeeinheit, Kraftfahrzeug und Computerprogrammprodukt
EP3143483B1 (fr) Procédé de commande et système de commande dans un véhicule
DE102016008049B4 (de) Verfahren zum Betreiben einer Bedienvorrichtung, Bedienvorrichtung und Kraftfahrzeug
DE102017210599A1 (de) Verfahren zum Positionieren eines digitalen Anzeigeinhalts auf einer Anzeigeeinrichtung eines Kraftfahrzeugs, Steuervorrichtung und Kraftfahrzeug mit Steuervorrichtung

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20751534

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 20751534

Country of ref document: EP

Kind code of ref document: A1