WO2018123355A1 - User interface device and electronic device - Google Patents
User interface device and electronic device
- Publication number
- WO2018123355A1 (PCT/JP2017/041783)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- control unit
- touch panel
- menu
- user interface
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
Definitions
- The present invention relates to a user interface device and an electronic device.
- Touch panels that can be operated easily and intuitively with a finger or the like have become widespread, and extensive research and development has gone into making touch panels smaller, thinner, lighter, more power-efficient, and less expensive.
- The position at which an instruction medium such as a finger touches the touch panel is referred to as the "touch position".
- Known methods for detecting the touch position include a resistive film type that detects a change in electrical resistance, a surface acoustic wave type that uses ultrasonic waves, and a capacitive type that detects a change in capacitance.
- Among these, the capacitive touch panel is attracting attention because it can detect a plurality of touch positions.
- The capacitive touch panel includes a transparent electrode that generates a capacitance and an external circuit that detects a change in the capacitance.
- In recent years, touch panels have been provided with a pressure-sensitive sensor capable of detecting the pressing force applied when the touch panel is pressed with a stylus pen, a finger, or the like.
- Patent Document 1 discloses a technique in which an input determination by pressing on the input operation surface is made based on a change in the capacitance between the upper electrode and lower electrode of a pressure-sensitive sensor, which are displaced when the top plate and the touch panel move in the pressing direction.
- Patent Document 2 discloses a sensor device including a pressure-sensitive sensor that does not use an elastic member.
- Patent Document 3 discloses a technique that generates an enlarged image, in which the image displayed on the screen is magnified at a predetermined magnification according to the proximity of a finger to the screen, and displays the image on the screen so that part of it becomes a locally enlarged image.
- Patent Document 1: JP 2011-134000 A; Patent Document 2: JP 2011-100364 A; Patent Document 3: JP 2013-225261 A
- Touch panels are used in various electronic devices.
- One example of an electronic device using a touch panel is a navigation device that is mounted in a car and displays, on a display panel, a map, directions from the vehicle's current position to a destination, and the like.
- Such a navigation device needs to be simple to operate.
- However, the touch panel of a conventional navigation device has only a function of detecting the XY coordinates of the touch position, which is inconvenient because the user must perform complicated operations to obtain a desired result.
- In the techniques disclosed in Patent Documents 1 and 2, the pressure-sensitive sensor is used merely as an auxiliary means of reliably detecting that a finger or the like has touched the touch panel. For this reason, even a touch panel to which these techniques are applied is operated in the same way as a conventional touch panel without a pressure-sensitive function, and the inconvenience of operation is not eliminated. Moreover, operation using an external input device or the like may be required.
- The present invention has been made in view of such circumstances, and an object thereof is to make it possible to easily select a specific layer from a plurality of layers.
- A user interface device according to the present invention includes: a display panel that displays an object; a display control unit that performs control to display the object on the display panel according to control information; a touch panel on which a touch operation is performed with an instruction medium; a coordinate detection unit that detects the coordinates of the touch position on the touch panel where the touch operation is performed and outputs coordinate detection information; a pressure-sensitive sensor that outputs a sensor value that changes according to the pressing force applied to the touch panel by the instruction medium; a pressure-sensitive detection unit that detects, based on the sensor value, that a push-in operation pressing the touch panel with the instruction medium has been performed and outputs pressure-sensitive detection information; and a control unit that, based on a gesture operation combining the touch operation determined from the coordinate detection information and the push-in operation determined from the pressure-sensitive detection information, outputs to the display control unit control information for displaying one layer selected from a plurality of layers related to the object, and determines the touch operation performed on the object through the one layer displayed on the display panel.
- An electronic apparatus according to the present invention includes the above-described user interface device and an electronic apparatus main body.
- The electronic apparatus main body performs predetermined processing based on the content instructed by a touch operation or gesture operation input from the control unit, and outputs the object on which the predetermined processing has been performed to the control unit of the user interface device.
- According to the present invention, the user can switch layers appropriately by performing an intuitive gesture operation that combines a touch operation and a push-in operation.
- The drawings include a schematic configuration diagram of a user interface device according to the first embodiment of the present invention, an explanatory diagram of the operating principle of the pressure-sensitive sensor according to the first embodiment, and a flowchart showing an example of a conventional menu item selection operation.
- FIG. 1 is an explanatory diagram showing the driver's seat and passenger seat of a vehicle equipped with the navigation device 1, viewed from the rear facing forward in the traveling direction.
- The navigation device 1 (an example of an electronic device) is installed in a position between the dashboard 5 and the meter panel 6, visible to the user holding the steering wheel 7.
- A shift lever 8 is provided below the navigation device 1.
- The navigation device 1 includes a navigation device body 2, which performs the processing necessary for navigation, and a user interface device 3.
- The navigation device body 2 is fitted into a recess formed in the dashboard 5 and cannot be seen directly by the user.
- The user interface device 3 is disposed in a position visible to the user; it conveys instructions input by the user to the navigation device body 2 and presents processing results from the navigation device body 2 to the user.
- The user interface device 3 is shaped to blend with the interior, from the dashboard 5, the meter panel 6, and the shift lever 8 to the base of the windshield 4.
- The user interface device 3 displays the information necessary for navigation processing (a map, various icons, etc.) output from the navigation device body 2.
- In the following description, the instruction medium touching the touch panel 20 is assumed to be the user's finger.
- An operation performed by the user moving the finger while touching the touch panel 20 is called a "touch operation".
- An operation in which the user presses the touch panel 20 with the finger is called a "push-in operation".
- An operation performed by combining a touch operation and a push-in operation is called a "gesture operation".
- By contrast, a conventional gesture operation is performed by touch operations alone.
- FIG. 2 is a block diagram illustrating an example of the internal configuration of the user interface device 3. Before describing the internal configuration of the user interface device 3, the operation of the navigation device body 2 will be described.
- The navigation device body 2 (an example of the electronic apparatus main body) performs predetermined processing based on the content instructed by a touch operation or gesture operation input from the control unit 10 of the user interface device 3. The navigation device body 2 then outputs the object on which the instructed processing has been performed to the control unit 10 of the user interface device 3.
- Objects output from the navigation device body 2 to the user interface device 3 include, for example, maps, character strings, icons, and images.
- For example, the navigation device body 2 performs navigation processing based on a gesture operation performed on the user interface device 3, outputs a map to be displayed by the user interface device 3, and edits character strings input from the user interface device 3.
- The user interface device 3 includes a control unit 10, a storage medium control unit 14, a storage medium 15, a communication control unit 16, a coordinate detection unit 17, a pressure-sensitive detection unit 18, and a display control unit 19, connected by a bus B.
- The user interface device 3 also includes a touch panel 20 connected to the coordinate detection unit 17, a pressure-sensitive sensor 30 connected to the pressure-sensitive detection unit 18, and a display panel 40 connected to the display control unit 19.
- The control unit 10 controls the operation of each unit in the user interface device 3.
- The control unit 10 determines that a touch operation has been performed based on the coordinate detection information input from the coordinate detection unit 17, and outputs the information input at the touch position to the navigation device body 2. The control unit 10 also determines, based on the pressure-sensitive detection information input from the pressure-sensitive detection unit 18, that a push-in operation pressing the touch panel 20 with the finger has been performed. In addition, the control unit 10 outputs to the display control unit 19 control information for displaying on the display panel 40 the object that the navigation device body 2 outputs to the user interface device 3.
- Based on a gesture operation combining a touch operation and a push-in operation, the control unit 10 outputs to the display control unit 19 control information for displaying on the display panel 40 one layer selected from the plurality of layers related to the object. The control unit 10 also determines the touch operation (such as the finger movement direction and the touch position) performed on the object through the one layer displayed on the display panel 40.
- The control unit 10 includes a CPU 11, a RAM 12, and a ROM 13, which work together to realize the functions of the control unit 10.
- The CPU (Central Processing Unit) 11 is an example of a computer that controls the operation of each unit in the user interface device 3. For example, the CPU 11 executes a program read from the ROM 13 and performs the processing related to the gesture operation according to the present embodiment.
- The RAM (Random Access Memory) 12 stores temporary data such as that of programs executed by the CPU 11.
- The ROM (Read Only Memory) 13 stores the programs read by the CPU 11 and the like.
- The ROM 13 is used as an example of a computer-readable non-transitory recording medium that stores a program executed by the CPU 11; the program is therefore stored permanently in the ROM 13.
- The computer-readable non-transitory recording medium storing the program executed by the CPU 11 may instead be a recording medium such as a CD-ROM or a DVD-ROM.
- The storage medium control unit 14 controls the storage medium 15.
- The storage medium control unit 14 writes data input from the control unit 10 to the storage medium 15, and reads data stored in the storage medium 15 according to instructions from the control unit 10 and outputs it to the control unit 10.
- The storage medium 15 is inserted into a slot or the like provided in the user interface device 3; data is written to it and read from it by the storage medium control unit 14.
- The storage medium 15 stores, for example, a navigation program that runs in the navigation device body 2 and upgrade data for a character editing program.
- The communication control unit 16 controls the data communication performed through the network N between the navigation device body 2 and the user interface device 3.
- The coordinate detection unit 17 detects the coordinates of the touch position on the touch panel 20 where a touch operation is performed.
- The touch panel 20 is formed in a planar rectangular shape. The positions of the intersections of the mutually crossing X and Y electrodes serve as coordinate information, and a value corresponding to the change in capacitance at each position is output to the coordinate detection unit 17.
- The coordinate detection unit 17 detects the coordinates of the location where the coordinate information input from the touch panel 20 has changed as the finger's touch position, and outputs coordinate detection information including those coordinates to the control unit 10.
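- The following is a minimal sketch, not taken from the patent, of how a coordinate detection unit of this kind might reduce per-intersection capacitance readings to a touch position. The grid shape, the baseline frame, the detection threshold, and the function name are all illustrative assumptions.

```python
# Hypothetical sketch of coordinate detection on a capacitive X/Y grid.
# The patent only states that values corresponding to capacitance changes
# at the electrode intersections are output to the coordinate detection
# unit 17; the baseline comparison and threshold below are assumptions.

def detect_touch(baseline, frame, threshold=5):
    """Return the (x, y) intersection with the strongest change, or None."""
    best, best_delta = None, threshold
    for y, (base_row, row) in enumerate(zip(baseline, frame)):
        for x, (base, value) in enumerate(zip(base_row, row)):
            delta = abs(value - base)  # change caused by the finger
            if delta > best_delta:
                best, best_delta = (x, y), delta
    return best

# Usage: a 4x3 grid in which the intersection at x=2, y=1 is touched.
baseline = [[100] * 4 for _ in range(3)]
frame = [row[:] for row in baseline]
frame[1][2] += 12
print(detect_touch(baseline, frame))  # -> (2, 1)
```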
- The pressure-sensitive detection unit 18 detects, via the pressure-sensitive sensor 30, that the touch panel 20 has been pressed with a finger.
- The pressure-sensitive sensor 30 is provided on the back surface of the touch panel 20 and outputs to the pressure-sensitive detection unit 18 a sensor value that changes according to the pressing force applied to the touch panel 20 by the finger. Based on the sensor value input from the pressure-sensitive sensor 30, the pressure-sensitive detection unit 18 detects that a push-in operation pressing the touch panel 20 with the finger has been performed, and outputs pressure-sensitive detection information to the control unit 10.
- The display control unit 19 performs control to display, on the display panel 40 formed in a planar rectangular shape, the icons, maps, and the like necessary for navigation, according to the control information input from the control unit 10.
- FIG. 3 is a schematic configuration diagram of the user interface device 3.
- The positions of the pressure-sensitive sensors 30 installed under the touch panel 20, viewed from above, and the position of the frame of the housing 55 are indicated by broken lines.
- Six pressure-sensitive sensors 30 are provided on the back surface of the touch panel 20, on the frame of the housing 55 of the user interface device 3 that houses the touch panel 20. The pressure-sensitive sensors 30 are therefore not directly visible to the user.
- A plurality of pressure-sensitive sensors 30 is usually provided, but this is not essential; there may be only one.
- The top plate 50 protects the surface of the touch panel 20.
- A transparent glass substrate, a film, or the like is used for the top plate 50.
- The surface of the top plate 50 is the input operation surface 51, which the user touches with a finger to perform a touch operation.
- The touch panel 20 is configured, for example, by laminating a transparent X electrode substrate 21, an adhesive layer 22, and a Y electrode substrate 23 in this order.
- The top plate 50 and the X electrode substrate 21 are bonded and fixed by an adhesive layer 52.
- Each of the X electrode substrate 21 and the Y electrode substrate 23 has a rectangular shape.
- The X electrode substrate 21 and the Y electrode substrate 23 are bonded by the adhesive layer 22.
- The area where the X-direction detection electrodes (not shown) formed on the X electrode substrate 21 and the Y-direction detection electrodes (not shown) formed on the Y electrode substrate 23 overlap in a plane is the coordinate detection area in the XY plane.
- The pressure-sensitive sensors 30 are disposed in the peripheral area (frame) outside the coordinate detection area in the XY plane of the touch panel 20.
- The pressure-sensitive sensor 30 includes an elastic body 33 made of a dielectric material disposed between the touch panel 20 and the housing 55, and an upper electrode 31 and a lower electrode 35 disposed so as to sandwich the elastic body 33 and form a capacitor.
- The pressure-sensitive sensor 30 further includes an adhesive layer 32 that bonds and fixes the elastic body 33 and the upper electrode 31, and an adhesive layer 34 that bonds and fixes the elastic body 33 and the lower electrode 35.
- The elastic bodies constituting the six pressure-sensitive sensors 30 are connected to form one frame-shaped elastic body 33, so the six pressure-sensitive sensors 30 share one elastic body 33.
- By providing the elastic body 33 in a frame shape, foreign matter such as dust is prevented from entering the gap 41 between the touch panel 20 and the housing 55, that is, between the touch panel 20 and the display panel 40.
- Likewise, the lower electrodes constituting the six pressure-sensitive sensors 30 are connected to form one frame-shaped lower electrode 35, so the six pressure-sensitive sensors 30 share one lower electrode 35.
- The upper electrode 31 may also be formed in a frame shape like the lower electrode 35.
- For the elastic body 33, a material with small residual strain and a high restoration rate (restoration speed) is used, for example.
- Examples of materials used for the elastic body 33 include silicone rubber and urethane rubber.
- The elastic body 33 may be displaced by at most about 10% of its original height, for example.
- For example, an elastic body 33 with a thickness of 0.5 mm used for the pressure-sensitive sensor 30 may be displaced by about 10 μm.
- When the touch panel is pressed, the top plate 50 and the touch panel 20, to which the pressure-sensitive sensors 30 are bonded and fixed, move in the pressing direction, and the elastic body 33 of each pressure-sensitive sensor 30 is strained.
- When the pressure-sensitive sensor 30 is pressed, its thickness is displaced in the pressing direction.
- Because the back surface of the touch panel 20 approaches the front surface of the display panel 40 by the amount of this displacement, the gap 41 is provided between the touch panel 20 and the display panel 40 to allow for the movement of the touch panel 20.
- The gap 41 is not provided when the touch panel 20 and the display panel 40 are bonded together.
- The pressure-sensitive sensor 30 outputs to the pressure-sensitive detection unit 18 a sensor value corresponding to the capacitance of the capacitor formed by the upper electrode 31 and the lower electrode 35.
- FIG. 4 is a diagram for explaining the operating principle of the pressure-sensitive sensor 30.
- In FIG. 4, the touch panel 20 is omitted.
- An example of the pressure-sensitive sensor 30 not being pressed by a finger is shown at the upper left of FIG. 4, and an example of the pressure-sensitive sensor 30 being pressed by a finger is shown at the upper right of FIG. 4.
- When the sensor is pressed, the elastic body 33 is strained so that its thickness decreases.
- The capacitance of the pressure-sensitive sensor 30 changes because the thickness of the pressed sensor differs from that of the unpressed sensor.
- Using the rate of change in capacitance between the upper electrode 31 and the lower electrode 35 caused by the displacement d of the elastic body 33, the pressure-sensitive detection unit 18 detects that the touch panel 20 has been pressed and that the pressure-sensitive sensor 30 has sensed pressure.
- The capacitance change rate is obtained by the pressure-sensitive detection unit 18 based on the sensor values output from the upper electrode 31 and the lower electrode 35.
- The sensor value is a voltage determined by the capacitance between the upper electrode 31 and the lower electrode 35. That is, the capacitance change rate is obtained as the ratio of the capacitance between the upper electrode 31 and the lower electrode 35 of the pressed pressure-sensitive sensor 30 to the capacitance between those electrodes of the unpressed sensor.
- The pressure-sensitive detection unit 18 can thus obtain the change in capacitance between the upper electrode 31 and the lower electrode 35, that is, the capacitance change rate, based on the sensor value input from the pressure-sensitive sensor 30.
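- Written out as formulas (the symbols here are chosen for illustration and do not appear in the patent), one definition of the change rate that is consistent with the 2.0% and 6.0% values quoted below for FIG. 4 is:

```latex
% C_0: capacitance of the unpressed sensor; C_p: capacitance when pressed.
% For an ideal parallel-plate capacitor of plate area A and permittivity
% \varepsilon, compressing the elastic body from thickness t_0 by the
% displacement d gives:
\[
C_0 = \frac{\varepsilon A}{t_0}, \qquad
C_p = \frac{\varepsilon A}{t_0 - d}, \qquad
\text{change rate} = \frac{C_p - C_0}{C_0} \times 100\ [\%].
\]
```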
- The pressure-sensitive detection unit 18 converts the sensor value detected by each pressure-sensitive sensor 30 disposed on the back surface of the touch panel 20 into the pressing force applied to the touch panel 20, based on the graph at the bottom of FIG. 4 showing the relationship between the capacitance change rate and the pressing force. When the pressing force exceeds a pressing threshold, the pressure-sensitive detection unit 18 determines that the user has consciously pressed the touch panel 20 and outputs pressure-sensitive detection information to the control unit 10.
- The pressure-sensitive detection unit 18 may obtain the pressing force based on the total of the capacitance change rates detected by the individual pressure-sensitive sensors 30. This makes it possible to detect the pressing force with high accuracy regardless of the touch position on the input operation surface 51. The pressure-sensitive detection unit 18 may also obtain the pressing force from the average value, obtained by dividing the total of the capacitance change rates by the number of pressure-sensitive sensors 30.
- The pressure-sensitive detection unit 18 may output different pressure-sensitive detection information to the control unit 10 depending on the magnitude of the pressing force. For example, in the lower graph of FIG. 4, let the pressing force at which the capacitance change rate is 2.0% be a threshold th1 (an example of the first pressing threshold), and the pressing force at which the capacitance change rate is 6.0% be a threshold th2 (an example of the second pressing threshold). If the pressing force is less than the threshold th1, the capacitance is considered to have changed merely because of vibration applied to the user interface device 3 through the housing 55; that is, it is assumed that the user did not intentionally push the finger into the touch panel 20.
- In this case, the pressure-sensitive detection unit 18 does not determine that the user pushed the finger into the touch panel 20, and therefore does not output pressure-sensitive detection information. If the pressing force is greater than or equal to the threshold th1, the pressure-sensitive detection unit 18 determines that the user pushed the finger into the touch panel 20 and outputs first pressure-sensitive detection information. Furthermore, if the pressing force is greater than or equal to the threshold th2, the pressure-sensitive detection unit 18 determines that the user pushed the finger strongly into the touch panel 20 and outputs second pressure-sensitive detection information.
- In this way, based on the pressing force obtained from the capacitance change rate, which varies with how far the user pushes the finger into the touch panel 20, the pressure-sensitive detection unit 18 can output the first or second pressure-sensitive detection information to the control unit 10. The control unit 10 can then perform different processing according to whether the first or second pressure-sensitive detection information is input from the pressure-sensitive detection unit 18. The pressure-sensitive detection information may also be only one of the first and second pressure-sensitive detection information. When the first and second pressure-sensitive detection information need not be distinguished, they are simply referred to as "pressure-sensitive detection information".
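- A minimal sketch of the two-threshold decision just described, assuming the thresholds are expressed directly as capacitance change rates (the 2.0% and 6.0% figures from FIG. 4); the function name and return values are illustrative, not from the patent.

```python
# Hypothetical two-threshold push-in detector. TH1/TH2 follow the 2.0% and
# 6.0% capacitance change rates quoted for FIG. 4; everything else is an
# assumption for illustration.

TH1 = 2.0  # first pressing threshold th1, as a change rate in percent
TH2 = 6.0  # second pressing threshold th2, as a change rate in percent

def classify_push(change_rates):
    """Map per-sensor change rates to pressure-sensitive detection info.

    The patent notes the pressing force may be derived from the total or
    the average of the per-sensor change rates; the average is used here.
    """
    avg = sum(change_rates) / len(change_rates)
    if avg < TH1:
        return None                 # vibration etc.; not an intentional push
    if avg < TH2:
        return "first detection"    # finger pushed into the touch panel
    return "second detection"       # finger pushed in strongly

print(classify_push([0.4, 0.6, 0.5, 0.4, 0.7, 0.5]))  # None
print(classify_push([2.5, 3.0, 2.8, 2.6, 3.1, 2.9]))  # first detection
print(classify_push([7.0, 6.5, 6.8, 7.2, 6.9, 7.1]))  # second detection
```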
- When pressure-sensitive detection information is input from the pressure-sensitive detection unit 18, the control unit 10 determines that the finger has been pushed into the touch panel 20 and that a confirmation operation has been performed by the user. At this time, to inform the user of the pressed state, the control unit 10 vibrates the housing 55 or displays a message on the display panel 40. A voice guide may also be emitted from a speaker (not shown). Alternatively, the control unit 10 can display an icon such as a circle or a square at the pressed touch position; in this case, the icon may blink or its display color may change.
- The housing 55 can also generate a click sound or change the feel under the finger when the touch panel 20 is pushed in.
- To explain the specific content of the gesture operation, each step of the conventional flowchart shown in FIG. 5 and of the flowchart according to the first embodiment of the present invention shown in FIG. 6 is accompanied by a display example of the screen shown on the display panel of the user interface device 100 or 3, respectively.
- In each screen, the instruction icon with which the user designates a menu item through the touch panel is represented by a black arrow.
- A conventional user interface device includes a conventional touch panel, or a mouse, that can detect only the XY coordinates of the touch position. Suppose the user selects a menu item from a main menu arranged on a menu bar using such a device. After selecting the main menu with the instruction icon, the user selects the target menu item from the plurality of menu items belonging to the layer below that main menu. However, if the instruction icon strays outside the menu item display area while the user moves it toward the desired menu item, the menu items disappear, and the user has to display them again by selecting the main menu once more.
- FIG. 5 is a flowchart showing an example of a conventional menu item selection operation.
- First, a control unit (not shown) provided in the conventional user interface device 100 detects that a finger has touched the touch panel provided in the user interface device 100 (S1).
- In the following, the control unit of the conventional user interface device is referred to simply as the "control unit", without a reference numeral.
- Next, the control unit determines whether the user has performed a menu operation for selecting the main menu from the menu bar (S2).
- If the control unit determines that no menu operation has been performed (NO in S2), the operation performed on the touch panel is one that moves the instruction icon somewhere other than the menu bar. The control unit therefore moves the instruction icon in accordance with the movement direction of the user's finger (S3) and then ends this process.
- If it is determined in step S2 that a menu operation has been performed (YES in S2), the menu items contained in the layer below the main menu indicated by the instruction icon are displayed in a pull-down list as the first menu m1 (S4).
- The screen shown for step S4 illustrates the state in which, as a result of the user selecting the FILE menu as the main menu from the menu bar at the upper left of the screen of the conventional user interface device 100, the first menu m1 and the second menu m2 in the layer below the FILE menu are displayed in a pull-down list. Both the first menu m1 and the second menu m2 are groups of menu items.
- The menu items of the first menu m1 include, for example, "NEW" and "OPEN", and the menu items of the second menu m2 include, for example, "PRINT".
- In this example the second menu m2 is displayed at the same time below the first menu m1, but on a small display panel the second menu m2 is not displayed unless a drop-down button displayed below the first menu m1 is clicked or the screen is scrolled.
- Next, the user decides whether to select the second menu m2 displayed in the pull-down list (S5).
- If the second menu m2 is not selected (NO in S5), the user selects a menu item of the first menu m1 (S6).
- When the user then lifts the finger off the touch panel, the control unit confirms the selected menu item and ends the process.
- If the user selects the second menu m2 in step S5 (YES in S5), the user moves the instruction icon to display the second menu m2 (S7).
- The screen shown for step S7 illustrates the user selecting "PRINT", contained in the second menu m2, from the FILE menu items.
- When the menu item of the second menu m2 is selected and the finger is lifted off the touch panel, the control unit confirms the selected menu item (S8) and ends this process.
- In this way, to reach the menu item that is ultimately required, the user had to repeatedly move the instruction icon onto the menu items displayed by selecting lower-layer menus.
- If the moved instruction icon strayed off a menu item, the selection of the main menu and the lower-layer menu became invalid, so the instruction icon had to be moved accurately.
- Moreover, to select the desired menu item from a pull-down list containing many items, the user had to move the finger a long way, and it was easy to make an operation mistake, for example by lifting the finger off the touch panel at a position different from the intended menu item.
- FIG. 6 is a flowchart illustrating an example of a menu item selection operation according to the first embodiment.
- In the first embodiment, a target menu item can be selected from the layers related to menu items by a simple operation using one finger. That is, when the touch panel 20 is pushed in by the finger, the control unit 10 according to the first embodiment can output control information for displaying one layer out of a plurality of layers containing different menu items.
- Here, the layers include a first layer and a second layer.
- The first layer contains the first menu m1, which is displayed when a menu operation is performed on the touch panel 20.
- The second layer contains the second menu m2, which is displayed when the touch panel 20 is pushed with the finger on the first menu m1.
- The gesture operation here is an operation in which the user pushes the finger into the touch panel 20 on the first menu m1 to display the second menu m2, and then selects an arbitrary menu item from the second menu m2.
- In the first embodiment, displaying on the display panel 40 one layer selected from a plurality of layers related to an object means that, when an object (for example, a file, a figure, or a character string) has a plurality of lower layers under a layer (for example, the main menu), one lower layer is selected from the plurality of lower layers and the menu items contained in that lower layer are displayed.
- First, when the user touches the touch panel 20 with a finger, the coordinate detection unit 17 detects the touch (S11) and detects the coordinates of the touch position.
- Next, it is determined whether the user, with the finger touching the touch panel 20, has performed a menu operation for selecting the main menu from the menu bar (S12). If the user does not perform a menu operation (NO in S12), the operation performed on the touch panel 20 is one that moves the instruction icon somewhere other than the menu bar. The control unit 10 therefore moves the instruction icon in accordance with the movement of the user's finger and then ends this process.
- If the menu operation has been performed (YES in S12), the control unit 10 determines, based on the coordinate detection information input from the coordinate detection unit 17, that the finger has moved to the menu bar at the top of the screen. For example, when the finger moves to the FILE menu on the menu bar, the control unit 10 outputs to the display control unit 19 control information for displaying only the first menu m1 on the display panel 40 (S14). At this time, only the first menu m1, which is the layer below the main menu, is displayed in the pull-down list.
- The screen shown for step S14 illustrates the state in which the user has selected the FILE menu from the menu bar at the upper left of the screen by a touch operation, and only the menu items of the first menu m1 ("NEW", "OPEN", etc.) are displayed in the pull-down list.
- Next, the user decides whether to push the finger into the touch panel 20.
- The pressure-sensitive detection unit 18 determines whether the pressure-sensitive sensor 30 has sensed pressure (S15).
- If no pressure is sensed (NO in S15), the user selects a menu item of the first menu m1 and lifts the finger off the touch panel 20.
- The control unit 10 then confirms the selected menu item (S16) and ends this process.
- If pressure is sensed (YES in S15), the control unit 10 determines that the touch panel 20 has been pushed in by the finger on the first menu m1. The control unit 10 then outputs to the display control unit 19 control information for displaying the second menu m2 near the touch position where the finger was pushed in.
- As a result, the second menu m2 is displayed at a position overlapping the first menu m1 already displayed on the display panel 40 (S17). The user can therefore select a menu item of the second menu m2 with almost no finger movement from the position where the touch panel 20 was pushed (S18). When the menu item of the second menu m2 is selected and the finger is lifted off the touch panel 20, the control unit 10 confirms the selected menu item and ends this process. A sketch of this flow is given below.
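- The flow of steps S11 to S18 could be expressed as a small event handler like the following sketch; the event names and the menu contents are illustrative assumptions, not part of the patent.

```python
# Hypothetical event-driven sketch of the S11-S18 flow: touching the menu
# bar shows the first menu m1; a push-in overlays the second menu m2 near
# the touch position; lifting the finger confirms the item under it.

MENU_M1 = ["NEW", "OPEN"]  # example items quoted in the description
MENU_M2 = ["PRINT"]        # example items quoted in the description

def run_gesture(events):
    shown, selected = None, None
    for kind, value in events:
        if kind == "touch_menu_bar":    # S12/S14: show the first menu m1
            shown = MENU_M1
        elif kind == "push_in":         # S15/S17: overlay the second menu m2
            shown = MENU_M2
        elif kind == "move" and shown:  # finger rests on a menu item
            selected = shown[value]
        elif kind == "release":         # S16/S18: lifting confirms the item
            return selected
    return None

events = [("touch_menu_bar", "FILE"), ("push_in", None),
          ("move", 0), ("release", None)]
print(run_gesture(events))  # -> PRINT
```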
- According to the first embodiment described above, the user can display the first menu m1 or the second menu m2 and select a desired menu item through the simple gesture operation of pushing the touch panel 20 with one finger. The user can therefore handle this gesture operation intuitively, compared with the large instruction-icon movements needed to select a conventional menu item.
- Since the second menu m2 is displayed over the first menu m1, the user's line-of-sight movement is also reduced.
- Since the finger's movement distance is short and the instruction icon need not be moved, the user can not only reduce operation errors but also switch quickly between a plurality of layers.
- Note that the navigation device body 2 may output control information to the user interface device 3 so that the user interface device 3 cannot be operated while the vehicle is traveling. Conversely, control information that allows the user interface device 3 to be operated while the vehicle is stopped may be output to the user interface device 3.
- While the vehicle is stopped, the first menu m1 and the second menu m2 may then be displayed using the user interface device 3 according to the first embodiment.
- Although the second menu m2 here consists of the menu items from "PRINT" onward, it can be changed to arbitrary menu items. It is also possible to set three or more lower layers for one menu. In this case, the layers can be switched and displayed in order by pushing the touch panel 20 at the position where the user selected the menu. For example, while the first menu m1 is displayed, the second menu m2 may be displayed if the finger pressing force on the touch panel 20 is at least the threshold th1, and another menu may be displayed instead of the second menu m2 if the pressing force is at least the threshold th2. The user can thus select the layer containing the target menu item from a plurality of layers.
- The target layer may also be made selectable by the user using only the threshold th1, as follows.
- In this case, the lower layers include a third menu in addition to the first menu m1 and the second menu m2 described above.
- When the touch panel 20 is pushed with a pressing force of at least the threshold th1 while the first menu m1 is displayed, the display switches to the second menu m2.
- When the touch panel 20 is pushed again with a pressing force of at least the threshold th1, the display switches from the second menu m2 to the third menu.
- When the touch panel 20 is pushed once more with a pressing force of at least the threshold th1, the display switches from the third menu back to the first menu m1.
- In this way, the first menu m1, the second menu m2, and the third menu can be displayed cyclically, as sketched below.
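- A sketch of this cyclic switching, assuming each qualifying push-in (pressing force at least th1) advances the displayed lower layer by one; the class name and menu labels are illustrative.

```python
# Hypothetical cyclic lower-layer switching: each push-in detected with a
# pressing force of at least th1 advances the displayed menu by one,
# wrapping from the third menu back to the first menu m1.

class MenuCycler:
    def __init__(self, menus):
        self.menus = menus
        self.index = 0  # first menu m1 displayed initially

    def push_in(self):
        """Called whenever a push-in with force >= th1 is detected."""
        self.index = (self.index + 1) % len(self.menus)
        return self.menus[self.index]

cycler = MenuCycler(["first menu m1", "second menu m2", "third menu"])
for _ in range(4):
    print(cycler.push_in())
# second menu m2 -> third menu -> first menu m1 -> second menu m2
```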
- A menu selection area in which the user can select various menus may also be displayed along any one side of the display panel 40.
- In this case, the menu items corresponding to the menu displayed in the menu selection area may be displayed on the display panel 40.
- In a conventional user interface device equipped with a touch panel that can detect only the XY coordinates of the touch position, for the user to perform different work on each layer, the target layer had to be selected from a menu item listing the switching-destination layers.
- Below, an operation example using the conventional user interface device 100 and an operation example using the user interface device 3 are described with reference to FIGS. 7 and 8.
- In FIGS. 7 and 8, the screen of the user interface device 100 or 3 and the front view, bottom view, and left side view of the device at a given moment are shown together within a rectangular broken line.
- A white arrow in a screen indicates the moving direction of the finger.
- FIG. 7 is a flowchart showing an example of a conventional layer switching operation. The processing in steps S21 and S22 of FIG. 7 is the same as that in steps S1 and S2 of FIG. 5 described for the first embodiment, so a detailed description is omitted.
- If it is determined in step S22 that a menu operation has been performed (YES in S22), the control unit displays menu items for switching the layer at the position touched by the user's finger (S23).
- The screen shown for step S23 illustrates the user selecting the layer switching menu from the menu bar at the upper left of the screen, with the names of the switching-destination layers contained in that menu displayed as menu items in a pull-down list.
- The "A" displayed at the center of the screen indicates that the name of the currently displayed layer is "Layer A".
- Here, the menu items displayed in the pull-down list are the three layers A, B, and C.
- Next, the user selects the menu item of the layer to switch to (S24).
- The control unit determines that the switching-destination layer has been selected from the menu items displayed in the pull-down list.
- The screen shown for step S24 illustrates the user moving the finger along the pull-down list and selecting the menu item of layer B.
- The control unit switches to the layer of the menu item selected by the user (S25). When the user then lifts the finger off the touch panel, the control unit confirms and displays the switched layer (S26) and ends this process.
- The screen shown for step S26 illustrates layer B being displayed when the user lifts the finger off the touch panel.
- FIG. 8 is a flowchart illustrating an example of a layer switching operation according to the second embodiment.
- In the second embodiment, layers can be switched by a simple operation using one finger. That is, when the touch panel 20 is pushed in by the finger, the control unit 10 outputs to the display control unit 19 control information for switching the object from one layer to another single operable layer displayed on the display panel 40.
- The gesture operation here is an operation in which the user pushes the finger into the touch panel 20 to switch the layer, and lifts the finger off the touch panel 20 to confirm the switched layer.
- In the second embodiment, displaying on the display panel 40 one layer selected from a plurality of layers related to an object means that when there are different layers A, B, and C, for example, one layer B is selected from the layers A, B, and C, and the object is displayed in layer B so as to be editable. Layers A, B, and C are all layers related to the object being edited.
- First, when the user touches the touch panel 20 with a finger, the coordinate detection unit 17 detects the touch (S31) and detects the coordinates of the touch position.
- The screen shown for step S31 illustrates the user touching the left side of the screen with one finger. At this time, layer A is displayed on the user interface device 3.
- The screen shown for step S32 illustrates the user touching the upper left of the screen with one finger and then pushing the touch panel 20 in. The push is indicated by the downward arrow representing the finger's movement direction with respect to the touch panel 20.
- Next, the pressure-sensitive detection unit 18 determines, based on the sensor value output from the pressure-sensitive sensor 30, whether the pressure-sensitive sensor 30 has sensed pressure (S33). If the pressure-sensitive detection unit 18 determines from the sensor value that no pressure has been sensed (NO in S33), the control unit 10 ends this process.
- If the pressure-sensitive detection unit 18 determines that pressure has been sensed (YES in S33), the control unit 10 switches the layer and outputs to the display control unit 19 control information for displaying the switched layer on the display panel 40 (S34).
- The screen shown for step S34 illustrates the display switching from layer A to layer B when the user pushes the finger into the touch panel 20.
- The screen shown for step S35 illustrates layer B being confirmed and displayed when the user lifts the finger off the touch panel 20. Thereafter, when the user pushes the finger into the touch panel 20 again, layer C, switched from layer B, is displayed on the user interface device 3, and the user can confirm layer C.
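- The second embodiment's push-to-switch, release-to-confirm behavior could be sketched as follows; the layer names A, B, and C follow the description above, while the event format is an assumption.

```python
# Hypothetical sketch of the second embodiment's gesture: each push-in
# switches to the next layer (S34), and lifting the finger confirms the
# currently displayed layer (S35).

LAYERS = ["A", "B", "C"]

def switch_layers(events, layers=LAYERS):
    current = 0  # layer A is displayed initially
    for event in events:
        if event == "push_in":    # S33/S34: switch to the next layer
            current = (current + 1) % len(layers)
            print("showing layer", layers[current])
        elif event == "release":  # S35: confirm the displayed layer
            print("confirmed layer", layers[current])
    return layers[current]

switch_layers(["push_in", "release", "push_in", "push_in", "release"])
# showing B, confirmed B, showing C, showing A, confirmed A
```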
- According to the second embodiment described above, there is no need to select the layer switching menu from the menu bar and then select the layer's menu item as in the conventional operation; the layer can be switched with only a slight movement of the finger.
- Since the finger hardly needs to move, the user can not only reduce operation errors but also switch quickly between a plurality of layers.
- As application software that uses layers, there is, for example, image editing software that allows characters, figures, and the like to be drawn by hand on photos.
- This software can edit images using a photo layer and an image editing layer.
- While the image editing layer is displayed, the user can push the touch panel 20 to switch from the image editing layer to the photo layer and then change the display position of the photo, change its orientation, or scale it. Layers can thus be switched by an intuitive operation, without selecting the target layer from a layer list as in the prior art.
- The layer switched to when the touch panel 20 is pushed may alternate. For example, while the image editing layer is displayed, pushing the touch panel 20 with a finger switches the display to the photo layer; next, while the photo layer is displayed, pushing the touch panel 20 with a finger may switch the display back to the image editing layer.
- The control unit 10 may also output to the display control unit 19 control information for switching through the plurality of layers in order and displaying them on the display panel 40 according to the duration for which the finger keeps pushing the touch panel 20.
- The order in which the layers are switched and displayed is cyclic, for example layers A, B, C, A, B, C, and so on. As a result, even if an unintended layer is displayed, the user can reach the target layer simply by keeping the finger pushed in, because the layers are switched in order.
- The layers may also be switched according to how far the user pushes the finger into the touch panel 20. For example, while the image editing layer is displayed, the photo layer may be displayed when the finger pressing force on the touch panel 20 is at least the threshold th1, and another layer (for example, a background layer) may be displayed when the pressing force is at least the threshold th2. The user can thus select the layer containing the target item from a plurality of layers.
- A gesture operation may also be assigned to a slide show function.
- For example, each time the touch panel 20 is pushed in, the next slide may be displayed. The user can then run the slide show smoothly without searching for a button for displaying the next slide.
- FIG. 9 is a flowchart showing an example of a conventional shortcut icon selection operation.
- First, the control unit provided in the conventional user interface device 100 detects that a finger has touched the touch panel provided in the user interface device 100 (S41).
- The screen shown for step S41 illustrates the user touching the left side of the screen with one finger.
- Next, the control unit determines that the map display selection icon has been selected by the user (S42).
- The screen shown for step S42 illustrates the map being displayed as a result of the user selecting, with a finger, the map display selection icon shown at the upper right of the screen.
- Next, the control unit determines that the destination selection icon has been selected by the user (S43).
- The destination selection icon is an icon for displaying the function-specific icons used by the user to select and input a destination for the vehicle.
- On the screen shown for step S43, a map is displayed, and the user is selecting, with a finger, the destination selection icon shown at the upper left of the screen.
- Next, the control unit determines that the home icon has been selected by the user (S44).
- The screen shown for step S44 illustrates an example in which a plurality of function-specific icons is displayed for the destination selection icon.
- These function-specific icons have shortcut functions for realizing a target function with a simple operation (for example, one click), and are displayed in a rectangular frame at the lower right of the destination selection icon.
- The function-specific icons displayed for the destination selection icon are a list icon (upper right of the frame) indicating that a plurality of pre-registered destinations is displayed as a list, a search icon indicating that a new destination is to be searched for, and a home icon (lower right of the frame).
- When the home icon is selected, the home address is automatically determined as the destination, without the home address having to be entered with a software keyboard or the like.
- Next, the control unit determines that the user has confirmed the selection of the home icon (S45).
- On the screen shown for step S45, a YES button and a NO button for confirming or cancelling the selected home icon are displayed, and the user selects the YES button with a finger.
- As a result, the preset home address is determined as the destination.
- FIG. 10 is a flowchart illustrating an example of a shortcut icon selection operation according to the third embodiment.
- In the third embodiment, a necessary function can be selected by a simple operation using one finger. That is, when the touch panel 20 is pushed with a finger while a map or the like (an example of the first layer) is displayed, the control unit 10 outputs to the display control unit 19 control information for displaying shortcut icons (an example of the second layer) containing shortcut menus to which predetermined functions are assigned. The control unit 10 then executes the function assigned to the shortcut menu selected with the finger while the touch panel 20 remains pushed in.
- The gesture operation here is an operation in which the user pushes the finger into the touch panel 20 to display the shortcut icons, and selects a shortcut icon while keeping the finger pushed into the touch panel 20.
- In the third embodiment, displaying on the display panel 40 one layer selected from a plurality of layers related to an object means, for example, displaying the second layer of shortcut icons over the first layer displaying a map (the object).
- The processing in steps S51 to S53 of FIG. 10 is the same as that in steps S31 to S33 of FIG. 8 described for the second embodiment, so a detailed description is omitted.
- When it is determined in step S53 that the pressure-sensitive detection unit 18 has sensed pressure based on the sensor value (YES in S53), the control unit 10 outputs to the display control unit 19 control information for displaying shortcut icons on the display panel 40 (S54).
- The screen shown for step S54 illustrates the state in which, when the user pushes the finger into the touch panel 20, shortcut icons with four types of shortcut functions are arranged and displayed in a cross shape centered on the finger's touch position.
- These shortcut icons represent, clockwise from the left, a home icon, a menu icon, a setting icon, and a telephone icon.
- The home icon has a function of setting the home address as the destination, the menu icon a function of displaying various menus, the setting icon a function of making arbitrary settings, and the telephone icon a function of placing a call to a registered telephone number.
- The screen shown for step S55 illustrates the home icon being selected while the user's finger remains pushed into the touch panel 20.
- The control unit 10 then executes the shortcut function corresponding to the shortcut icon selected by the user; here, the home address is set as the destination.
- According to the third embodiment described above, shortcut icons are displayed by the simple gesture operation of pushing the touch panel 20 with one finger, and the user can select a shortcut icon by moving the finger while it remains pushed into the touch panel 20. The target function can therefore be executed quickly without performing multiple touch operations as in the prior art. Moreover, since this gesture operation is intuitive and simple, selecting a shortcut icon is easy.
- As a modification, the control unit 10 may output to the display control unit 19 control information for switching through a plurality of shortcut icons in order and displaying them on the display panel 40 according to the duration for which the finger keeps pushing the touch panel 20.
- The order in which the shortcut icons are switched and displayed is cyclic, for example home icon, menu icon, setting icon, telephone icon, home icon, and so on. Thus, even if an unintended shortcut icon is displayed, the user can select the target shortcut icon simply by keeping the finger pushed in, because the icons are switched in order (for example, every second); see the sketch below.
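- A sketch of this hold-to-cycle behavior; the one-second interval follows the "for example, every second" remark above, and the icon list follows the four icons described for FIG. 10. The function name is illustrative.

```python
# Hypothetical hold-to-cycle shortcut icons: while the finger stays pushed
# in, the displayed icon advances at a fixed interval; releasing over an
# icon would then execute its function.

ICONS = ["home", "menu", "setting", "telephone"]  # icons from the text
CYCLE_INTERVAL = 1.0  # seconds; the "every second" example above

def icon_shown_after(hold_seconds, icons=ICONS, interval=CYCLE_INTERVAL):
    """Return the icon displayed after holding the push for this long."""
    steps = int(hold_seconds // interval)
    return icons[steps % len(icons)]

print(icon_shown_after(0.5))  # home
print(icon_shown_after(2.3))  # setting
print(icon_shown_after(4.1))  # home (cycled all the way around)
```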
- The shortcut icons may also remain displayed after the finger is pulled back.
- In this case, when the pressing force detected by the pressure-sensitive detection unit 18 is at least the threshold th1, the control unit 10 outputs to the display control unit 19 control information for displaying the shortcut icons on the display panel 40.
- The control unit 10 then keeps the shortcut icons displayed on the display panel 40 even when the pressing force detected by the pressure-sensitive detection unit 18 falls below the threshold th1 as the user pulls the finger back.
- When the finger is lifted off, the control unit 10 performs control to hide the shortcut icons from the display panel 40. The user can therefore easily move on to another operation in the middle of selecting a shortcut icon.
- The shortcut icons may also be switched and displayed in order each time the push-in is repeated, so that the target shortcut icon can be selected. For example, after one shortcut icon is displayed, when the user pushes the finger in again, the pressure-sensitive detection unit 18 detects that the pressing force has remained at or above the threshold th1 for a predetermined time, and the control unit 10 displays another shortcut icon on the display panel 40. Thereafter, each time the user repeats the operation of pushing the finger into the touch panel 20 and pulling it back, the control unit 10 switches to and displays a different shortcut icon on the display panel 40. The shortcut icon displayed on the display panel 40 is selected cyclically from the plurality of shortcut icons.
- A plurality of icons with shortcut functions may also be displayed in order according to how strongly the user pushes the touch panel 20 with the finger. For example, while the map is displayed, the home icon may be displayed when the finger pressing force on the touch panel 20 is at least the threshold th1, and the menu icon when it is at least the threshold th2. The user can thus select the target icon from a plurality of icons without moving the finger pushed into the touch panel 20.
- The user interface device 3 according to each embodiment described above may also be applied to devices other than navigation devices. For example, operability can be improved by applying the user interface device 3 to the touch panel portion of a mobile terminal, a tablet terminal, or the like. In this way, the user interface device 3 according to the present embodiment can be combined with various electronic devices that require a touch panel.
- In each embodiment, the user interface device 3 includes the control unit 10, but the navigation device body 2 may include the control unit 10 instead.
- The touch panel 20 may also be configured to detect touch operations by a method other than the capacitance method. The push-in of the touch panel 20 may also be detected by a press switch or the like provided below the touch panel 20 instead of the pressure-sensitive sensor 30.
- Operations combined with the pressure-sensitive function of the user interface device 3 may be used, for example, for pedestrian or bicycle navigation. The user interface device 3 may also be used to operate image editing software as described above, or to operate other application software.
- In each embodiment, the navigation device body 2 and the user interface device 3 are combined; however, if the user interface device 3 itself is given a navigation function, the user interface device 3 alone may be used as the navigation device.
- The pressure-sensitive sensor 30 may also be configured without the elastic body 33. For example, even if the elastic body 33 is removed from the pressure-sensitive sensor 30, when a pressing force is applied to the pressure-sensitive sensor 30 with the upper electrode 31 and the lower electrode 35 held a constant distance apart, the upper electrode 31 approaches the lower electrode 35 and the capacitance between the two electrodes changes. The pressure-sensitive detection unit 18 can therefore obtain the capacitance change rate based on the sensor values output from the upper electrode 31 and the lower electrode 35.
- The present invention is not limited to the embodiments described above, and various other applications and modifications are of course possible without departing from the gist of the present invention set forth in the claims.
- The configuration of the device is described in detail and concretely in order to explain the present invention in an easy-to-understand manner, and the invention is not necessarily limited to a configuration having all of the described elements.
- The control lines and information lines shown are those considered necessary for the explanation; not all control lines and information lines in a product are necessarily shown. In practice, almost all components may be considered to be connected to one another.
- Explanation of symbols: 1 ... Navigation device; 2 ... Navigation device body; 3 ... User interface device; 10 ... Control unit; 17 ... Coordinate detection unit; 18 ... Pressure-sensitive detection unit; 19 ... Display control unit; 20 ... Touch panel; 30 ... Pressure-sensitive sensor; 40 ... Display panel
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Instructional Devices (AREA)
- Position Input By Displaying (AREA)
Abstract
The invention concerns a user interface device provided with a control unit that outputs, to a display control unit, control information for displaying, on a display panel, one layer selected from a plurality of layers associated with an object, based on a gesture operation that combines a touch operation determined from coordinate detection information and a push operation determined from pressure-sensitive detection information. In addition, the control unit determines a touch operation performed on the object through the layer displayed on the display panel.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2016-254802 | 2016-12-28 | ||
| JP2016254802A JP2018106574A (ja) | 2016-12-28 | 2016-12-28 | ユーザーインターフェイス装置及び電子機器 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018123355A1 (fr) | 2018-07-05 |
Family
ID=62707288
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2017/041783 Ceased WO2018123355A1 (fr) | 2017-11-21 | User interface device and electronic device |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP2018106574A (fr) |
| WO (1) | WO2018123355A1 (fr) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2010032598A1 (fr) * | 2008-09-17 | 2010-03-25 | 日本電気株式会社 | Unité d'entrée, son procédé de commande et dispositif électronique comportant l'unité d'entrée |
| JP2011053831A (ja) * | 2009-08-31 | 2011-03-17 | Sony Corp | 情報処理装置、情報処理方法およびプログラム |
| JP2011530101A (ja) * | 2008-08-01 | 2011-12-15 | サムスン エレクトロニクス カンパニー リミテッド | ユーザインターフェースを実現する電子装置及びその方法 |
| WO2013157330A1 (fr) * | 2012-04-20 | 2013-10-24 | ソニー株式会社 | Dispositif de traitement d'informations, procédé de traitement d'informations et programme |
- 2016-12-28: Japanese application JP2016254802A filed; published as JP2018106574A (active, pending)
- 2017-11-21: PCT application PCT/JP2017/041783 filed; published as WO2018123355A1 (not active, ceased)
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2011530101A (ja) * | 2008-08-01 | 2011-12-15 | サムスン エレクトロニクス カンパニー リミテッド | ユーザインターフェースを実現する電子装置及びその方法 |
| WO2010032598A1 (fr) * | 2008-09-17 | 2010-03-25 | 日本電気株式会社 | Unité d'entrée, son procédé de commande et dispositif électronique comportant l'unité d'entrée |
| JP2011053831A (ja) * | 2009-08-31 | 2011-03-17 | Sony Corp | 情報処理装置、情報処理方法およびプログラム |
| WO2013157330A1 (fr) * | 2012-04-20 | 2013-10-24 | ソニー株式会社 | Dispositif de traitement d'informations, procédé de traitement d'informations et programme |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2018106574A (ja) | 2018-07-05 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN102112948B (zh) | User interface device and method using pattern recognition in a portable terminal | |
| JP4522475B1 (ja) | Operation input device, control method, and program | |
| JP4979600B2 (ja) | Portable terminal device and display control method | |
| JP5640486B2 (ja) | Information display device | |
| WO2011105009A1 (fr) | Electronic apparatus | |
| KR20110047349A (ko) | User interface apparatus and method using touch and pressure in a portable terminal | |
| CN114764304B (zh) | Screen display method | |
| JP2010224658A (ja) | Operation input device | |
| CN102741794A (zh) | Processing tactile input | |
| TW200937270A (en) | Touch sensor for a display screen of an electronic device | |
| CN106687905A (zh) | Tactile control system and tactile control method | |
| JP2015172844A (ja) | Vehicle operation device | |
| US20150253887A1 (en) | Information processing apparatus | |
| JP5594652B2 (ja) | Portable information terminal and key arrangement changing method therefor | |
| WO2018123320A1 (fr) | User interface device and electronic apparatus | |
| JP5461030B2 (ja) | Input device | |
| JP2012164047A (ja) | Information processing apparatus | |
| JP2016012313A (ja) | Operation device | |
| CN105677139B (zh) | Information processing system, information processing apparatus, and information processing method | |
| KR20210018406A (ko) | Page operation method and electronic device therefor | |
| KR101678213B1 (ko) | User interface device based on detection of increase/decrease in touch area, and control method therefor | |
| WO2018123355A1 (fr) | User interface device and electronic device | |
| CN114690888B (zh) | Application interface processing method and related device | |
| JP6530160B2 (ja) | Electronic device | |
| US9128559B2 (en) | Electronic device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17886933 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 17886933 Country of ref document: EP Kind code of ref document: A1 |