WO2018167865A1 - Touch gesture determination device, touch gesture determination method, touch gesture determination program, and touch panel input device - Google Patents
- Publication number
- WO2018167865A1 (application PCT/JP2017/010345)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- gesture
- mode
- touch
- determination
- input
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/04166—Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0418—Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04104—Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- The present invention relates to a touch gesture determination device, a touch gesture determination method, and a touch gesture determination program that receive operation information corresponding to a touch gesture operation and output a signal based on the received operation information, and to a touch panel input device that outputs a signal based on a touch gesture operation.
- Patent Document 1 discloses a method in which, for a gesture performed in three-dimensional space, the past gesture operation history is used to determine whether the operation has switched to the next gesture operation on the basis of the relative movement from the timing at which the gesture operation was switched.
- In Patent Document 1, the state at the time of switching to a certain mode is set as the initial state, and the rotation angle and movement amount from that state are determined; however, changing the threshold values after the initial state is not described. For this reason, there is a problem that, during a gesture operation in one mode, the gesture is erroneously recognized as a gesture of another mode.
- The present invention has been made to solve the above-described problems, and its object is to prevent a gesture operation in one mode from being erroneously recognized as a gesture of another mode, thereby facilitating gesture input operations.
- A touch gesture determination device according to the present invention receives operation information from a touch panel that displays an operation image on a screen, accepts a user's touch gesture operation, and outputs operation information corresponding to the touch gesture operation, and generates output information based on the operation information. The touch gesture determination device includes an operation determination unit that generates output information for display on the touch panel based on the operation information, and a display control unit that receives the output information and causes the touch panel to display an image corresponding to the output information as the operation image. The operation determination unit includes a whole screen input determination unit that determines the output information from operation information corresponding to a touch gesture operation performed on the entire screen of the touch panel. The whole screen input determination unit includes a gesture mode determination unit that determines, using a first parameter related to a first gesture mode and a second parameter, different from the first parameter, related to a second gesture mode, whether the gesture mode of the touch gesture operation is the first gesture mode or the second gesture mode. When the gesture mode determination unit determines that the gesture mode of the touch gesture operation is the first gesture mode, it performs mode deviation determination to determine deviation from the first gesture mode using the first parameter; when it determines that the gesture mode is the second gesture mode, it performs mode deviation determination to determine deviation from the second gesture mode using the second parameter.
- A touch gesture determination method according to the present invention receives operation information from a touch panel that displays an operation image on a screen, accepts a user's touch gesture operation, and outputs operation information corresponding to the touch gesture operation, and generates output information based on the operation information. The touch gesture determination method includes an operation determination step of generating output information for display on the touch panel based on the operation information. The operation determination step includes a whole screen input determination step of determining the output information from operation information corresponding to a touch gesture operation performed on the entire screen of the touch panel, and a gesture mode determination step of determining whether the gesture mode of the touch gesture operation is a first gesture mode or a second gesture mode. In the gesture mode determination step, a first parameter related to the first gesture mode and a second parameter, different from the first parameter, related to the second gesture mode are used: when the gesture mode of the touch gesture operation is determined to be the first gesture mode, mode deviation determination is performed to determine deviation from the first gesture mode using the first parameter, and when it is determined to be the second gesture mode, mode deviation determination is performed to determine deviation from the second gesture mode using the second parameter.
- A touch panel input device according to the present invention includes a touch panel that displays an operation image on a screen, accepts a user's touch gesture operation, and outputs operation information corresponding to the touch gesture operation, and a touch gesture determination device that receives the operation information and controls display of the touch panel based on the operation information. The touch gesture determination device includes an operation determination unit that generates a selection value for display on the touch panel based on the operation information, and a display control unit that receives the selection value and causes the touch panel to display an image corresponding to the selection value as the operation image. The operation determination unit includes an operation mode switching determination unit that switches between a display component input mode, in which the input content is identified from a touch gesture operation performed on a display component displayed as the operation image on the touch panel, and a whole screen input mode, in which the input content is identified from a touch gesture operation performed on the entire screen of the touch panel; a display component input determination unit that determines the selection value from the operation information in the display component input mode; and a whole screen input determination unit that determines the selection value from the operation information in the whole screen input mode. The whole screen input determination unit includes a gesture mode determination unit that determines whether the gesture mode of the touch gesture operation is the first gesture mode or the second gesture mode, and performs mode deviation determination using the first parameter related to the first gesture mode or the second parameter related to the second gesture mode, according to the determined gesture mode.
- According to the present invention, during a gesture operation in one mode, the operation can be prevented from being erroneously recognized as a gesture of another mode, and gesture input operations are facilitated.
- The touch panel input device includes a touch panel having a touch operation screen (operation screen) and a touch gesture determination device that receives operation information from the touch panel.
- The touch panel input device is mounted on a target device or connected so as to be able to communicate with the target device. The present invention can be applied, for example, to the operation screen of an electric appliance, a camera, or factory equipment as the target device, to an operation screen mounted on a car, a ship, an aircraft, or the like as the target device, or to the operation screen of a portable information terminal such as a smartphone or a tablet terminal as the target device.
- The touch panel input device can provide a signal (for example, a selection value) based on operation information input by a touch gesture operation (also referred to as a "touch operation") on the operation screen of the touch panel to the target device in which the touch panel input device is mounted, or to a target device that can communicate with the touch panel input device.
- the touch panel is a touch gesture input unit that accepts a touch gesture operation performed by the user.
- A touch gesture operation is an information input operation performed by a specific movement of the user's finger (or the user's palm, or the finger and palm together).
- Touch gesture operations can include a tap, which is an operation of lightly striking the operation screen of the touch panel with a finger; a flick, which is an operation of flicking the operation screen with a finger; and a swipe, which is an operation of sliding a finger across the operation screen.
- Touch gesture operations can also include a drag, which is an operation of moving a display component on the touch panel with a finger; a pinch-in, which is an operation of narrowing the interval between multiple fingers touching the operation screen; and a pinch-out, which is an operation of widening the interval between multiple fingers touching the operation screen.
- A touch gesture operation can also include a dial gesture, in which two or more touch points are rotated as if turning a dial, and a slider gesture, in which a finger is slid while kept in contact with the touch panel.
- A touch gesture operation can also include an operation using a touch pen, which is a pen-type input aid.
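- For illustration only, the gesture types listed above can be collected into a small enumeration. The following Python sketch is not part of the patent; the names are chosen here for readability.

```python
from enum import Enum, auto

class GestureType(Enum):
    """Illustrative set of the touch gesture operations described above."""
    TAP = auto()        # lightly striking the operation screen with a finger
    FLICK = auto()      # flicking the operation screen with a finger
    SWIPE = auto()      # sliding a finger across the operation screen
    DRAG = auto()       # moving a display component with a finger
    PINCH_IN = auto()   # narrowing the interval between touching fingers
    PINCH_OUT = auto()  # widening the interval between touching fingers
    DIAL = auto()       # rotating two or more touch points as if turning a dial
    SLIDER = auto()     # sliding a finger while keeping it in contact
    PEN = auto()        # input with a pen-type auxiliary tool (touch pen)
```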
- FIG. 1 is a functional block diagram showing a schematic configuration of a touch panel input device 100 according to an embodiment of the present invention.
- FIG. 2 is a functional block diagram illustrating a schematic configuration of the entire screen input determination unit 12b according to the embodiment.
- the touch panel input device 100 includes a touch gesture determination device 110 and a touch panel 120.
- the touch gesture determination device 110 is a device that can execute the touch gesture determination method according to the embodiment and the touch gesture determination program according to the embodiment.
- The touch panel 120 includes an operation panel unit 122 that receives a touch gesture operation performed by a user and outputs operation information (also referred to as "touch information") A0 corresponding to the touch gesture operation, and a display panel unit 121 that is arranged so as to overlap the operation panel unit 122 and can display an operation image such as a GUI (Graphical User Interface) screen.
- the display panel unit 121 is, for example, a liquid crystal display.
- the touch gesture determination device 110 includes an operation information input unit 11, an operation determination unit 12, and a display control unit 14.
- the touch gesture determination device 110 may include a notification unit 13.
- the operation information input unit 11 receives operation information (operation signal) A0 output from the operation panel unit 122.
- the operation information input unit 11 outputs input information A1 corresponding to the received operation information A0 to the operation determination unit 12.
- the input information A1 is information corresponding to the operation information A0, and may be the same information as the operation information A0.
- the operation determination unit 12 receives the input information A1 from the operation information input unit 11, and outputs a selection value A2 as output information to the notification unit 13 and the display control unit 14.
- The selection value A2 output from the operation determination unit 12 is determined by the operation determination unit 12 based on the input information A1, and an application program or the like of the device in which the touch panel input device 100 is mounted performs device control and the like based on the selection value A2.
- the operation determination unit 12 determines the type and content of the touch gesture operation by the user from the received input information A1. As shown in FIG. 1, the operation determination unit 12 includes a display component input determination unit 12a, a whole screen input determination unit 12b, and an operation mode switching determination unit 12c.
- the operation mode switching determination unit 12c determines from the input information A1 received from the operation information input unit 11 whether the input information A1 includes information related to switching of the operation mode of the screen.
- The screen operation modes include a display component operation mode, in which display components to be operated are displayed on the screen, and an entire screen operation mode, in which the entire screen is set as the input range. These operation modes can be switched.
- The operation mode switching determination unit 12c switches the operation mode based on the determination result regarding switching of the operation mode.
- When the operation mode switching determination unit 12c switches the operation mode, the display content of the operation screen, the input method, and the notification method are switched accordingly.
- The whole screen input determination unit 12b identifies input content from touch information for the entire screen, and is responsible for accepting operations and determining the selection value A2. As illustrated in FIG. 2, the whole screen input determination unit 12b includes an in-mode parameter 111, a non-mode parameter 112, a gesture mode determination unit 113, and a gesture input information determination unit 114.
- The input content of the touch information for the entire screen is identified using the in-mode parameter 111, the non-mode parameter 112, and the gesture mode determination unit 113.
- The in-mode parameter 111 stores parameters, such as a movement amount and a touch time, that are used while a certain mode is active.
- In the in-mode parameter 111, a parameter related to the gesture mode is stored for each type of gesture mode.
- The in-mode parameter 111 stores, for example, a slider mode parameter 111a and a dial mode parameter 111b.
- The slider mode parameter 111a and the dial mode parameter 111b are each determined based on any one of, or a combination of two or more of, the movement amount within a predetermined time of the touch gesture operation, the speed of the touch gesture operation, the curvature, and the touch time.
- The parameters stored in the in-mode parameter 111 are not limited to the example shown in the figure.
- The non-mode parameter 112 stores parameters, such as a movement amount and a touch time, that are used when no mode is active, for example at the time of the first operation.
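- As a rough, non-authoritative sketch of how the in-mode parameter 111 (with the slider mode parameter 111a and the dial mode parameter 111b) and the non-mode parameter 112 might be organized, the Python snippet below groups thresholds derived from movement amount, speed, curvature, and touch time per gesture mode. All class names, field names, and numeric values are assumptions made for this illustration, not values from the patent.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ModeParameters:
    """Thresholds used for mode deviation determination (illustrative only)."""
    max_movement_px: float     # movement amount allowed within the observation window
    max_speed_px_per_s: float  # speed of the touch gesture operation
    max_curvature: float       # curvature of the finger trajectory
    min_touch_time_s: float    # touch time required to remain in the mode

# In-mode parameters (111): one entry per gesture mode, e.g. 111a and 111b.
IN_MODE_PARAMETERS = {
    "slider": ModeParameters(max_movement_px=400, max_speed_px_per_s=900,
                             max_curvature=0.05, min_touch_time_s=0.10),
    "dial":   ModeParameters(max_movement_px=300, max_speed_px_per_s=700,
                             max_curvature=0.50, min_touch_time_s=0.10),
}

# Non-mode parameters (112): used when no gesture mode is active,
# for example at the time of the first operation.
NON_MODE_PARAMETERS = ModeParameters(max_movement_px=50, max_speed_px_per_s=300,
                                     max_curvature=0.20, min_touch_time_s=0.05)
```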
- the gesture mode determination unit 113 refers to the in-mode parameter 111 and the non-mode parameter 112 to identify the gesture mode of the input touch information.
- the gesture input information determination unit 114 determines the selection value A2 according to the value setting method in the gesture mode specified by the gesture mode determination unit 113.
- the display component input determination unit 12a identifies the input content from the touch information for the display component displayed on the screen, and is responsible for accepting the operation and determining the selection value A2.
- The notification unit 13 switches the notification method of the operation screen according to the determination result of the operation mode switching determination unit 12c, receives the selection value A2 determined by the operation determination unit 12, and notifies the user of the operation status.
- the notification unit 13 issues a notification of notification contents according to the selection value A2 or outputs a notification signal.
- the notification unit 13 notifies the status of the user's touch gesture operation by, for example, sound, screen display, vibration by a vibrator, or lamp lighting.
- When the notification by the notification unit 13 is a notification by sound, the notification unit 13 outputs a notification signal to a speaker serving as an audio output unit.
- The speaker is shown in FIG. 3.
- When the notification by the notification unit 13 is an image display, the notification unit 13 sends notification information A3 to the display control unit 14, and the display control unit 14 transmits an image signal A4 based on the notification information A3 to the display panel unit 121 of the touch panel 120.
- the display control unit 14 switches the display content of the operation screen according to the determination result of the operation mode switching determination unit 12c, receives the information of the selection value A2 determined by the operation determination unit 12, and reflects the operation result on the screen. As shown in FIG. 1, the display control unit 14 outputs an image signal A4 of an operation image displayed on the display panel unit 121 of the touch panel 120 to the display panel unit 121.
- FIG. 3 is a diagram illustrating an example of a hardware (H / W) configuration of the touch panel input device 100 according to the embodiment.
- the touch panel input device 100 includes a touch panel 120, a processor 301, a memory 302, and a speaker 303.
- The touch gesture determination device 110 shown in FIG. 1 can be realized (for example, by a computer) using a memory 302 serving as a storage device that stores a touch gesture determination program as software, and a processor 301 serving as an information processing unit that executes the touch gesture determination program stored in the memory 302.
- The components 11 to 14 in FIG. 1 correspond to the processor 301 of FIG. 3 executing the touch gesture determination program.
- A part of the touch gesture determination device 110 shown in FIG. 1 can also be realized by the memory 302 shown in FIG. 3 and the processor 301 that executes the touch gesture determination program.
- the touch panel 120 detects contact of a plurality of fingers and transmits touch information (identification number, coordinates, contact state of each finger) to the processor 301.
- the processor 301 stores the touch information acquired from the touch panel 120 in the memory 302, and switches the operation screen, the input method, and the notification method based on the touch history information accumulated in the memory 302.
- the processor 301 determines from the touch information stored in the memory 302 whether the operation is for the entire screen or an operation for the display component, and determines a selection value in each operation mode.
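- The touch information that the touch panel 120 sends to the processor 301 (an identification number, coordinates, and contact state for each finger) could be modeled as a small record such as the Python sketch below; the field names are assumptions for illustration only.

```python
from dataclasses import dataclass
from enum import Enum, auto

class ContactState(Enum):
    DOWN = auto()  # the finger has just touched the panel
    MOVE = auto()  # the finger is moving while in contact
    UP = auto()    # the finger has been lifted

@dataclass(frozen=True)
class TouchPoint:
    """One finger's sample as reported by the touch panel (illustrative)."""
    touch_id: int        # identification number of the finger
    x: float             # x coordinate on the operation screen
    y: float             # y coordinate on the operation screen
    state: ContactState  # contact state of the finger
    timestamp_s: float   # sample time, used to accumulate touch history

# The processor appends each sample to a history buffer in memory and bases
# mode switching and selection-value determination on that history.
```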
- the speaker 303 is a sound output unit used when, for example, a touch gesture operation status is notified by sound such as an announcement.
- the touch panel input device 100 may include an additional device such as a vibrator, a lamp, and a transmission device for wirelessly transmitting a notification signal instead of the speaker 303 or as an additional configuration.
- FIG. 4A and 4B are diagrams illustrating examples of the screen of the touch panel 120 and the touch gesture operation in the touch panel input device 100 according to the embodiment.
- FIG. 4A shows a screen example 401 displayed in the display component operation mode. As shown in FIG. 4A, the screen example 401 displays various parts to be operated, such as a button 402 for directly selecting a selection value. By touch-inputting the area where the display component is displayed, the target component can be operated.
- FIG. 4B shows an example screen 403 displayed in the entire screen operation mode.
- The screen example 403 shows a display 404 of the currently selected value, a display prompting an operation instruction for switching the screen (for example, a cancel area 405 described later), and the like.
- information is input by the number of tap operations, a gesture that rotates the dial with two or more touches, and the like.
- FIG. 5 is a flowchart illustrating the operation (touch gesture determination method) of the touch gesture determination device 110 in the touch panel input device 100 according to the embodiment.
- the user activates the system in the display component operation mode and initializes each processing unit.
- the operation mode switching determination unit 12c determines switching input to the entire screen operation mode based on the touch information input from the operation information input unit 11.
- If there is a switching input, the operation mode is changed, and the display content, input method, and notification method for the entire screen operation mode are applied.
- the gesture input information determination unit 114 identifies information related to the gesture input from the touch information input by the user, and thereby determines the selection value A2. For example, the user inputs information using a gesture (dial gesture) that rotates the dial with two or more touches.
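- As an illustration of how a dial gesture with two touch points might be converted into changes of the selection value A2, the sketch below tracks the rotation of the line connecting the two touches and maps accumulated rotation to discrete steps. The step size and all class and function names are assumptions, and the touch points are simply objects with x and y attributes, such as the TouchPoint record sketched earlier.

```python
import math

def dial_angle(p1, p2):
    """Angle (radians) of the line connecting two touch points."""
    return math.atan2(p2.y - p1.y, p2.x - p1.x)

def unwrap(delta):
    """Map an angle difference into (-pi, pi] so boundary crossings do not jump."""
    while delta > math.pi:
        delta -= 2 * math.pi
    while delta <= -math.pi:
        delta += 2 * math.pi
    return delta

class DialGestureTracker:
    """Accumulates the rotation of a two-finger dial gesture (illustrative)."""

    def __init__(self, step_deg=15.0):
        self.step_rad = math.radians(step_deg)  # placeholder step size
        self.prev_angle = None
        self.accumulated = 0.0

    def update(self, p1, p2):
        """Feed the latest pair of touch points; return the change to apply to A2."""
        angle = dial_angle(p1, p2)
        if self.prev_angle is None:
            self.prev_angle = angle
            return 0
        self.accumulated += unwrap(angle - self.prev_angle)
        self.prev_angle = angle
        steps = int(self.accumulated / self.step_rad)
        self.accumulated -= steps * self.step_rad
        return steps
```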
- An object of the touch gesture determination method is to prevent erroneous recognition as another gesture operation when a certain gesture operation is performed as described above.
- In step ST405, the selection value A2 determined in step ST404 is passed to the display control unit 14, so that the display content is switched and a screen reflecting the operation content is displayed.
- In step ST406, the selection value A2 determined in step ST404 is passed to the notification unit 13, so that the notification content is switched, and notification of the situation after the operation and announcements prompting further operation are performed.
- the operation mode switching determination unit 12c determines a switching input to the display component operation mode from the touch information input from the operation information input unit 11 and the touch history information.
- If there is a switching input, the display component operation mode is entered, and the operation screen with the display components shown at initial activation is displayed. If there is no switching input to the display component operation mode, touch input to the operation panel unit 122 continues to be accepted until a switching input is received, and the display content and notification content are switched accordingly.
- In the input determination regarding screen switching in step ST402, if the input content is not switching to the entire screen operation mode (NO in ST402), input determination for the display component is performed in the next step ST409.
- Input to a display component in the display component operation mode is determined based on whether the touch input coordinates are included in the operation range of the display component.
- If the touch input coordinates are included in the operation range, the selection value A2 is determined in the next step ST410.
- If they are not included, touch input is accepted again in step ST402.
- The display component input determination unit 12a determines the selection value A2 based on the touch information input from the operation information input unit 11. As an example, as in the screen example 401 of FIG. 4A, display components to which specific selection values are assigned, such as buttons, are arranged, and the selection value is determined by touch input to a display component.
- In step ST411, the selection value A2 determined in step ST410 is passed to the display control unit 14, so that the display content of the display panel unit 121 is switched and a screen corresponding to the operation content is displayed.
- In step ST412, the selection value A2 determined in step ST410 is passed to the notification unit 13, so that the notification content is switched, and notification of the situation after the operation and announcements prompting further operation are performed.
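- A compressed, non-authoritative sketch of the control flow described for FIG. 5, dispatching between the display component operation mode and the entire screen operation mode, is shown below. The function and argument names are assumptions, and the step structure is simplified.

```python
DISPLAY_COMPONENT_MODE = "display_component"
WHOLE_SCREEN_MODE = "whole_screen"

def handle_touch(mode, touch, switch_determiner, component_determiner,
                 whole_screen_determiner, display, notifier):
    """One iteration of the loop outlined for FIG. 5 (illustrative only)."""
    # Operation mode switching determination (corresponds to steps such as ST402).
    new_mode = switch_determiner(mode, touch)
    if new_mode != mode:
        display.switch_mode(new_mode)   # switch the display content
        notifier.switch_mode(new_mode)  # switch the notification method
        return new_mode, None

    if mode == WHOLE_SCREEN_MODE:
        # Input content is identified from a gesture on the entire screen.
        selection = whole_screen_determiner(touch)
    else:
        # Input is accepted only when the touch falls inside the operation
        # range of a display component.
        selection = component_determiner(touch)

    if selection is not None:
        display.show(selection)     # reflect the operation result on the screen
        notifier.notify(selection)  # announce the operation status
    return mode, selection
```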
- FIG. 6 is a flowchart showing the gesture mode deviation determination operation by the gesture mode determination unit 113 in the embodiment.
- In step ST501, the gesture mode determination unit 113 determines whether or not a gesture mode is currently active.
- For example, the gesture mode determination unit 113 determines that no gesture mode is active immediately after the gesture input information determination unit 114 has determined the selection value A2.
- The gesture mode determination unit 113 determines that a gesture mode is active, for example, while the user is scrolling on the touch panel or operating in the dial mode.
- If it is determined in step ST501 that a gesture mode is active (YES in ST501), in the next step ST502 the gesture mode determination unit 113 determines which of the slider mode as the first gesture mode, the dial gesture mode as the second gesture mode, or the touch gesture mode is being performed.
- If it is determined in step ST502 that the gesture is a slider gesture, the process proceeds to step ST503 and a slider mode parameter is acquired as the first parameter. If it is determined in step ST502 that the gesture is a dial gesture, the process proceeds to step ST504 and a dial mode parameter is acquired as the second parameter. If it is determined in step ST502 that the gesture is a touch gesture, the process proceeds to step ST505 and a touch mode parameter corresponding to the touch gesture is acquired.
- The slider mode parameter, the dial mode parameter, and the touch mode parameter are each determined based on any one of, or a combination of, the movement amount within a predetermined time of the touch gesture operation, the speed of the touch gesture operation, the curvature, and the touch time.
- After the parameter corresponding to the gesture mode has been acquired in steps ST503 to ST505, the process proceeds to the next step ST506, where it is determined whether a predetermined time has elapsed since the type of the gesture mode was determined in step ST502. If the predetermined time has elapsed (YES in step ST506), the process proceeds to the next step ST507, and mode deviation determination is performed using the parameter acquired for the gesture mode to determine whether the gesture has deviated from that mode.
- If it is determined in step ST501 that no gesture mode is active (NO in ST501), the process proceeds to step ST508, where the non-mode parameter 112 is acquired as a third parameter, and mode deviation determination is performed in the next step ST509.
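- The flow of FIG. 6, in which the parameter set is chosen according to the active gesture mode, the mode is held for a predetermined time, and only then is deviation from the mode judged, could look roughly like the sketch below. It reuses the ModeParameters structures sketched earlier; the dwell time, the deviation test itself, and all names are assumptions, not definitions taken from the patent.

```python
import time

def select_parameters(active_mode):
    """Steps ST502 to ST505 / ST508: pick the parameter set for the current state."""
    if active_mode is None:
        return NON_MODE_PARAMETERS          # third parameter (112)
    return IN_MODE_PARAMETERS[active_mode]  # first or second parameter (111a / 111b)

def mode_deviation(active_mode, mode_entered_at, movement_px, speed_px_per_s,
                   curvature, touch_time_s, dwell_time_s=0.3, now=None):
    """Steps ST506 to ST507 / ST509: return True if the gesture has left the mode.

    dwell_time_s stands in for the "predetermined time" during which the mode
    is maintained regardless of the measurements (the value is a placeholder).
    """
    now = time.monotonic() if now is None else now
    params = select_parameters(active_mode)

    # ST506: keep the mode for at least the predetermined time after it was entered.
    if active_mode is not None and (now - mode_entered_at) < dwell_time_s:
        return False

    # ST507 / ST509: a simple deviation test against the selected thresholds.
    return (movement_px > params.max_movement_px
            or speed_px_per_s > params.max_speed_px_per_s
            or curvature > params.max_curvature
            or touch_time_s < params.min_touch_time_s)
```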
- As described above, the gesture mode determination unit 113 determines the type of gesture mode being executed, acquires the parameter corresponding to the determined gesture mode type, and performs mode deviation determination using the acquired parameter. Accordingly, erroneous recognition as another mode during an operation in a certain mode can be prevented, gesture input operations are facilitated, and user operability is improved.
- In addition, the mode is maintained for at least a predetermined time before mode deviation is determined. This prevents the mode from being unintentionally exited during an operation in that mode.
- 11 Operation information input unit, 12 Operation determination unit, 12a Display component input determination unit, 12b Whole screen input determination unit, 13 Notification unit, 14 Display control unit, 100 Touch panel input device, 110 Touch gesture determination device, 111 In-mode parameter, 111a Slider mode parameter, 111b Dial mode parameter, 112 Non-mode parameter, 113 Gesture mode determination unit, 114 Gesture input information determination unit, 120 Touch panel, 121 Display panel unit, 122 Operation panel unit, 301 Processor, 302 Memory, 303 Speaker, A0 Operation information, 401, 403 Screen examples, 405 Cancel area, A1 Input information, A2 Selection value, A3 Notification information, A4 Image signal.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
Priority Applications (5)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2017/010345 WO2018167865A1 (fr) | 2017-03-15 | 2017-03-15 | Dispositif de détermination de geste tactile, procédé de détermination de geste tactile, programme de détermination de geste tactile et dispositif d'entrée de panneau tactile |
| KR1020197026428A KR102136526B1 (ko) | 2017-03-15 | 2017-03-15 | 터치 제스처 판정 장치, 터치 제스처 판정 방법, 터치 제스처 판정 프로그램, 및 터치 패널 입력 장치 |
| CN201780088102.5A CN110402427B (zh) | 2017-03-15 | 2017-03-15 | 触摸手势判定装置、触摸手势判定方法、记录有触摸手势判定程序的记录介质和触摸面板输入装置 |
| JP2017540659A JP6227213B1 (ja) | 2017-03-15 | 2017-03-15 | タッチジェスチャ判定装置、タッチジェスチャ判定方法、タッチジェスチャ判定プログラム、及びタッチパネル入力装置 |
| DE112017006949.1T DE112017006949T5 (de) | 2017-03-15 | 2017-03-15 | Berührungsgestenbeurteilungseinrichtung, Berührungsgestenbeurteilungsverfahren,Berührungsgestenbeurteilungsprogramm und Berührungsfeldeingabeeinrichtung |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2017/010345 WO2018167865A1 (fr) | 2017-03-15 | 2017-03-15 | Dispositif de détermination de geste tactile, procédé de détermination de geste tactile, programme de détermination de geste tactile et dispositif d'entrée de panneau tactile |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018167865A1 true WO2018167865A1 (fr) | 2018-09-20 |
Family
ID=60265772
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2017/010345 Ceased WO2018167865A1 (fr) | 2017-03-15 | 2017-03-15 | Dispositif de détermination de geste tactile, procédé de détermination de geste tactile, programme de détermination de geste tactile et dispositif d'entrée de panneau tactile |
Country Status (5)
| Country | Link |
|---|---|
| JP (1) | JP6227213B1 (fr) |
| KR (1) | KR102136526B1 (fr) |
| CN (1) | CN110402427B (fr) |
| DE (1) | DE112017006949T5 (fr) |
| WO (1) | WO2018167865A1 (fr) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2023122459A (ja) * | 2022-02-22 | 2023-09-01 | ヤマハ株式会社 | 入力装置および電子楽器 |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2008070968A (ja) * | 2006-09-12 | 2008-03-27 | Funai Electric Co Ltd | 表示処理装置 |
| JP2016515741A (ja) * | 2013-04-15 | 2016-05-30 | マイクロソフト テクノロジー ライセンシング,エルエルシー | マルチフィンガータッチインタラクション中のパンおよびスケーリングの検出 |
| JP2016164702A (ja) * | 2015-03-06 | 2016-09-08 | 京セラドキュメントソリューションズ株式会社 | 表示入力装置、それを備えた画像形成装置、表示入力装置の制御方法およびプログラム |
Family Cites Families (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8106856B2 (en) * | 2006-09-06 | 2012-01-31 | Apple Inc. | Portable electronic device for photo management |
| JPWO2010067537A1 (ja) * | 2008-12-08 | 2012-05-17 | シャープ株式会社 | 操作受付装置及びコンピュータプログラム |
| US8427440B2 (en) * | 2009-05-05 | 2013-04-23 | Microsoft Corporation | Contact grouping and gesture recognition for surface computing |
| JP5782431B2 (ja) | 2009-05-27 | 2015-09-24 | オブロング・インダストリーズ・インコーポレーテッド | 空間動作システムと共に用いるための空間マルチモード制御デバイス |
| CN102713822A (zh) * | 2010-06-16 | 2012-10-03 | 松下电器产业株式会社 | 信息输入装置、信息输入方法以及程序 |
| CN102694942B (zh) * | 2011-03-23 | 2015-07-15 | 株式会社东芝 | 图像处理装置、操作方法显示方法及画面显示方法 |
| JP6032420B2 (ja) * | 2012-04-02 | 2016-11-30 | カシオ計算機株式会社 | 印刷情報出力装置、印刷情報出力方法、プログラム及び携帯電子機器 |
| JP5852514B2 (ja) * | 2012-06-13 | 2016-02-03 | 株式会社東海理化電機製作所 | タッチセンサ |
| JP5772773B2 (ja) * | 2012-09-19 | 2015-09-02 | コニカミノルタ株式会社 | 画像処理装置、操作標準化方法および操作標準化プログラム |
| CN102890616B (zh) * | 2012-09-26 | 2016-03-30 | 杨生虎 | 触摸屏的快捷输入方法及系统 |
| JP5700020B2 (ja) * | 2012-10-10 | 2015-04-15 | コニカミノルタ株式会社 | 画像処理装置、プログラム及び操作イベント判別方法 |
| JP5862587B2 (ja) * | 2013-03-25 | 2016-02-16 | コニカミノルタ株式会社 | ジェスチャ判別装置、ジェスチャ判別方法、およびコンピュータプログラム |
| CN106095307B (zh) * | 2016-06-01 | 2019-05-31 | 努比亚技术有限公司 | 旋转手势识别装置及方法 |
-
2017
- 2017-03-15 JP JP2017540659A patent/JP6227213B1/ja active Active
- 2017-03-15 WO PCT/JP2017/010345 patent/WO2018167865A1/fr not_active Ceased
- 2017-03-15 CN CN201780088102.5A patent/CN110402427B/zh active Active
- 2017-03-15 KR KR1020197026428A patent/KR102136526B1/ko not_active Expired - Fee Related
- 2017-03-15 DE DE112017006949.1T patent/DE112017006949T5/de not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| KR102136526B1 (ko) | 2020-07-22 |
| CN110402427A (zh) | 2019-11-01 |
| KR20190109551A (ko) | 2019-09-25 |
| JPWO2018167865A1 (ja) | 2019-03-22 |
| DE112017006949T5 (de) | 2019-10-31 |
| CN110402427B (zh) | 2022-05-31 |
| JP6227213B1 (ja) | 2017-11-08 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US8271908B2 (en) | Touch gestures for remote control operations | |
| US9612736B2 (en) | User interface method and apparatus using successive touches | |
| US20140223299A1 (en) | Gesture-based user interface method and apparatus | |
| JP5102412B1 (ja) | 情報端末、情報端末の制御方法、及び、プログラム | |
| US20150199125A1 (en) | Displaying an application image on two or more displays | |
| US10430071B2 (en) | Operation of a computing device functionality based on a determination of input means | |
| CN102681772A (zh) | 输入处理设备、输入处理方法和程序 | |
| WO2012039288A1 (fr) | Dispositif terminal d'informations et procédé d'affichage sur panneau tactile | |
| WO2014034369A1 (fr) | Dispositif de commande d'affichage, système client léger, procédé de commande d'affichage et support d'enregistrement | |
| JP6253861B1 (ja) | タッチジェスチャ判定装置、タッチジェスチャ判定方法、タッチジェスチャ判定プログラム、及びタッチパネル入力装置 | |
| JP6227213B1 (ja) | タッチジェスチャ判定装置、タッチジェスチャ判定方法、タッチジェスチャ判定プログラム、及びタッチパネル入力装置 | |
| KR101553119B1 (ko) | 연속적인 터치를 이용한 사용자 인터페이스 방법 및 장치 | |
| JP6737239B2 (ja) | 表示装置及び表示制御プログラム | |
| JP2016038619A (ja) | 携帯端末装置及びその操作方法 | |
| JP2020077181A (ja) | 表示制御装置、表示制御装置の制御方法及びプログラム | |
| JP6327834B2 (ja) | 操作表示装置、操作表示方法及びプログラム | |
| JP2012159981A (ja) | 表示制御装置及びその制御方法 | |
| US20150241982A1 (en) | Apparatus and method for processing user input | |
| JP6971573B2 (ja) | 電子機器、その制御方法およびプログラム | |
| JP2015158759A (ja) | 機能選択実行装置、機能選択実行方法及び機能選択実行プログラム | |
| KR100966848B1 (ko) | 회전식 직육면체 메뉴 바 디스플레이 방법 및 장치 | |
| KR101397907B1 (ko) | 멀티 터치 인식을 위한 시스템, 제어방법과, 기록 매체 | |
| KR101136327B1 (ko) | 휴대 단말기의 터치 및 커서 제어방법 및 이를 적용한 휴대 단말기 | |
| JP2025139885A (ja) | ユーザーインターフェース画面の選択方法、情報処理端末及びプログラム | |
| WO2013099042A1 (fr) | Terminal d'informations, procédé de commande d'un terminal d'informations et programme |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| ENP | Entry into the national phase |
Ref document number: 2017540659 Country of ref document: JP Kind code of ref document: A |
|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17901311 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 20197026428 Country of ref document: KR Kind code of ref document: A |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 17901311 Country of ref document: EP Kind code of ref document: A1 |