WO2012160829A1 - Touch screen device, touch operation input method, and program - Google Patents
- Publication number: WO2012160829A1 (application PCT/JP2012/003407)
- Authority: WIPO (PCT)
- Prior art keywords: touch, layer, display, screen device, touch screen
- Legal status: Ceased (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
- G06F3/04855—Interaction with scrollbars
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen
Definitions
- the present invention relates to a touch screen device that performs an input instruction or the like by a touch operation, a touch operation input method, and a program.
- One of the problems concerns the scroll bar that indicates the display position of the screen within the item list. When the number of items is large, the user must slide and scroll the screen many times in order to move far through the list, so usability is thought to improve if scrolling can also be performed by touching the scroll bar. However, on a small-screen mobile device, an easy-to-use user interface (UI) must be realized while also taking into account the visibility of the thumbnail list and the ease of operation.
- When thumbnails of photos are displayed as an item list, it is considered desirable to display the thumbnails and the scroll bar in an overlapping manner, so that the number of displayed thumbnails and the thumbnail display area are secured preferentially and the screen remains easy to view.
- In that case, however, a user who intends to touch the scroll bar to scroll may instead touch an adjacent area, causing the entire screen to scroll or a thumbnail to move, contrary to the user's intention.
- In addition, switching of the operation target becomes complicated.
- If a dedicated display area is secured for the scroll bar, the thumbnail display size must be reduced. Conversely, if the thumbnail display size is maintained, the number of thumbnails that can be displayed simultaneously on one screen decreases, and the list becomes harder to survey.
- The present invention has been made in view of the above circumstances, and an object thereof is to enable the user to operate an operation target reliably, without erroneous operation, even on a screen on which display objects are superimposed or displayed close together.
- Another object of the present invention is to improve the response of operations on an operation target, and of switching between operation targets, when there are a plurality of display objects that can be operated on the screen.
- The present invention is a touch screen device having a touch panel display capable of touch operation, comprising: a display information holding unit that holds information on display objects defined by being divided into a plurality of layers; a display processing unit that arranges the display objects for each layer and displays them on the screen of the touch panel display; a touch point detection unit that detects touch points of a touch operation object on the touch panel display; and an operation input execution unit that selects the layer of the display object to be operated based on the number of detected touch points and executes an operation input according to the touch operation of the touch operation object on a display object of the selected layer.
- Thereby, the user can reliably operate the operation target without erroneous operation. Further, when there are a plurality of display objects that can be operated on the screen, the response of operations on the operation target, and of switching between operation targets, can be improved.
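As a rough sketch, the division of responsibilities among the units named above can be illustrated in code. Every class, method, and mapping below is a hypothetical naming chosen for explanation, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class DisplayObject:
    name: str
    layer: int  # layer index the object is assigned to

class DisplayInformationHolder:
    """Holds display objects defined by being divided into layers."""
    def __init__(self, objects):
        self.objects = objects

    def objects_in_layer(self, layer):
        return [o.name for o in self.objects if o.layer == layer]

class OperationInputExecutor:
    """Selects the operation-target layer from the number of touch points."""
    def __init__(self, count_to_layer):
        self.count_to_layer = count_to_layer

    def target_layer(self, touch_count):
        return self.count_to_layer.get(touch_count)

holder = DisplayInformationHolder([
    DisplayObject("thumbnail list base 15", 1),
    DisplayObject("thumbnail 16", 2),
    DisplayObject("entire scroll bar 13", 3),
    DisplayObject("scroll operation bar 14", 4),
])
# Hypothetical mapping in the spirit of the first embodiment:
# 1 finger -> layer 1, 2 fingers -> layer 4, 3 fingers -> layer 2.
executor = OperationInputExecutor({1: 1, 2: 4, 3: 2})
```

With such a mapping, changing which finger count selects which layer is a one-line configuration change, which is the kind of flexibility the embodiments below rely on.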
- FIG. 1 Block diagram showing the configuration of the touch screen device of the first embodiment
- FIG. 2 Diagram showing a thumbnail list screen displayed on the touch panel display
- FIG. 3 Diagram showing the layer hierarchy of the thumbnail list screen
- FIG. 4 Table showing operation patterns for each layer of the thumbnail list screen
- FIG. 5 Table showing, as an operation of the first embodiment, a setting example in which the number of finger contact points is defined corresponding to each operation pattern shown in FIG. 4
- FIG. 6 (A), (B) Diagrams showing specific examples of the operation on the screen in the first embodiment
- FIG. 7 Flow chart showing the operation procedure of the touch screen device in the first embodiment
- FIG. 8 Table showing, as an operation of the second embodiment, a setting example in which the number of contact points and the movement direction of the finger are defined corresponding to each operation pattern shown in FIG. 4
- FIG. 9 Diagram showing a specific example of the operation on the screen in the second embodiment
- FIG. 10 Flow chart showing the operation procedure of the touch screen device in the second embodiment
- FIG. 12 (A), (B) Diagrams showing a specific example of the operation on the screen in the third embodiment
- FIG. 13 Flow chart showing the operation procedure of the touch screen device in the third embodiment
- Diagram showing a second modification of the display mode of a screen
- the touch screen device, the touch operation input method, and the program according to the present invention will be described below.
- The touch screen device of the present embodiment is mounted on a small electronic device such as a mobile terminal (a mobile phone or a smartphone), a portable game device, or a portable audio device.
- FIG. 1 is a block diagram showing the configuration of the touch screen device of the first embodiment.
- the touch screen device 1 includes a control unit 10, a ROM 40, a RAM 50, a touch panel display 20, and a storage unit 30.
- the control unit 10 is configured by a computer including a known processor, and controls the entire touch screen device 1. Various data related to the operation of the touch screen device 1 are stored in advance in the ROM 40. Temporary data in each process of the control unit 10 is stored in the RAM 50.
- the touch panel display 20 includes a display device such as a liquid crystal display.
- the touch panel display 20 includes an operation detection unit 25 and a display processing unit 26.
- The operation detection unit 25 is formed of a touch panel, a planar operation input device arranged over the display device that the user can operate by touch, and detects the coordinates and movement (gestures) of the touch points of the user's fingers.
- the touch operation may be performed using another touch operation object such as a stylus pen.
- the display processing unit 26 performs display processing on the screen of the display device of the touch panel display 20.
- the storage unit 30 stores an operating system (OS) 38, a driver (software) 36 for causing the OS 38 to recognize peripheral devices, a content management unit 33, and an application program 31.
- the content management unit 33 registers and manages content data such as photos and documents.
- the application program 31 includes programs for executing various applications, such as a photo management program described later.
- the application program 31 includes information of various screens at the time of program execution.
- The thumbnail list screen allows the user to view a list of thumbnails indicating the content of each photo.
- FIG. 2 is a view showing a thumbnail list screen displayed on the touch panel display 20.
- The menu button 11, the back button 12, the entire scroll bar 13, the scroll operation bar 14, the thumbnail list base 15 as a background image, and a plurality of thumbnails 16 are displayed on the thumbnail list screen 20a of the touch panel display 20.
- the scroll bar 21 (see FIG. 3) is configured including the entire scroll bar 13 and the scroll operation bar 14.
- a thumbnail list 22 (see FIG. 3) is configured including the thumbnail list base 15 and the thumbnails 16.
- Display objects include items indicating operation targets, such as thumbnails of photos and icons; GUI parts, such as scroll bars and buttons; and windows that are operated to perform various displays according to the operation of an application.
- the storage unit 30 implements the function of the display information holding unit.
- the display processing unit 26 realizes a function of displaying on the screen of the touch panel display 20 display objects defined by being divided into a plurality of layers according to the control operation of the control unit 10.
- the operation detection unit 25 realizes the function of the touch point detection unit that detects the number of touch points on the touch panel display 20, the condition of the touch point, and the like according to the control operation of the control unit 10.
- the control unit 10 realizes the function of the operation input execution unit that executes the operation input operation according to the touch operation on the touch panel display 20.
- FIG. 3 is a diagram showing the layer hierarchy of the thumbnail list screen 20a.
- the present embodiment shows a case where a display object displayed on the screen of the touch panel display 20 is divided into five layer hierarchies of layer 0 to layer 4.
- a table representing the correspondence between image data of a display object displayed on the screen of the touch panel display 20 and a layer in which the display object is arranged is registered and stored.
- an application window 18 in which an execution screen of the application is displayed is arranged.
- a thumbnail list base 15 is disposed as a background of the item list.
- thumbnails 16 are arranged as items.
- The entire scroll bar 13, the menu button 11, and the back button 12 are disposed as fixed UI operation components (GUI components).
- a scroll operation bar 14 is disposed as a movable UI operation component.
- The criteria for dividing the layers match how the user perceives the screen display. That is, the application window is arranged in the lowermost layer, the components for UI operation are arranged in the uppermost layer, and the main information and files on the screen are displayed between them (intermediate layers).
- the application window 18 for displaying a list of photos is arranged in the lowermost layer (layer 0), and the thumbnail list 22 is arranged thereon (middle layer). Furthermore, the scroll bar 21 or the like is disposed on the upper layer (uppermost layer) as a part for UI operation.
- thumbnail list 22 can be divided into a "thumbnail list base 15" which is a base (background) portion representing the entire thumbnail list and "thumbnails 16" which are individual thumbnails.
- the former is arranged in the lower layer (layer 1) and the latter is arranged in the upper layer (layer 2).
- The scroll bar 21 can also be divided into the "scroll operation bar 14", which is the bar for changing the scroll display position, and the "entire scroll bar 13", which is the bar representing the entire scroll-enabled area.
- the latter is placed in the lower layer (layer 3), and the former is placed in the upper layer (layer 4).
- The intermediate layer may itself be divided and arranged into a plurality of layers; for example, "the information display portion (images, text, etc.) of a site" and "a focus representing the link position, etc." can be set and displayed as separate layers.
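The layer assignment described above can be summarized as a simple table. The following sketch encodes it as a mapping; the dict layout and helper are illustrative assumptions, while the object names and layer numbers follow the description:

```python
# Five-layer hierarchy of FIG. 3, from lowermost (0) to uppermost (4).
LAYERS = {
    0: ["application window 18"],                  # lowermost layer
    1: ["thumbnail list base 15"],                 # list background
    2: ["thumbnails 16"],                          # individual items
    3: ["entire scroll bar 13", "menu button 11", "back button 12"],  # fixed GUI parts
    4: ["scroll operation bar 14"],                # movable GUI part (uppermost)
}

def layer_of(name):
    """Return the layer index in which a display object is arranged."""
    for layer, objects in LAYERS.items():
        if name in objects:
            return layer
    raise KeyError(name)
```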
- FIG. 4 is a table showing operation patterns for each layer of the thumbnail list screen.
- patterns 1, 2 and 3 are mentioned as operation patterns.
- an operation as a UI and an operation of a display object of each layer are defined for each operation pattern.
- the layer of the display object to be operated is selected, and the operation input operation corresponding to the touch operation on the layer is performed.
- In pattern 1 of the operation patterns, the operation as the UI is the operation of the list screen.
- layer 1 moves as the main operation layer in accordance with the movement of the touch point.
- Layer 2 moves along with the main operation layer and moves in the same direction as layer 1.
- Layer 3 is fixed.
- Layer 4 moves along with the main operation layer and moves in the opposite direction to layer 1.
- In pattern 2 of the operation patterns, the operation as the UI is the operation of the scroll bar.
- the layer 4 moves as the main operation layer in accordance with the movement of the touch point.
- Layer 1 moves along with the main operation layer and moves in the opposite direction to layer 4.
- Layer 2 moves along with the main operation layer and moves in the opposite direction to layer 4.
- Layer 3 is fixed.
- In pattern 3 of the operation patterns, the operation as the UI is the operation of a thumbnail.
- layer 2 moves as the main operation layer in accordance with the movement of the touch point.
- Layer 1 is associated with the main operation layer, and remains fixed on the screen.
- Layer 3 is fixed.
- Layer 4 is associated with the main operation layer, and remains fixed on the screen.
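The behaviour of patterns 1 to 3 in FIG. 4 can be encoded as data, with each non-main layer's motion expressed relative to the main operation layer. The string encoding below is an illustrative assumption:

```python
# For each pattern: the main operation layer, and how the other layers
# move relative to it ("same"/"opposite" direction, or "fixed").
PATTERNS = {
    1: {"main": 1, "motion": {2: "same", 3: "fixed", 4: "opposite"}},      # list screen
    2: {"main": 4, "motion": {1: "opposite", 2: "opposite", 3: "fixed"}},  # scroll bar
    3: {"main": 2, "motion": {1: "fixed", 3: "fixed", 4: "fixed"}},        # thumbnail
}

def layer_motion(pattern, layer):
    """How a layer behaves when the given operation pattern is performed."""
    entry = PATTERNS[pattern]
    if layer == entry["main"]:
        return "follows touch"
    return entry["motion"][layer]
```

Note that layer 3 (the entire scroll bar and fixed buttons) stays fixed in every pattern, matching the table.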
- FIG. 5 is a table showing a setting example in which the number of contact points of the finger is defined corresponding to each operation pattern shown in FIG. 4 as the operation of the first embodiment.
- a plurality of operation patterns are switched according to the number of contact points of the finger (the number of fingers to be operated).
- an operation pattern is defined in accordance with the number of contact points of the finger detected by the operation detection unit 25 in contact with the touch panel display 20. That is, when the number of finger contacts is 1 (one-finger operation), the operation of pattern 1 is executed as an operation pattern. When the number of finger contacts is 2 (two-finger operation), the operation of pattern 2 is executed as an operation pattern. If the number of finger contacts is 3 (operation with three fingers), the operation of pattern 3 is executed as an operation pattern.
- FIG. 6 is a diagram showing a specific example of the operation on the screen in the first embodiment.
- FIG. 6A shows pattern 1 of the operation pattern
- FIG. 6B shows pattern 2 of the operation pattern.
- Pattern 1 is the operation of the list screen (item list). This pattern 1 is performed by the contact of one finger 60. As shown in FIG. 6A, when one finger 60 touches the thumbnail list base 15 arranged in layer 1 and moves to the right in FIG. 6, the thumbnails 16 arranged in layer 2 move to the right (the same direction) together with the thumbnail list base 15, following the movement of the finger. At this time, the scroll operation bar 14 disposed in layer 4 moves to the left (the reverse direction) as a movement associated with layer 1. Note that the entire scroll bar 13 disposed in layer 3 remains fixed.
- Pattern 2 is the operation of the scroll bar (GUI component). This pattern 2 is performed by the contact of two fingers 60. As shown in FIG. 6B, when the two fingers 60 touch the scroll operation bar 14 disposed in layer 4 and move to the right in FIG. 6, the scroll operation bar 14 moves to the right following the movement of the fingers. At this time, the thumbnail list base 15 disposed in layer 1 and the thumbnails 16 disposed in layer 2 move to the left (the reverse direction) as a movement associated with layer 4. Note that the entire scroll bar 13 disposed in layer 3 remains fixed.
- Pattern 3 is an operation on an individual thumbnail. This pattern 3 is performed by three fingers.
- When the thumbnail 16 arranged in layer 2 is touched and moved with three fingers, the thumbnail 16 moves in the same direction in accordance with the movement of the fingers.
- At this time, the thumbnail list base 15, the entire scroll bar 13, and the scroll operation bar 14 arranged in layers 1, 3, and 4 remain fixed.
- As a modification, the thumbnail list base 15 and the scroll operation bar 14 may move in the same direction, as a movement associated with layer 2. Note that the entire scroll bar 13 disposed in layer 3 remains fixed.
- the display object is divided into a plurality of layers to define the operation of each layer, and the display object of each layer is arranged. Then, a plurality of operation patterns are switched according to the number of touch points of the finger touched by the user, and operations such as movement of each layer are performed according to the operation patterns.
- This makes it possible to clearly distinguish the operation of moving the thumbnail list as the operation target and the operation of moving the scroll bar as the operation target. Further, the operation of moving the entire item of the thumbnail and the operation of moving the individual item of the thumbnail can be clearly divided. Therefore, unintended erroneous operation can be eliminated.
- FIG. 7 is a flow chart showing an operation procedure of the touch screen device in the first embodiment.
- the program for executing the process of the touch screen device is included in a part of the photo management program, is stored as the application program 31 in the storage unit 30, and is repeatedly executed by the control unit 10.
- control unit 10 waits until the first touch point is detected by the operation detection unit 25 (step S1).
- the control unit 10 determines whether there is another contact point in contact simultaneously (step S2). If another touch point is detected, the control unit 10 determines whether the total number of touch points is 3 or more (step S3). When there are three or more touch points, the control unit 10 selects the layer 2 and determines whether or not an operation instruction such as movement of the touch point has been performed (step S4).
- the operation instruction in this case is a movement instruction of pattern 3.
- the processes of steps S1, S2, and S3 correspond to the function of the contact point detection unit.
- the control unit 10 determines a thumbnail to be an operation target from the touch point coordinates (step S5), and performs a movement operation on the thumbnail icon of pattern 3 (step S6). After this, the control unit 10 ends this operation.
- Since the thumbnail to be operated is determined from the number of touch points and the touch point coordinates, operability is good.
- the processes of steps S5 and S6 correspond to the function of the operation input execution unit.
- When there is no operation instruction in step S4, the control unit 10 cancels the selection of layer 2 and determines whether one or more touch points have been released (step S7). If one or more points are released, the control unit 10 returns to the process of step S1. If not released, the control unit 10 returns to the process of step S4.
- control unit 10 selects layer 1 and determines whether or not there is an operation instruction such as movement of the contact point (step S8).
- the operation instruction in this case is the movement instruction of pattern 1.
- When there is no operation instruction, the control unit 10 cancels the selection of layer 1 and returns to the process of step S2.
- On the other hand, when there is an operation instruction, the control unit 10 performs a scroll operation on the entire list screen of pattern 1 (step S9). After this, the control unit 10 ends this operation.
- When the total number of touch points is less than three in step S3, that is, when it is two, the control unit 10 selects layer 4 and determines whether or not there is an operation instruction (step S10).
- the operation instruction in this case is a movement instruction of pattern 2.
- When there is no operation instruction in step S10, the control unit 10 cancels the selection of layer 4 and determines whether one touch point has been released (step S11). If one point is released, the control unit 10 returns to the process of step S8. On the other hand, if not released, the control unit 10 returns to the process of step S3.
- When there is an operation instruction in step S10, the control unit 10 determines the target component (here, for pattern 2, the scroll operation bar 14) from the coordinates of the touch point (step S12). Then, the control unit 10 performs an operation on the scroll operation bar 14 of pattern 2 (step S13). After this, the control unit 10 ends this operation.
- the user can reliably operate the operation target without erroneous operation.
- In addition, since the contact time is not used to determine operation switching, the operation response can be improved.
- the operation of the scroll operation bar and the scroll operation of the screen can be compatible without erroneous operation. Further, even in the case of a list in which a large number of items are displayed, it is possible to selectively use quick scrolling by the scroll operation bar and fine position adjustment by screen scrolling.
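The branching of the FIG. 7 procedure reduces to the observation that, once steps S1 to S3 have counted the simultaneous contacts, the touch count alone selects the operation-target layer and pattern. The function below is a condensed illustrative sketch of that dispatch, not the patent's actual implementation:

```python
def dispatch(touch_count):
    """Select (layer, pattern) from the number of touch points (cf. FIG. 7)."""
    if touch_count >= 3:   # step S3: three or more points
        return (2, 3)      # layer 2, pattern 3: move an individual thumbnail
    if touch_count == 2:   # two simultaneous points
        return (4, 2)      # layer 4, pattern 2: move the scroll operation bar
    return (1, 1)          # single point: layer 1, pattern 1, scroll the list
```

Because the decision uses only the instantaneous count, no hold time has to elapse before the device knows which layer is being operated, which is the response advantage noted above.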
- the touch screen device of the second embodiment has the same configuration as that of the first embodiment, and thus the description thereof will be omitted by giving the same reference numerals. Here, an operation different from that of the first embodiment will be described.
- FIG. 8 is a table showing a setting example in which the number of contact points and the movement state of the finger are defined corresponding to each operation pattern shown in FIG. 4 as the operation of the second embodiment.
- a plurality of operation patterns are switched according to the number of contact points of the finger (the number of fingers to be operated) and the state of the contact point.
- the movement state (presence or absence of movement, movement direction) of each contact point is used as the state of the contact point.
- an operation pattern is defined according to the number of contact points of the finger detected by the operation detection unit 25 in contact with the touch panel display 20 and the moving direction of each contact point. That is, when the number of finger contacts is 1 (one-finger operation), the operation of pattern 1 is executed as an operation pattern. When the number of contact points of the finger is 2 (operation with two fingers) and the two contact points move in the same direction, the operation of pattern 2 is executed as the operation pattern. When the number of contact points of the finger is 2 (operation with two fingers) and one contact point is fixed and the other contact point is moved, the operation of pattern 3 is executed as an operation pattern.
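As a sketch, the FIG. 8 rule can be expressed over per-contact movement states. Representing each contact by its (dx, dy) displacement and using a dot product as the "same direction" test are illustrative assumptions not specified in the text:

```python
def select_pattern(deltas):
    """Pick the operation pattern from contact count and movement state.

    `deltas` is a list of (dx, dy) displacements, one per contact point;
    (0, 0) means the finger is held fixed. Returns the pattern number,
    or None when no pattern is defined for the input.
    """
    moving = [d for d in deltas if d != (0, 0)]
    if len(deltas) == 1:
        return 1                           # one finger: list screen operation
    if len(deltas) == 2:
        if len(moving) == 2:
            (dx0, dy0), (dx1, dy1) = deltas
            if dx0 * dx1 + dy0 * dy1 > 0:  # both move in the same direction
                return 2                   # scroll bar operation
        elif len(moving) == 1:
            return 3                       # one fixed, one moves: thumbnail
    return None
```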
- Here, pattern 3, which differs from the first embodiment, will be described.
- FIG. 9 is a view showing a specific example of the operation on the screen in the second embodiment.
- FIG. 9 shows a pattern 3 of the operation pattern.
- the pattern 3 is a thumbnail-specific operation.
- An operation instruction is performed with two fingers: one finger 60a is held fixed in the contact state, the other finger 60b touches the thumbnail 16 of "E14", and this finger 60b is then slid, whereupon only the operation-target thumbnail 16 of "E14" moves in the same direction.
- At this time, the thumbnail list base 15, the entire scroll bar 13, and the scroll operation bar 14 arranged in layers 1, 3, and 4 remain fixed.
- As a modification, the thumbnail list base 15 and the scroll operation bar 14 may move in the same direction, as a movement associated with layer 2.
- In that case as well, the entire scroll bar 13 arranged in layer 3 remains fixed.
- Besides "moving" the thumbnail icon by dragging, operations such as "opening" or "transferring" it by a flick may also be performed.
- FIG. 10 is a flow chart showing an operation procedure of the touch screen device in the second embodiment.
- the program for executing the process of the touch screen device is included in a part of the photo management program, is stored as the application program 31 in the storage unit 30, and is repeatedly executed by the control unit 10.
- control unit 10 waits until the first touch point is detected by the operation detection unit 25 (step S21).
- the control unit 10 determines whether there is another contact point in contact simultaneously (step S22). When another contact point is detected, the control unit 10 determines whether or not there is a movement instruction at only one touch point (step S23). When there is a movement instruction for only one touch point, it is a movement instruction for pattern 3. In this case, the control unit 10 selects the layer 2, determines the thumbnail to be operated from the touch point coordinates (step S24), and performs a moving operation on the thumbnail icon of pattern 3 (step S25). After this, the control unit 10 ends this operation.
- When there is no other contact point, the control unit 10 selects layer 1 and determines whether or not there is an operation instruction (movement instruction) such as movement of the contact point (step S26). When there is no operation instruction, the control unit 10 cancels the selection of layer 1 and returns to the process of step S22. On the other hand, when there is an operation instruction, the control unit 10 performs a scroll operation on the entire list screen of pattern 1 (step S27). After this, the control unit 10 ends this operation.
- When there is no movement instruction at only one touch point, the control unit 10 determines whether or not movement instructions are issued at both touch points simultaneously (step S28). When both movement instructions are issued simultaneously, it is a movement instruction of pattern 2. In this case, the control unit 10 selects layer 4 and determines the target component (here, the scroll operation bar 14) from the coordinates of the touch point (step S29). Then, the control unit 10 performs an operation on the scroll operation bar 14 of pattern 2 (step S30). After this, the control unit 10 ends this operation.
- When movement instructions are not issued at both points simultaneously in step S28, the control unit 10 determines whether or not the touch at one point has been released (step S31). If not released, the control unit 10 returns to the process of step S23. On the other hand, when the contact at one point is released, the control unit 10 returns to the process of step S26.
- As described above, in the second embodiment, when pattern 3 is operated as the operation pattern, one of the two fingers in contact is held fixed and the other finger is moved.
- a plurality of operation patterns can be selectively switched according to the number of contact points and the movement state of each contact point. For example, as a layer of the display object to be operated, a layer in which the entire item is arranged and a layer in which the individual item is arranged can be selectively switched.
- In addition, the number of fingers can be reduced to two, as compared with the first embodiment, which uses three fingers. Therefore, the problem that the area spanned by the touching fingers (the area covering the plurality of touch points) spreads so that the thumbnail cannot be specified well is resolved. Further, since only one finger rests on the thumbnail icon to be operated, the visibility of the icon can be ensured.
- the touch screen device of the third embodiment has the same configuration as that of the first embodiment, and thus the description thereof will be omitted by giving the same reference numerals.
- the third embodiment shows a case where a window displayed on a touch panel display as a user interface (UI) is operated instead of the operation of the thumbnail list screen in the first and second embodiments.
- FIG. 11 is a table showing a setting example in which an operation as a UI with respect to the number of touch points of a finger is defined as the operation of the third embodiment.
- In FIG. 11, an operation on the UI is defined according to the number of finger touch points detected by the operation detection unit 25 on the touch panel display 20. When one finger touches (one-finger operation), the active upper layer is operated. When two fingers touch (two-finger operation), the inactive lower layer is operated.
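The FIG. 11 setting can be expressed as a simple lookup. The sketch below is illustrative only; the function name and the `"upper"`/`"lower"` return values are assumptions, not identifiers from the patent.

```python
def select_window_layer(touch_count: int) -> str:
    """Map the number of detected finger touch points to the UI layer
    to operate, per the FIG. 11 setting example (sketch only)."""
    if touch_count == 1:
        return "upper"  # one-finger operation: active upper layer
    if touch_count == 2:
        return "lower"  # two-finger operation: inactive lower layer
    raise ValueError(f"no operation defined for {touch_count} touch points")
```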
- FIG. 12 is a view showing a specific example of the operation on the screen in the third embodiment.
- the touch panel display 20 displays windows of two UIs.
- the first UI (G11) 71 is disposed in the lower layer, and the second UI (H12) 72 is disposed in the upper layer.
- the upper layer is in the active state (valid) and the lower layer is in the inactive state (invalid).
- FIG. 13 is a flowchart showing an operation procedure of the touch screen device in the third embodiment.
- A program for executing this process of the touch screen device is included among the programs executed under a multitasking environment, is stored as an application program 31 in the storage unit 30, and is repeatedly executed by the control unit 10.
- control unit 10 waits until the first touch point is detected by the operation detection unit 25 (step S41).
- The control unit 10 then determines whether there is a second touch point in contact at the same time (step S42). If there is a second touch point, the control unit 10 selects the lower layer and determines whether or not there is an operation instruction such as movement of the touch points (step S43).
- When there is an operation instruction, the control unit 10 performs the operation on the inactive lower layer (step S44). After this, the control unit 10 ends this operation.
- When there is no second touch point in step S42, the control unit 10 selects the upper layer and determines whether or not there is an operation instruction such as movement of the touch point (step S45). If there is no operation instruction, the control unit 10 cancels the selection of the upper layer and returns to the process of step S42. On the other hand, when there is an operation instruction, the control unit 10 performs the operation on the active upper layer (step S46). After this, the control unit 10 ends this operation.
- When there is no operation instruction in step S43, the control unit 10 cancels the selection of the lower layer and determines whether or not the touch at one point has been released (step S47). If not released, the control unit 10 returns to the process of step S43. On the other hand, when it has been released, the control unit 10 returns to the process of step S45.
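The flow of steps S41 to S47 can be approximated as a small simulation over sampled touch states. This is a sketch under stated assumptions: the `(touch_count, has_instruction)` tuples and the string return values are illustrative, and the release-handling branch of step S47 is folded into re-sampling.

```python
def run_fig13_flow(samples):
    """Simulate the FIG. 13 procedure over (touch_count, has_instruction)
    samples. Returns the layer that receives the operation, or None if
    the sample stream ends first. Illustrative sketch, not patent code.
    """
    for touch_count, has_instruction in samples:
        if touch_count == 0:
            continue                                      # S41: wait for a touch
        layer = "lower" if touch_count >= 2 else "upper"  # S42: second point?
        if has_instruction:
            return layer                                  # S44 / S46: operate, end
        # No instruction: deselect the layer and re-examine (S43/S45/S47 loop).
    return None
```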
- As described above, the layer of the display object to be operated (here, the window) can be selectively switched according to the number of touch points of the fingers performing the touch operation, and operations such as movement can then be performed.
- the operation on the UI of the inactive lower layer can be directly performed only by performing the touch operation with two fingers. This makes the operation extremely easy.
- In the third embodiment, two UI windows (display objects) are arranged in two layers, upper and lower, but the same applies when three or more windows (display objects) are arranged in three or more layers.
- In the above embodiments, display objects such as the thumbnail list base, the thumbnails, the entire scroll bar, the scroll operation bar, and the UI operation buttons are simply displayed on the screen of the touch panel display.
- To show the user clearly which layer each object is arranged in, these display objects may be shown using the display modes of the following modifications.
- In the first modification, the display objects are not displayed simultaneously in a superimposed state on the touch panel display, but are displayed sequentially, layer by layer, with a time difference, like a flip-book animation.
- Specifically, the display objects are displayed one layer at a time with a time difference, so that the display objects of the upper layers are superimposed one by one onto the display objects of the lower layers.
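Such a time-difference display could be driven by a simple schedule like the following. This is a hypothetical sketch: the function name and the 150 ms interval are assumptions for illustration.

```python
def layer_reveal_schedule(num_layers: int, interval_ms: int = 150):
    """Return (layer_index, start_time_ms) pairs so that layers are drawn
    one at a time from the bottom layer upward, like a flip-book.
    The 150 ms interval is an illustrative assumption."""
    return [(layer, layer * interval_ms) for layer in range(num_layers)]
```

A rendering loop would draw each layer when its start time elapses, leaving previously drawn layers on screen so the stack builds up visibly.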
- FIG. 14 is a view showing a second modification of the display mode.
- the second modified example of the display mode is an example of a thumbnail list screen in which identification of layers is facilitated.
- display objects such as icons and parts are three-dimensionally displayed on a layer-by-layer basis in the thumbnail list screen.
- the user can easily recognize the distinction between the plurality of superimposed layers, the hierarchical relationship of layers, and the like.
- FIG. 15 is a view showing a third modification of the display mode.
- the third modified example of the display mode is another example of the thumbnail list screen in which identification of layers is facilitated.
- In the third modification, the display mode is changed on a per-layer basis in the thumbnail list screen.
- For example, the frame line of the thumbnail 16 is displayed as a broken line, the frame line of the entire scroll bar 13 as a solid line, and the frame line of the scroll operation bar 14 as a wide broken line. By changing the line style of the frame in this manner, the user can identify the layer in which each display object is arranged.
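The per-layer line styles could be kept in a small lookup table, as sketched below. The key and style names are illustrative assumptions, not identifiers from the patent.

```python
# Frame line style per display object, as in the third modification.
# Key and value names are illustrative, not taken from the patent.
LAYER_FRAME_STYLE = {
    "thumbnail_16": "broken",                   # thumbnails: broken line
    "entire_scroll_bar_13": "solid",            # scroll bar track: solid line
    "scroll_operation_bar_14": "wide-broken",   # operation bar: wide broken line
}

def frame_style(component: str) -> str:
    """Look up the frame line style used to draw a display object."""
    return LAYER_FRAME_STYLE[component]
```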
- FIG. 16 is a view showing a fourth modification of the display mode.
- The fourth modification of the display mode is an example of a guidance screen that displays an explanation of the operation method.
- the guide display regarding the layer of the display object is arranged.
- the user can confirm the display object description, the operation method of the display object of each layer, and the like for each layer.
- guidance of the operation method is displayed on the touch panel display by performing a predetermined touch operation, for example.
- the user can easily understand the selection of the layer to be operated, and the usability is improved.
- As described above, in the embodiments, the layer to be operated is selected according to the number of touch points, and an operation instruction is given by changing the touch points on a display object arranged in the selected layer.
- the user can reliably operate the operation target without erroneous operation.
- the contact time is not used for the determination of the operation switching, so that the operation response can be improved.
- In the embodiments, the change of the touch points refers to movement of the fingers performing the touch operation.
- However, the present invention is not limited to such an operation instruction; it is also possible, for example, to enlarge or reduce the contact area.
- For example, in the second embodiment, layer 4 is selected as the operation target when both fingers are moved, and layer 2 is selected as the operation target when one finger is fixed and the other finger is moved.
- the layer to be operated may be switched between the case where two touching fingers are in proximity and the case where they are separated by a predetermined distance or more.
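The proximity-based switching mentioned above could be sketched with a distance threshold. The function name, return values, and the 100-pixel threshold are illustrative assumptions.

```python
import math

def layer_by_finger_distance(p1, p2, threshold: float = 100.0) -> str:
    """Switch the operation-target layer depending on whether the two
    touching fingers are close together or separated by at least
    `threshold` (in pixels). Threshold is an illustrative assumption."""
    distance = math.dist(p1, p2)  # Euclidean distance between touch points
    return "separated-layer" if distance >= threshold else "proximate-layer"
```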
- the touch screen device of the present invention is not limited to portable terminal devices such as mobile phones and smart phones, and can be mounted on various electronic devices.
- The present invention is also applicable to a program that realizes each function of the above embodiments, supplied to the touch screen device via a network or various storage media and read and executed by the computer of the touch screen device.
- A touch screen device having a touch panel display capable of touch operation, comprising: a display information holding unit that holds information on display objects defined by being divided into a plurality of layers; a display processing unit that arranges the display objects for each layer and displays them on the screen of the touch panel display; a touch point detection unit that detects touch points of a touch operation object on the touch panel display; and an operation input execution unit that selects the layer of the display object to be operated based on the number of detected touch points and executes an operation input operation according to the touch operation of the touch operation object on a display object of the operation target layer.
- With the above configuration, display objects defined by being divided into a plurality of layers can be displayed, the layer of the display object to be operated can be selected based on the number of detected touch points, and an operation input operation can be performed. Therefore, even on a screen on which display objects are superimposed or closely arranged, the user can reliably operate the operation target without erroneous operation.
- Further, since the layer of the display object to be operated is selected according to the number of touch points, the operation response when operating the operation target, the switching of the operation target, and the like can be improved.
- the operation input execution unit selects a layer of a display object to be operated based on the number of touch points detected and the state of the touch points.
- the layer of the display object to be operated can be selected based on the number of detected touch points and the situation of the touch points, and the operation input operation can be performed.
- layers of multiple display objects can be selected with a smaller number of contact points.
- The touch screen device, wherein the operation input execution unit selects the layer of the display object to be operated based on the number of touch points together with at least one of the presence or absence of movement and the movement direction of each touch point as the state of the touch points. With the above configuration, the layer of the display object to be operated can be selected and an operation input operation can be performed according to the number of touch points, the presence or absence of movement of each touch point, and the movement direction. In this case, various operation patterns can be selected and switched with a smaller number of touch points.
- The touch screen device, wherein the display object includes a GUI component and an item indicating an operation target, and the operation input execution unit selectively switches, as the operation target layer, between the layer in which the GUI component is arranged and the layer in which the item is arranged. According to the above configuration, it is possible to selectively switch between a GUI component such as a scroll bar and an item such as an image thumbnail, and operate one display object as the operation target.
- The touch screen device, wherein the display object includes items indicating operation targets, and the operation input execution unit selectively switches, as the operation target layer, between the layer in which the entire set of items is arranged and the layer in which individual items are arranged. According to the above configuration, it is possible to selectively switch between the entire set of items and an individual item, and operate one display object as the operation target.
- The touch screen device, wherein the display object includes windows to be operated, and the operation input execution unit selectively switches, as the operation target layer, among a plurality of layers in which the windows are arranged. According to the above configuration, it is possible to selectively switch among a plurality of windows and operate one display object as the operation target. In this case, it is possible, for example, to directly manipulate an inactive window superimposed below.
- the touch screen device wherein the display processing unit displays the display object with a time difference for each layer.
- the touch screen device according to any one of the above, wherein the display processing unit three-dimensionally displays the display object when displaying the display object.
- the touch screen device according to any one of the above, wherein the display processing unit changes and displays a display mode according to a layer when the display object is displayed.
- the touch screen device according to any one of the above, wherein the display processing unit displays a guide display on a layer of the display object.
- A touch operation input method in a touch screen device having a touch panel display capable of touch operation, the method comprising: arranging display objects defined by being divided into a plurality of layers for each layer and displaying them on the screen of the touch panel display; detecting touch points of a touch operation object on the touch panel display; selecting the layer of the display object to be operated based on the number of detected touch points; and performing an operation input operation according to the touch operation of the touch operation object on a display object of the operation target layer.
- A program that causes a computer to execute each step of the above touch operation input method.
- The present invention has the effect that a user can reliably operate an operation target without erroneous operation even when display objects are superimposed or closely arranged, and is useful as a touch screen device, a touch operation input method, and a program that receive input instructions and the like by touch operation.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
The invention relates to a touch screen device that arranges display objects, which have been partitioned and defined into a plurality of layers, in each layer and displays the display objects on the screen of a touch panel display (20). In the present invention, the number of touch points (the number of fingers (60) the user employs for a touch operation) is detected, and the layer of the display object to be operated is selected based on the number of touch points. When a movement operation is performed with one finger, the thumbnail list base (15) in layer (1) and the thumbnail (16) in layer (2) move in the same direction. When a movement operation is performed with two fingers, the scroll operation bar (14) in layer (4) moves, and the thumbnail list base (15) in layer (1) and the thumbnail (16) in layer (2) move in the opposite direction.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2011-117219 | 2011-05-25 | ||
| JP2011117219A JP2012247861A (ja) | 2011-05-25 | 2011-05-25 | タッチスクリーン装置、タッチ操作入力方法及びプログラム |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2012160829A1 true WO2012160829A1 (fr) | 2012-11-29 |
Family
ID=47216919
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2012/003407 Ceased WO2012160829A1 (fr) | 2011-05-25 | 2012-05-24 | Dispositif à écran tactile, procédé d'entrée d'opération tactile et programme |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP2012247861A (fr) |
| WO (1) | WO2012160829A1 (fr) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN108628916A (zh) * | 2017-03-17 | 2018-10-09 | 株式会社Pfu | 缩略图显示装置、缩略图显示装置的控制方法以及存储介质 |
Families Citing this family (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR102133410B1 (ko) | 2013-01-31 | 2020-07-14 | 삼성전자 주식회사 | 멀티태스킹 운용 방법 및 이를 지원하는 단말기 |
| JP5805685B2 (ja) * | 2013-02-27 | 2015-11-04 | 京セラ株式会社 | 電子機器、制御方法、及び制御プログラム |
| JP2014174666A (ja) * | 2013-03-07 | 2014-09-22 | Ricoh Co Ltd | 画像処理システム、及び画像処理装置 |
| US9589539B2 (en) | 2014-04-24 | 2017-03-07 | Kabushiki Kaisha Toshiba | Electronic device, method, and computer program product |
| WO2016102296A2 (fr) * | 2014-12-22 | 2016-06-30 | Volkswagen Ag | Barre tactile et son application |
| JP2017157162A (ja) | 2016-03-04 | 2017-09-07 | パナソニックIpマネジメント株式会社 | 操作装置、照明システム及びプログラム |
| WO2017183652A1 (fr) | 2016-04-19 | 2017-10-26 | 日立マクセル株式会社 | Dispositif de terminal portable |
| CN108021258A (zh) * | 2016-10-31 | 2018-05-11 | 北京小米移动软件有限公司 | 终端设备误操作处理方法及装置 |
| JP7157104B2 (ja) * | 2020-06-25 | 2022-10-19 | マクセル株式会社 | 表示方法 |
| WO2023106606A1 (fr) | 2021-12-06 | 2023-06-15 | 삼성전자주식회사 | Serveur en nuage supportant une édition collaborative entre des dispositifs électroniques et son procédé de fonctionnement |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2003271432A (ja) * | 2002-03-19 | 2003-09-26 | Toshiba Corp | ファイル情報提示装置および方法およびプログラム |
| JP2008508600A (ja) * | 2004-07-30 | 2008-03-21 | アップル インコーポレイテッド | タッチ・センシティブ入力デバイスのためのモード・ベースのグラフィカル・ユーザ・インタフェース |
| WO2010008088A1 (fr) * | 2008-07-17 | 2010-01-21 | 日本電気株式会社 | Appareil de traitement de l’information, moyen de stockage sur lequel un programme a été enregistré et procédé de modification d’objet |
- 2011-05-25: JP JP2011117219A patent/JP2012247861A/ja not_active Withdrawn
- 2012-05-24: WO PCT/JP2012/003407 patent/WO2012160829A1/fr not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| JP2012247861A (ja) | 2012-12-13 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2012160829A1 (fr) | Dispositif à écran tactile, procédé d'entrée d'opération tactile et programme | |
| CN108509115B (zh) | 页操作方法及其电子装置 | |
| JP5691464B2 (ja) | 情報処理装置 | |
| US10168864B2 (en) | Gesture menu | |
| EP2472384B1 (fr) | Modèle d'événement tactile | |
| US20090109187A1 (en) | Information processing apparatus, launcher, activation control method and computer program product | |
| CN104520798B (zh) | 便携电子设备及其控制方法和程序 | |
| US9280265B2 (en) | Input control device, input control method, and input control program for controlling display target upon receiving input on display screen of display device | |
| US20160062467A1 (en) | Touch screen control | |
| US20110283212A1 (en) | User Interface | |
| US20130100051A1 (en) | Input control device, input control method, and input control program for controlling display target upon receiving input on display screen of display device | |
| JP5780438B2 (ja) | 電子機器、位置指定方法及びプログラム | |
| JP5848732B2 (ja) | 情報処理装置 | |
| JPWO2010032354A1 (ja) | 画像オブジェクト制御システム、画像オブジェクト制御方法およびプログラム | |
| JP6458751B2 (ja) | 表示制御装置 | |
| US20150169122A1 (en) | Method for operating a multi-touch-capable display and device having a multi-touch-capable display | |
| CN103324389A (zh) | 智能终端应用程序的操作方法 | |
| US20130100050A1 (en) | Input control device, input control method, and input control program for controlling display target upon receiving input on display screen of display device | |
| JP2014203202A (ja) | 情報処理装置、情報処理装置の制御方法、およびプログラム | |
| KR101260016B1 (ko) | 스킨형 인터페이스를 이용한 포인터 인터페이스 구현 방법 및 이를 구현하는 터치스크린 기기 | |
| KR20150098366A (ko) | 가상 터치패드 조작방법 및 이를 수행하는 단말기 | |
| JP6112147B2 (ja) | 電子機器、及び位置指定方法 | |
| JP2023125444A (ja) | 情報処理装置、及び情報処理方法 | |
| JP2015072561A (ja) | 情報処理装置 | |
| KR20150107255A (ko) | 이동통신 단말기의 사용자 인터페이스 제공 방법 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12789160; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 12789160; Country of ref document: EP; Kind code of ref document: A1 |