
WO2017154119A1 - Display control device, display control method, and display control program - Google Patents

Display control device, display control method, and display control program

Info

Publication number
WO2017154119A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
indicator
information
display information
icon
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2016/057240
Other languages
English (en)
Japanese (ja)
Inventor
桃野 一世
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujitsu Ltd
Original Assignee
Fujitsu Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujitsu Ltd filed Critical Fujitsu Ltd
Priority to JP2018503902A priority Critical patent/JPWO2017154119A1/ja
Priority to PCT/JP2016/057240 priority patent/WO2017154119A1/fr
Publication of WO2017154119A1 publication Critical patent/WO2017154119A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • The present invention relates to a display control device, a display control method, and a display control program.
  • In recent years, information processing terminals with touch panels, such as smartphones, have spread rapidly, and their user base has widened from young people to the elderly. Among these users, many are not accustomed to operating a touch panel.
  • Such an information processing terminal accepts operations on icons and executes various processes. For example, it activates the corresponding application when an icon displayed on the touch panel is selected.
  • A technique is known for receiving a finger gesture on a touch panel and executing a process according to the received gesture.
  • A technique is also known in which an icon on the touch panel is selected by pressing it with a finger, and the selected icon is then moved freely on the touch panel.
  • With these techniques, however, the operability of icon operations is poor. For example, while an icon is selected by pressing, the icon and the finger overlap, so the icon is hard to see during the selection. For elderly users in particular, because the icons themselves are small, it is difficult to confirm whether the intended icon has been selected.
  • An object of one aspect is to provide a display control device, a display control method, and a display control program that can improve the operability of icon operations.
  • In one aspect, the display control device has a processor and a memory.
  • For display information shown on a display unit, the processor receives, via a touch panel, the selection of display information to be moved by an indicator, and predicts the movement direction of the indicator based on the contact between the touch panel and the indicator.
  • The processor then specifies an area where the display information and the indicator do not overlap, based on the predicted movement direction of the indicator, and displays substitute information for the display information in the specified area.
  • According to one aspect, the operability of icon operations can be improved.
  • FIG. 1 is a schematic diagram illustrating an information processing terminal according to the first embodiment.
  • FIG. 2 is a schematic diagram illustrating an example of a hardware configuration of the information processing terminal according to the first embodiment.
  • FIG. 3 is a functional block diagram of a functional configuration example of the information processing terminal according to the first embodiment.
  • FIG. 4 is a diagram illustrating an example of information stored in the direction pattern DB.
  • FIG. 5 is a diagram illustrating the estimation of the operating hand.
  • FIG. 6 is a diagram for explaining prediction of the finger movement direction.
  • FIG. 7 is a diagram for explaining a search pattern.
  • FIG. 8 is a flowchart showing the flow of processing.
  • FIG. 9 is a flowchart showing the flow of the finger direction prediction process.
  • FIG. 10 is a diagram illustrating a display example of a balloon-type icon.
  • FIG. 11 is a diagram illustrating a display example of a transmissive icon.
  • FIG. 1 is a diagram illustrating the information processing terminal 10 according to the first embodiment.
  • The information processing terminal 10 illustrated in FIG. 1 is an example of a terminal having a touch panel, such as a smartphone, a mobile phone, or a tablet terminal.
  • The information processing terminal 10 displays application icons on a display screen such as an LCD (Liquid Crystal Display) screen.
  • The information processing terminal 10 has a touch panel 10a superimposed on the LCD screen and accepts icon selections on the touch panel 10a.
  • In the following description, with the information processing terminal held upright, the major axis direction is the Y axis (vertical axis) and the minor axis direction is the X axis (horizontal axis). When the terminal is rotated so that its home button is to the left or right, the long axis direction becomes the X axis (horizontal axis) and the short axis direction becomes the Y axis (vertical axis).
  • When the information processing terminal 10 accepts the selection of an icon on the touch panel 10a, it executes the application corresponding to the selected icon. For example, when the terminal detects a finger touching the touch panel 10a, it identifies the icon displayed at the touched position. In this way, the terminal detects icon selection by an indicator such as a finger. In addition, when the terminal detects a long press on an icon on the touch panel 10a, it can place the long-pressed icon in a selected state and accept movement of that icon.
  • Such an information processing terminal 10 has a processor and a memory.
  • The processor of the information processing terminal 10 accepts, via the touch panel 10a, the selection of display information to be moved by a finger, from among display information such as icons displayed on the LCD screen.
  • The processor of the information processing terminal 10 then predicts the moving direction of the finger based on the contact between the touch panel 10a and the finger. Thereafter, it specifies an area where the icon and the finger do not overlap, based on the predicted moving direction, and displays substitute information for the icon in the specified area.
  • As illustrated in FIG. 1, the information processing terminal 10 has proximity sensors 10b and 10c on the left and right sides of the screen, and icons A to T are displayed on the LCD screen.
  • When the proximity sensor 10c detects a hand, the information processing terminal 10 determines that the operation is being performed with the right hand.
  • When the information processing terminal 10 then detects that the icon J is in the selected state, it determines from the finger's contact state that the finger is moving to the right.
  • In this way, the information processing terminal 10 can display a substitute for the selected icon in a region where the icon and the finger do not overlap, preventing the selected icon from becoming hard to see while it is being moved. As a result, the information processing terminal 10 can improve the operability of icon operations.
  • FIG. 2 is a schematic diagram illustrating an example of a hardware configuration of the information processing terminal 10 according to the first embodiment.
  • As illustrated in FIG. 2, the information processing terminal 10 includes a wireless unit 1, a proximity sensor 2, an audio input/output unit 3, a memory 4, a touch panel 5, an LCD screen 6, and a processor 20.
  • Other hardware, such as an acceleration sensor, may also be included.
  • The wireless unit 1 performs wireless communication with base stations and other information processing terminals via the antenna 1a.
  • The proximity sensor 2 corresponds to the proximity sensors 10b and 10c shown in FIG. 1 and detects the hand operating the touch panel.
  • A plurality of proximity sensors 2 are installed adjacent to the touch panel 10a in order to detect whether the right hand or the left hand is in use. For example, when the touch panel 10a is operated with the left hand, the palm of the left hand passes over or covers the corresponding sensor; when it is operated with the right hand, the palm of the right hand does.
  • The audio input/output unit 3 outputs sound from a speaker and executes various processes on sound collected by a microphone.
  • The memory 4 is an example of a storage device and stores, for example, programs and data.
  • The touch panel 5, superimposed on the LCD screen 6, is an input unit that receives user operations and outputs the operated position (coordinates) to the processor 20.
  • The touch panel 5 corresponds to the touch panel 10a shown in FIG. 1; various methods, such as a capacitive method or an electromagnetic induction method, can be adopted.
  • The LCD screen 6 is an example of a display unit that displays various types of information.
  • The processor 20 is a processing unit that controls the processing of the entire information processing terminal 10 and is, for example, a CPU (Central Processing Unit).
  • The processor 20 executes an OS (Operating System).
  • The processor 20 reads a program stored on a hard disk or the like, loads it into the memory 4, and executes it, thereby realizing each functional unit described later.
  • FIG. 3 is a functional block diagram of a functional configuration example of the information processing terminal 10 according to the first embodiment.
  • As illustrated in FIG. 3, the information processing terminal 10 includes a direction pattern DB 21 and a search pattern DB 22, and the processor 20 includes a touch panel control unit 23, a selection receiving unit 24, an estimation unit 25, a prediction unit 26, and a display control unit 27.
  • The direction pattern DB 21 and the search pattern DB 22 are stored in a storage device such as a hard disk or a memory.
  • The touch panel control unit 23, the selection receiving unit 24, the estimation unit 25, the prediction unit 26, and the display control unit 27 are examples of electronic circuits included in the processor 20 or of processes executed by the processor 20.
  • The direction pattern DB 21 is a database that stores contact patterns between a finger and the touch panel 5, used to determine whether the operating hand is the right hand or the left hand. The determination is made by comparing the stored patterns against the actual contact positions between the finger and the touch panel.
  • FIG. 4 is a diagram illustrating an example of information stored in the direction pattern DB 21.
  • The direction pattern DB 21 stores each pattern in association with a determination result.
  • For example, when the contact matches a pattern such as pattern 1, the operation is judged to be right-handed (right hand).
  • When the contact matches pattern 4 (contact at the upper right and the lower left), pattern 5 (contact at areas other than the lower right), or pattern 6 (contact at areas other than the upper left), the operation is judged to be left-handed (left hand).
  • The search pattern DB 22 is a database that stores search patterns for finding, on the LCD screen, an area in which to display a balloon-type icon. As will be detailed later, the search pattern DB 22 stores, for each of the right hand and the left hand, a search pattern associated with the moving direction of the finger.
  • The touch panel control unit 23 is a processing unit that executes various processes linking operations on the touch panel 5 with the information displayed on the LCD screen. Specifically, it performs processing similar to general touch panel control and accepts, on the touch panel 5, selections of icons displayed on the LCD screen. For example, the touch panel control unit 23 manages the coordinates on the touch panel 5 in association with the coordinates and pixels of the LCD screen and detects which icon has been selected. It outputs the selected position information, coordinate information, icon information, and the like to the selection receiving unit 24, the estimation unit 25, and the prediction unit 26.
  • The selection receiving unit 24 is a processing unit that receives icon selections on the touch panel 5 and outputs the received information to the display control unit 27. For example, when a position on the touch panel 5 is touched continuously for a certain time (for example, 2 seconds or more), the selection receiving unit 24 places the icon at the touched position in the selected state. The selected icon then moves in conjunction with the finger until the finger leaves the panel, and is placed at the position where contact ended.
  • The estimation unit 25 is a processing unit that estimates whether the finger selecting the icon belongs to the right hand or the left hand and outputs the estimation result to the display control unit 27. For example, when a hand is detected by the proximity sensor 10b, the estimation unit 25 estimates the left hand; when a hand is detected by the proximity sensor 10c, it estimates the right hand. When dominant-hand information has been set in advance by the user, the estimation unit 25 can give priority to that setting.
  • FIG. 5 is a diagram illustrating the estimation of the operating hand.
  • As illustrated in FIG. 5, the touch panel 5 is divided into six regions 1 to 6 in the X-axis direction and six regions a to f in the Y-axis direction, matching the division of the LCD screen.
  • A single region is determined by specifying an X-axis position and a Y-axis position.
  • A notation such as "3a" denotes the region at 3 on the X axis and a on the Y axis.
  • The estimation unit 25 acquires capacitance values, pressure values, and the like of the touch panel 5 via the touch panel control unit 23 and, because the values at "3c" and "4d" are at or above a threshold, determines "3c" and "4d" to be contact areas. Referring to the direction pattern DB 21, the estimation unit 25 determines that this contact-area pattern corresponds to pattern 1 and estimates that the operation is right-handed (right hand). The estimation unit 25 then outputs the estimation result to the display control unit 27.
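  • As a minimal Python sketch of this estimation step (the pattern table, the sensing threshold, and all function names below are illustrative assumptions; the actual six patterns are those of FIG. 4):

```python
# Illustrative sketch of the estimation unit (25): match the set of grid
# cells whose sensed value exceeds a threshold against stored direction
# patterns to decide the operating hand.

# Hypothetical direction-pattern DB; FIG. 4 defines the real patterns.
DIRECTION_PATTERNS = {
    frozenset({"3c", "4d"}): "right",  # pattern 1: upper left toward lower right
    frozenset({"4c", "3d"}): "left",   # assumed mirror pattern for the left hand
}

def contact_cells(samples, threshold=0.5):
    """Cells (e.g. '3c') whose capacitance/pressure is at or above threshold."""
    return frozenset(cell for cell, value in samples.items() if value >= threshold)

def estimate_hand(samples, preset=None):
    """Return 'right' or 'left'; a user-set dominant hand takes priority."""
    if preset is not None:
        return preset
    return DIRECTION_PATTERNS.get(contact_cells(samples))  # None if no match

# Worked example from FIG. 5: "3c" and "4d" exceed the threshold,
# matching pattern 1, so the right hand is estimated.
print(estimate_hand({"3c": 0.8, "4d": 0.9, "2b": 0.1}))  # -> right
```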
  • The prediction unit 26 is a processing unit that predicts the moving direction of the finger that is selecting an icon and outputs the prediction result to the display control unit 27. Specifically, when the contact surface moves within a predetermined time (for example, within 2 ms) after the estimation unit 25 has estimated the operating hand, the prediction unit 26 predicts the direction of movement based on the contact-surface region, the amount of movement, and so on.
  • FIG. 6 is a diagram for explaining prediction of the moving direction of the finger.
  • The touch panel 5 shown in FIG. 6 is divided, like that of FIG. 5, into six regions 1 to 6 in the X-axis direction and six regions a to f in the Y-axis direction, matching the division of the LCD screen; specifying an X-axis position and a Y-axis position determines one region. Note that the regions of the LCD screen and the regions of the touch panel 5 are synchronized and denote the same areas.
  • Here, the finger 10d and the touch panel 5 are in contact on the contact surface Q2; that is, "3c" and "4d" are specified as contact areas.
  • Next, the prediction unit 26 acquires "4c" and "5d" as the contact areas between the touch panel 5 and the finger via the touch panel control unit 23.
  • The prediction unit 26 focuses on the contact area P2 ("3c") at the point of estimation by the estimation unit 25; since the contact area at the next point is P3 ("4c"), it determines that the contact area has moved from P2 to P3.
  • Because the contact area moved by 1 in the X-axis direction and by 0 in the Y-axis direction, the prediction unit 26 determines that the touch position has moved in the positive X direction, that is, to the right, and predicts the finger's moving direction as rightward.
  • Similarly, when the contact area moves by -1 in the X-axis direction, the prediction unit 26 predicts that the finger is moving to the left; when the contact area moves by 1 in the X-axis direction and by -3 in the Y-axis direction, it predicts the finger's moving direction as downward. When the contact area moves in both the X-axis and Y-axis directions, the prediction unit 26 adopts the axis with the larger amount of movement. When the estimation unit 25 yields a plurality of estimation points, the prediction unit 26 focuses on an arbitrary region and performs the same processing as above.
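  • The direction calculation itself reduces to a delta between grid cells. A sketch, assuming the grid naming of FIGS. 5 and 6 and adopting the text's convention that a negative Y delta means downward:

```python
# Illustrative sketch of the prediction unit (26): derive the moving
# direction from the shift of the contact area between two samples.

def cell_to_xy(cell):
    """Convert a grid cell name like '3c' to numeric (x, y) indices."""
    return int(cell[0]), ord(cell[1]) - ord("a")

def predict_direction(prev_cell, curr_cell):
    """Return 'right', 'left', 'up', or 'down' from the cell delta."""
    (x0, y0), (x1, y1) = cell_to_xy(prev_cell), cell_to_xy(curr_cell)
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):                 # the larger movement wins
        return "right" if dx >= 0 else "left"
    return "down" if dy < 0 else "up"      # negative Y delta = downward (assumed)

# Worked example from FIG. 6: P2 = "3c" -> P3 = "4c" gives dx=1, dy=0.
print(predict_direction("3c", "4c"))       # -> right
# Moving +1 in X and -3 in Y: |dy| dominates, so downward.
print(predict_direction("3d", "4a"))       # -> down
```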
  • The display control unit 27 is a processing unit that includes a vacancy detection unit 28 and a display output unit 29, and that displays a substitute for the selected, movement-target icon in an area where no other icon is displayed.
  • The vacancy detection unit 28 is a processing unit that identifies an area where the selected icon and the finger 10d do not overlap, based on the movement direction of the finger 10d predicted by the prediction unit 26. Specifically, the vacancy detection unit 28 acquires the movement direction of the finger 10d from the prediction unit 26 and looks up the corresponding search pattern in the search pattern DB 22. It then searches the LCD screen for a free area according to the specified search pattern.
  • FIG. 7 is a diagram for explaining a search pattern.
  • Here, a right-handed example will be described.
  • The search patterns are classified into eight types.
  • In this example, the touch panel 5 is divided into seven areas 1 to 7 in the X-axis direction and seven areas a to g in the Y-axis direction, matching the division of the LCD screen. Note that the number of search-pattern classifications and the number of screen divisions are not limited to these and can be subdivided arbitrarily.
  • In pattern (1), as shown in (1) of FIG. 7, the vacancy detection unit 28 specifies the 27 areas "1a to 7a, 1b to 6b, 1c to 5c, 1d to 3d, 1e to 3e, 1f to 2f, 1g" as areas that do not overlap with the finger.
  • In pattern (2), as shown in (2) of FIG. 7, the vacancy detection unit 28 specifies the 33 areas "1a to 7a, 1b to 7b, 1c to 7c, 1d to 3d, 1e to 3e, 1f to 3f, 1g to 3g" as areas that do not overlap with the finger.
  • In pattern (3), as shown in (3) of FIG. 7, the vacancy detection unit 28 specifies 33 areas similar to those of pattern (2).
  • In pattern (4), as shown in (4) of FIG. 7, the vacancy detection unit 28 specifies the areas "1a to 7a, 1b to 7b, 1c to 7c, 6d to 7d, 7e" as areas that do not overlap with the finger.
  • In pattern (5), as shown in (5) of FIG. 7, the vacancy detection unit 28 specifies the areas "1a to 5a, 1b to 4b, 1c to 3c, 1d to 2d, 1e to 2e, 1f to 3f, 1g to 4g" as areas that do not overlap with the finger.
  • In pattern (6), as shown in (6) of FIG. 7, the vacancy detection unit 28 specifies 33 areas similar to those of pattern (2).
  • In pattern (7), as shown in (7) of FIG. 7, the vacancy detection unit 28 specifies the 26 areas "1a to 7a, 1b to 7b, 1c to 3c, 5c to 7c, 1d to 2d, 6d to 7d, 1e, 7e" as areas that do not overlap with the finger.
  • In pattern (8), as shown in (8) of FIG. 7, the vacancy detection unit 28 specifies the 27 areas "1a to 7a, 1b to 7b, 1c to 7c, 1d to 3d, 1e to 2e, 1f" as areas that do not overlap with the finger.
  • The vacancy detection unit 28 then determines whether an icon is displayed in each corresponding area and outputs the determination result to the display output unit 29. For example, when the movement direction of the finger 10d is upward, the vacancy detection unit 28 selects search pattern (1). It then searches the 27 areas "1a to 7a, 1b to 6b, 1c to 5c, 1d to 3d, 1e to 3e, 1f to 2f, 1g" in order and identifies "4a, 3b, 4b, 1c, 2e" as areas where icons exist. The vacancy detection unit 28 then notifies the display output unit 29 that icons exist at "4a, 3b, 4b, 1c, 2e" and that no icons exist in the remaining areas.
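  • In Python, this search step might be sketched as follows (the range-expansion helper, the data layout, and the restriction to pattern (1) are assumptions for illustration):

```python
# Illustrative sketch of the vacancy detection unit (28): expand the
# search pattern for the predicted direction and scan its cells in
# order, keeping those where no icon is displayed.

def expand(spec):
    """Expand a '1a to 7a, 1g' style specification into a cell list."""
    cells = []
    for part in spec.split(", "):
        if " to " in part:
            lo, hi = part.split(" to ")
            cells += [f"{x}{lo[1]}" for x in range(int(lo[0]), int(hi[0]) + 1)]
        else:
            cells.append(part)
    return cells

SEARCH_PATTERNS = {  # direction -> ordered candidate cells (right hand)
    "up": expand("1a to 7a, 1b to 6b, 1c to 5c, 1d to 3d, 1e to 3e, 1f to 2f, 1g"),
    # the remaining seven patterns of FIG. 7 would be registered likewise
}

def find_free_areas(direction, occupied):
    """Cells of the pattern holding no icon, in search order."""
    return [c for c in SEARCH_PATTERNS[direction] if c not in occupied]

# Worked example from the text: icons occupy 4a, 3b, 4b, 1c, and 2e,
# so the remaining 22 of the 27 pattern-(1) cells are reported free.
free = find_free_areas("up", {"4a", "3b", "4b", "1c", "2e"})
print(len(free), free[:5])
```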
  • The display output unit 29 is a processing unit that displays a substitute icon for the currently selected icon in an empty area specified by the vacancy detection unit 28.
  • For example, the display output unit 29 displays the substitute icon in the area, among those other than "4a, 3b, 4b, 1c, 2e", closest to the contact surface (area) between the finger 10d and the touch panel 5.
  • In this example, the display output unit 29 displays a balloon-type icon in the "3d" area.
  • Various known methods can be used for the distance measurement; for example, the distance can be determined from the straight-line distance to the contact surface (region), from the amounts of movement along the X and Y axes, and so on.
  • The display output unit 29 can also dynamically change the size of the balloon-type icon according to the size of the nearest empty area. For example, in (1) of FIG. 7, when the areas "2c, 3c, 2d" adjacent to the nearest empty area "3d" are also empty, the display output unit 29 displays a balloon-type icon sized to span the four areas "2c, 3c, 2d, 3d". Likewise, when only the area "3c" adjacent to the nearest empty area "3d" is empty, the display output unit 29 displays a vertically long balloon-type icon spanning the two areas "3c, 3d".
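  • The selection and sizing step can be sketched as follows (Euclidean distance over grid indices stands in for one of the "various known methods"; the growth rule that merges neighbouring free cells, including the diagonal needed for a 2x2 block, is an illustrative simplification):

```python
# Illustrative sketch of the display output unit (29): pick the free
# cell nearest the finger's contact cell, then grow the balloon over
# neighbouring free cells to size it dynamically.
import math

def cell_to_xy(cell):
    return int(cell[0]), ord(cell[1]) - ord("a")

def nearest_free_cell(contact_cell, free_cells):
    cx, cy = cell_to_xy(contact_cell)
    return min(free_cells, key=lambda c: math.dist((cx, cy), cell_to_xy(c)))

def balloon_cells(anchor, free_cells):
    """Anchor plus all free cells within one step (8-neighbourhood)."""
    ax, ay = cell_to_xy(anchor)
    near = [c for c in free_cells
            if max(abs(cell_to_xy(c)[0] - ax), abs(cell_to_xy(c)[1] - ay)) == 1]
    return [anchor] + near

# Assumed example echoing FIG. 7 (1): finger at "4d", free cells nearby.
free = {"3d", "2d", "3c", "2c"}
anchor = nearest_free_cell("4d", free)                 # -> "3d"
print(anchor, balloon_cells(anchor, free - {anchor}))  # spans 2c, 3c, 2d, 3d
```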
  • FIG. 8 is a flowchart showing the flow of processing.
  • As shown in FIG. 8, when the selection receiving unit 24 detects a movement of an icon via an OS function or the like (S101: Yes), it determines whether the icon is moving (S102).
  • While the icon is in a movable, selected state, the prediction unit 26 executes the direction prediction process for the finger 10d (S103).
  • Next, the vacancy detection unit 28 acquires the current touch position (S104), acquires the moving direction of the finger 10d (S105), and specifies the search pattern corresponding to that moving direction (S106).
  • The vacancy detection unit 28 then detects empty areas according to the search pattern (S107), and the display output unit 29 identifies the empty area closest to the touch position (S108). Thereafter, the display output unit 29 displays a balloon-type icon in the identified area (S109).
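  • Tying the units together, the loop of FIG. 8 might look as follows (the `terminal` facade and its methods are hypothetical stand-ins for the OS and touch-panel interfaces; `find_free_areas` and `nearest_free_cell` are the sketches above):

```python
# Illustrative end-to-end sketch of the FIG. 8 flow (S101-S109).

def balloon_loop(terminal):
    if not terminal.icon_move_detected():                # S101
        return
    while terminal.icon_is_moving():                     # S102
        direction = terminal.predict_finger_direction()  # S103 (FIG. 9)
        touch = terminal.current_touch_cell()            # S104
        free = find_free_areas(direction,                # S105-S107
                               terminal.occupied_cells())
        target = nearest_free_cell(touch, free)          # S108
        terminal.draw_balloon_icon(target)               # S109
```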
  • FIG. 9 is a flowchart showing the flow of the direction prediction process for the finger 10d. As illustrated in FIG. 9, when the prediction unit 26 detects that the finger 10d is in contact with the touch panel 5 (S201: Yes), it determines whether this is the first contact (S202).
  • If it is the first contact, the prediction unit 26 records the first contact position (S203).
  • If it is not the first contact, the prediction unit 26 determines whether the contact occurred within a predetermined time of the previous contact (S204); if it did not, the prediction unit 26 ends the process.
  • If the contact is within the predetermined time, the prediction unit 26 records the position of that latest point (S205), calculates the difference from the initial contact position to that point, and thereby specifies the direction of the finger 10d (S206).
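  • A small state holder makes the FIG. 9 flow concrete (the 2 ms window is from the embodiment; the class and its API are illustrative, and `predict_direction` is the sketch above):

```python
# Illustrative sketch of the FIG. 9 flow (S201-S206): record the first
# contact, and if another sample arrives within the time window,
# derive the direction from the difference between the two points.

class DirectionTracker:
    WINDOW = 0.002  # seconds; "within a predetermined time, e.g. 2 ms"

    def __init__(self):
        self.first = None                    # (time, cell) of first contact

    def on_contact(self, t, cell):
        if self.first is None:               # S202: first contact?
            self.first = (t, cell)           # S203: record it
            return None
        if t - self.first[0] > self.WINDOW:  # S204: window expired
            return None                      # end without a prediction
        # S205-S206: record the latest point and take the difference
        return predict_direction(self.first[1], cell)

tracker = DirectionTracker()
tracker.on_contact(0.000, "3c")              # first contact recorded
print(tracker.on_contact(0.001, "4c"))       # within 2 ms -> "right"
```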
  • As described above, the information processing terminal 10 displays, near the position of the icon selected with the finger on the touch panel 5, a balloon-type icon with the same design as the selected icon, so the user can confirm at any time which icon is selected.
  • The information processing terminal 10 also predicts the direction of finger movement from two contact points between the touch panel 5 and the finger 10d sampled an extremely short time apart, determines a direction that the finger will not cover, selects a place around the selected icon where no other icons exist, and displays the balloon-type icon there.
  • The information processing terminal 10 can thereby prevent the visibility of an icon from deteriorating under a stylus pen or a finger, reduce the stress of user operation, and prevent erroneous operations.
  • FIG. 10 is a diagram for explaining a display example of a balloon-type icon.
  • As illustrated in FIG. 10, when the icon J is selected, the information processing terminal 10 displays a balloon-type version of the icon J in an empty area. While the icon J is moving, the terminal repeatedly predicts the direction of the finger 10d and searches for an empty area, continuing to display the balloon-type icon J so that it follows the movement of the icon J.
  • In this way, the information processing terminal 10 can improve the operability of icon operations.
  • Moreover, since the information processing terminal 10 can dynamically change the size of the balloon-type icon according to the size of the free area, user convenience improves.
  • The balloon-type icon described in the above embodiment is an example, and the present invention is not limited to it.
  • For instance, the information processing terminal 10 can display an icon corresponding to a feature of the currently selected icon; if the selected icon is a camera application, for example, it can display a camera icon.
  • That is, not only a balloon-type icon but also an icon identical to the selected one, or some other specific icon, can be displayed.
  • The information processing terminal 10 can also display the balloon-type icon at a small size according to the size of the available area. In that case, an icon reflecting a feature of the selected icon can be displayed so that the selected icon can be recognized at a glance.
  • FIG. 11 is a diagram illustrating a display example of a transmissive icon. As illustrated in FIG. 11, when the icon J is selected but no free area exists within a certain distance of the icon J, the information processing terminal 10 displays a transmissive balloon-type icon J. In this case, the terminal displays the transmissive balloon-type icon J at a position that does not lie in the moving direction of the finger 10d. Doing so prevents the substitute icon from being displayed far away from the icon being selected.
  • The information processing terminal 10 can also suppress the display of the substitute icon when the terminal screen is sufficiently large. Specifically, the terminal acquires the screen size from the OS and suppresses the substitute icon when the screen is larger than a predetermined size (for example, 10 inches or more).
  • The screen-size criterion is not limited to this method; it may be set by the user or set automatically by a learning function.
  • The information processing terminal 10 can also suppress the display of the substitute icon when the screen is being output to an external device over a wireless or wired connection. Specifically, the terminal suppresses the display when Wi-Fi (Wireless Fidelity) Direct, Miracast, screen casting, or the like is in an enabled state, which the terminal can determine from the OS settings.
  • Similarly, the information processing terminal 10 suppresses the display when it outputs over HDMI, for example via the MHL (Mobile High-definition Link) function with an adapter that converts a microUSB (Universal Serial Bus) terminal to HDMI (High-Definition Multimedia Interface) (registered trademark).
  • In other words, the information processing terminal 10 suppresses the balloon-type icon display when the screen is shown on a large screen rather than a small one. Since this prevents balloon-type icons from appearing on a large screen where they are unnecessary, it reduces annoyance to the user.
  • In the above embodiment, an example was described in which the information processing terminal 10 displays the balloon-type icon in the nearest empty area.
  • However, the present invention is not limited to this.
  • For example, the information processing terminal 10 can display the balloon-type icon in the widest empty area.
  • In that case, the size of the balloon-type icon can be changed according to the size of the empty area. Doing so keeps the balloon-type icon from becoming too small and thus ensures a certain level of visibility.
  • In the above embodiment, an icon was used as the example of display information, but display information is not limited to icons.
  • For example, the information processing terminal 10 can apply the same processing to editing images in an image folder or editing audio data in an audio folder.
  • The information processing terminal 10 can also recognize the image of a hand with the front camera installed in the terminal and determine which hand is operating based on whether the hand enters from the left or the right of the screen.
  • The indicator is likewise not limited to a finger; a touch pen or the like can be handled in the same way.
  • Each component of each apparatus shown in FIG. 3 and elsewhere need not be physically configured as illustrated; the components can be distributed or integrated in arbitrary units.
  • For example, the estimation unit 25 and the prediction unit 26 can be integrated.
  • All or any part of the processing functions performed by each apparatus may be realized by a CPU and a program analyzed and executed by that CPU, or realized as hardware by wired logic.
  • The processor 20 of the information processing terminal 10 operates as an information processing apparatus that executes the display control method by reading and executing a program stored on a hard disk or the like. That is, the processor 20 executes a program providing the same functions as the touch panel control unit 23, the selection receiving unit 24, the estimation unit 25, the prediction unit 26, and the display control unit 27, and can thereby operate as each of those units.
  • The program described in these embodiments is not limited to execution by the information processing terminal 10; the present invention can be applied similarly when another computer or server executes the program, or when they execute it in cooperation.
  • The program can be distributed via a network such as the Internet.
  • The program can also be recorded on a computer-readable recording medium such as a hard disk, a flexible disk (FD), a CD-ROM, an MO (Magneto-Optical disk), or a DVD (Digital Versatile Disc), and executed by being read from the recording medium by a computer.
  • 10 Information processing terminal
  • 20 Processor
  • 21 Direction pattern DB
  • 22 Search pattern DB
  • 23 Touch panel control unit
  • 24 Selection receiving unit
  • 25 Estimation unit
  • 26 Prediction unit
  • 27 Display control unit
  • 28 Vacancy detection unit
  • 29 Display output unit

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

In the present invention, an information processing terminal has a processor and a memory. The processor of the information processing terminal receives, via a touch panel and for the display information shown on a display unit, the selection of display information to be moved by an indicator. Next, the processor of the information processing terminal predicts the movement direction of the indicator based on the contact between the touch panel and the indicator. The processor of the information processing terminal then identifies a region where the display information and the indicator do not overlap, based on the predicted movement direction of the indicator, and displays substitute information relating to the display information in the identified region.
PCT/JP2016/057240 2016-03-08 2016-03-08 Display control device, display control method, and display control program Ceased WO2017154119A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2018503902A JPWO2017154119A1 (ja) 2016-03-08 2016-03-08 Display control device, display control method, and display control program
PCT/JP2016/057240 WO2017154119A1 (fr) 2016-03-08 2016-03-08 Display control device, display control method, and display control program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/057240 WO2017154119A1 (fr) 2016-03-08 2016-03-08 Display control device, display control method, and display control program

Publications (1)

Publication Number Publication Date
WO2017154119A1 (fr) 2017-09-14

Family

ID=59790211

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/057240 Ceased WO2017154119A1 (fr) 2016-03-08 2016-03-08 Display control device, display control method, and display control program

Country Status (2)

Country Link
JP (1) JPWO2017154119A1 (fr)
WO (1) WO2017154119A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011028560A (ja) * 2009-07-27 2011-02-10 Sony Corp 情報処理装置、表示方法及び表示プログラム
JP2013130979A (ja) * 2011-12-20 2013-07-04 Sharp Corp 情報処理装置、情報処理装置の制御方法、情報処理装置制御プログラムおよび該プログラムを記録したコンピュータ読み取り可能な記録媒体
US20140028557A1 (en) * 2011-05-16 2014-01-30 Panasonic Corporation Display device, display control method and display control program, and input device, input assistance method and program

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008176448A * 2007-01-17 2008-07-31 Casio Comput Co Ltd Image display device and image display method
JP2009193423A * 2008-02-15 2009-08-27 Panasonic Corp Input device for electronic equipment
JP2011081447A * 2009-10-02 2011-04-21 Seiko Instruments Inc Information processing method and information processing device
JP2015153197A * 2014-02-14 2015-08-24 Clinks Co Ltd Pointing position determination system
JP5969551B2 * 2014-07-22 2016-08-17 Nippon Telegraph and Telephone Corp Mobile terminal with multi-touch screen and operation method thereof

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011028560A * 2009-07-27 2011-02-10 Sony Corp Information processing device, display method, and display program
US20140028557A1 * 2011-05-16 2014-01-30 Panasonic Corporation Display device, display control method and display control program, and input device, input assistance method and program
JP2013130979A * 2011-12-20 2013-07-04 Sharp Corp Information processing device, method for controlling information processing device, control program, and computer-readable recording medium storing the program

Also Published As

Publication number Publication date
JPWO2017154119A1 (ja) 2019-01-10

Similar Documents

Publication Publication Date Title
US10627990B2 (en) Map information display device, map information display method, and map information display program
CN102934067B (zh) Information processing system, operation input device, information processing device, and information processing method
US10073493B2 Device and method for controlling a display panel
US20090066659A1 Computer system with touch screen and separate display screen
JP5620440B2 (ja) Display control device, display control method, and program
EP2560086B1 (fr) Method and apparatus for navigating content on a screen using a pointing device
AU2017203910B2 Glove touch detection
US20140176510A1 Input device, input assistance method and program
KR20140105691A (ko) Method for operating an object on a user device having a touch screen, and apparatus therefor
JP2014154055A (ja) Image processing device and image processing method
US20130100063A1 Touch panel device
JP2014041391A (ja) Touch panel device
US20110258555A1 Systems and methods for interface management
US20150355819A1 Information processing apparatus, input method, and recording medium
JP5949010B2 (ja) Input control device, input control program, and input control method
JP2015141526A (ja) Information processing apparatus, information processing method, and program
JP6411067B2 (ja) Information processing device and input method
JP5820414B2 (ja) Information processing apparatus and information processing method
JPWO2012124279A1 (ja) Input device
JP5675486B2 (ja) Input device and electronic apparatus
JP5834253B2 (ja) Image processing apparatus, image processing method, and image processing program
JP2013182463A (ja) Mobile terminal device, touch operation control method, and program
US9996215B2 Input device, display control method, and integrated circuit device
WO2017154119A1 (fr) Display control device, display control method, and display control program
JPWO2015114938A1 (ja) Information processing apparatus, information processing method, and program

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase (Ref document number: 2018503902; Country of ref document: JP)

NENP Non-entry into the national phase (Ref country code: DE)

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 16893451; Country of ref document: EP; Kind code of ref document: A1)

122 Ep: pct application non-entry in european phase (Ref document number: 16893451; Country of ref document: EP; Kind code of ref document: A1)