WO2014061095A1 - Information display device and method for controlling operations in an information display device - Google Patents
- Publication number: WO2014061095A1
- Application: PCT/JP2012/076677 (JP2012076677W)
- Authority: WO (WIPO/PCT)
- Prior art keywords: window, function, control unit, display device, information display
- Prior art date
- Legal status: Ceased
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3664—Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- the present invention relates to an information display device and an operation control method in the information display device.
- Patent Document 1 discloses a navigation device using a touch panel. Specifically, when two finger positions are detected at the same time on the touch panel, an operation frame that is smaller than the display screen and has a similar shape is displayed on the display screen based on the two detected finger positions. The operation frame is displayed in the vicinity of the detected finger position or a predetermined peripheral edge in the display screen regardless of the detected finger position.
- When the operation frame is displayed, it can be operated as if it were the entire display screen. For example, when the two finger positions are moved closer to each other within the operation frame, the map displayed on the display screen is reduced. That is, movement of the finger positions in the operation frame is converted into movement of finger positions on the entire display screen. At this time, the rate of change of the finger position relative to the operation frame is applied as the rate of change of the finger position relative to the entire display screen.
- the display of the entire display screen is controlled according to the operation performed in the operation frame. Further, the control amount of display in the entire display screen is determined according to the operation amount (that is, the movement amount of the finger position) performed in the operation frame. That is, the operation in the operation frame and the display on the entire display screen are linked. For this reason, it is considered that the operation frame can be applied only for limited purposes.
- An object of the present invention is to provide a technique capable of improving operability, in other words convenience, in the operation of display information.
- An information display device includes a display unit having a display surface, an input unit that receives a user operation, and a control unit.
- the control unit forms a window according to the formation range specified by the window opening operation.
- the control unit assigns different functions to be executed depending on a user operation on the window inner area which is an area inside the window and a user operation on the window outer area which is an area outside the window.
- According to this, the function to be executed is assigned differently between a user operation on the window inner area and a user operation on the window outer area. For example, by assigning to the window inner area a user operation that is difficult to perform efficiently in the window outer area, a function instruction by that user operation can be performed efficiently. Also, for example, while different functions assigned to the same user operation cannot be distinguished before the window is formed, the distinction between the window inner area and the window outer area allows those different functions to be instructed appropriately. As in these examples, high operability, in other words high convenience, can be provided.
- FIG. 1 illustrates a block diagram of an information display device 10 according to an embodiment.
- the information display device 10 includes a display unit 12, an input unit 14, a control unit 16, and a storage unit 18.
- Display unit 12 displays various information.
- the display unit 12 includes, for example, a display surface configured by arranging a plurality of pixels in a matrix, and a drive device that drives each pixel (in other words, controls the display state of each pixel) based on image data acquired from the control unit 16.
- the image displayed on the display unit 12 may be a still image, a moving image, or a combination of a still image and a moving image.
- the display unit 12 can be configured by a liquid crystal display device, for example.
- a display area of a display panel corresponds to the display surface
- a drive circuit externally attached to the display panel corresponds to the drive device.
- a part of the driver circuit may be incorporated in the display panel.
- alternatively, the display unit 12 can be configured by an electroluminescence (EL) display device, a plasma display device, or the like.
- the input unit 14 receives various information from the user.
- the input unit 14 includes, for example, a detection unit that detects an indicator used by the user for input, and a detection signal output unit that outputs the result detected by the detection unit to the control unit 16 as a detection signal.
- the input unit 14 is configured by a so-called contact-type touch panel
- the input unit 14 may be referred to as a “touch panel 14” below.
- the touch panel may be referred to as a “touch pad” or the like.
- here, it is assumed that the indicator used for input is the user's finger.
- the detection unit of the touch panel 14 provides an input surface on which the user places a fingertip, and detects the presence of a finger on the input surface by a sensor group provided for the input surface.
- an area where a finger can be detected by the sensor group corresponds to an input area where a user input can be received.
- the input area corresponds to an input surface of a two-dimensional area.
- the sensor group may be any of electrical, optical, mechanical, etc., or a combination thereof.
- Various position detection methods have been developed, and any of them may be adopted for the touch panel 14.
- a configuration capable of detecting the pressing force of the finger on the input surface may be employed.
- the position of the fingertip on the input surface can be specified from the combination of the output signals of each sensor.
- the identified position is expressed by coordinate data on coordinates set on the input surface, for example.
- when the finger moves, the coordinate data indicating the finger position changes, so the movement of the finger can be detected from a series of continuously acquired coordinate data.
- the finger position may be expressed by a method other than coordinates. That is, the coordinate data is an example of finger position data for expressing the position of the finger.
- the detection signal output unit of the touch panel 14 generates coordinate data indicating the finger position from the output signals of the sensors, and transmits the coordinate data to the control unit 16 as a detection signal.
- the conversion to coordinate data may be performed by the control unit 16.
- the detection signal output unit converts the output signal of each sensor into a signal in a format that can be acquired by the control unit 16, and transmits the obtained signal to the control unit 16 as a detection signal.
- in this embodiment, the input surface 34 of the touch panel 14 (see FIG. 1) and the display surface 32 of the display unit 12 (see FIG. 1) overlap; in other words, a structure in which the input surface 34 and the display surface 32 are integrated is illustrated. With such an integrated structure, the input/display unit 20 (see FIG. 1), more specifically the touch screen 20, is provided.
- with this structure, the input surface 34 and the display surface 32 appear as one to the user, giving the user the feeling of performing input operations directly on the display surface 32. For this reason, an intuitive operation environment is provided.
- the expression “the user operates the display surface 32” may be used.
- the control unit 16 performs various processes and controls in the information display device 10. For example, the control unit 16 analyzes information input from the touch panel 14, generates image data according to the analysis result, and outputs the image data to the display unit 12.
- the control unit 16 includes a central processing unit (for example, configured with one or a plurality of microprocessors) and a main storage unit (for example, one or a plurality of storage devices such as ROM, RAM, and flash memory).
- Various programs may be stored in the main storage unit of the control unit 16 in advance, or may be read from the storage unit 18 during execution and stored in the main storage unit.
- the main storage unit is used not only for storing programs but also for storing various data.
- the main storage unit provides a work area when the central processing unit executes the program.
- the main storage unit provides an image holding unit for writing an image to be displayed on the display unit 12.
- the image holding unit may be referred to as “video memory”, “graphic memory”, or the like.
- control unit 16 may be configured as hardware (for example, an arithmetic circuit configured to perform a specific calculation).
- the storage unit 18 stores various information.
- the storage unit 18 is provided as an auxiliary storage unit used by the control unit 16.
- the storage unit 18 can be configured using one or more storage devices such as a hard disk device, an optical disk, a rewritable and nonvolatile semiconductor memory, and the like.
- the information display device 10 may further include elements other than the above elements 12, 14, 16, and 18.
- an audio output unit that outputs auditory information
- a communication unit that performs wired or wireless communication with various devices
- a current position detection unit that detects the current position of the information display device 10 in conformance with, for example, GPS (Global Positioning System); one or more of these elements may be added.
- the audio output unit can output, for example, operation sounds, sound effects, guidance voices, and the like.
- the communication unit can be used for, for example, new acquisition and update of information stored in the storage unit 18. Further, the current position detection unit can be used for a navigation function, for example.
- the information display device 10 may be a portable or desktop information device.
- the information display device 10 may be applied to a navigation device or an audio / visual device mounted on a moving body such as an automobile.
- touch operation is an operation in which at least one fingertip is brought into contact with the input surface of the touch panel and the contacted finger is moved away from the input surface without being moved on the input surface.
- gesture operation is an operation in which at least one fingertip is brought into contact with the input surface, the contacted finger is moved on the input surface (in other words, slid), and then released from the input surface.
- the coordinate data detected by the touch operation (in other words, finger position data) is basically static.
- the coordinate data detected by the gesture operation changes with time and is dynamic. From such a series of changing coordinate data, the movement start point and movement end point of the finger on the input surface, the locus from the start point to the end point, the moving direction, the moving amount, the moving speed, the moving acceleration, etc. can be acquired.
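- As an illustration only (not part of the patent disclosure), the following Python sketch shows how such a series of coordinate data could be reduced to basic trajectory information; the function and field names are hypothetical.

```python
import math

def gesture_metrics(points):
    """Reduce a time-stamped coordinate series [(t, x, y), ...] to basic
    gesture metrics: start/end point, moving direction, amount, and speed."""
    (t0, x0, y0), (t1, x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    amount = math.hypot(dx, dy)              # total displacement
    duration = max(t1 - t0, 1e-6)            # guard against zero duration
    return {
        "start": (x0, y0),
        "end": (x1, y1),
        "direction_deg": math.degrees(math.atan2(dy, dx)),
        "amount": amount,
        "speed": amount / duration,          # mean moving speed
    }
```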
- FIG. 3 is a conceptual diagram illustrating a one-point touch operation (also simply referred to as “one-point touch”) as a first example of the touch operation.
- a top view of the input surface 34 is shown in the upper stage, and a side view or a sectional view of the input surface 34 is shown in the lower stage.
- touch points (in other words, finger detection points) are schematically indicated by black circles. Such an illustration technique is also used in the drawings described later. Note that a black circle may actually be displayed on the display surface.
- One-point touch can be classified into single tap, multi-tap and long press operations, for example.
- Single tap is an operation of tapping the input surface 34 once with a fingertip.
- a single tap is sometimes simply referred to as a “tap”.
- Multi-tap is an operation of repeating a tap a plurality of times.
- a double tap is a typical multi-tap.
- the long press is an operation for maintaining the point contact of the fingertip.
- FIG. 4 is a conceptual diagram illustrating a two-point touch operation (also simply referred to as “two-point touch”) as a second example of the touch operation.
- the two-point touch is basically the same as the one-point touch except that two fingers are used. For this reason, it is possible to perform each operation of tap, multi-tap, and long press, for example, by two-point touch.
- two fingers of one hand may be used, or one finger of the right hand and one finger of the left hand may be used.
- the positional relationship between the two fingers is not limited to the example in FIG.
- FIG. 5 is a conceptual diagram illustrating a drag operation (also simply referred to as “drag”) as a first example of the gesture operation.
- Dragging is an operation of shifting the fingertip while it is placed on the input surface 34.
- the moving direction and moving distance of the finger are not limited to the example in FIG.
- the movement start point of the finger is schematically shown by a black circle
- the movement end point of the finger is schematically shown by a black triangle
- the direction of movement of the finger is expressed by the direction of the triangle
- the trajectory is represented by a line connecting the black circle and the black triangle.
- Such an illustration technique is also used in the drawings described later. Note that the black circle, the black triangle, and the locus may be actually displayed on the display surface.
- FIG. 6 is a conceptual diagram illustrating a flick operation (also simply referred to as “flick”) as a second example of the gesture operation.
- the flick is an operation of quickly sweeping the fingertip across the input surface 34.
- the moving direction and moving distance of the finger are not limited to the example in FIG.
- in a flick, unlike a drag, the finger leaves the input surface 34 while still moving.
- since the touch panel 14 is a contact type, finger movement after leaving the input surface 34 is not detected in principle.
- a flick can be identified when the moving speed at the end of the trajectory is equal to or higher than a predetermined threshold (referred to as the “drag/flick identification threshold”).
- for a flick, the point where the finger would finally arrive after leaving the input surface 34 (more specifically, that arrival point projected onto the input surface 34) is estimated.
- this estimation process can be interpreted as a process of converting a flick into a virtual drag.
- the information display apparatus 10 treats the estimated point as the end point of finger movement.
- the estimation process may be executed by the touch panel 14 or may be executed by the control unit 16.
- the information display device 10 may be modified so that a point away from the input surface 34 is handled as an end point of finger movement.
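- As a hedged sketch of the identification and estimation just described, the fragment below compares the finger speed at the end of the trajectory with a drag/flick identification threshold and, for a flick, linearly extrapolates the end point as a “virtual drag”; the threshold value and the fixed coast time are assumptions, not values from the patent.

```python
DRAG_FLICK_THRESHOLD = 1.5   # assumed speed threshold [px/ms]
COAST_MS = 100.0             # assumed coasting time after leaving the surface

def classify_and_extend(points):
    """Identify drag vs. flick from the speed at the end of the trajectory
    and convert a flick into a virtual drag by extrapolating its end point."""
    (t0, x0, y0), (t1, x1, y1) = points[-2], points[-1]
    dt = max(t1 - t0, 1e-6)
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    speed = (vx ** 2 + vy ** 2) ** 0.5
    if speed < DRAG_FLICK_THRESHOLD:
        return "drag", (x1, y1)
    # assumed linear model: the finger "coasts" for a fixed time
    return "flick", (x1 + vx * COAST_MS, y1 + vy * COAST_MS)
```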
- FIG. 7 is a conceptual diagram illustrating a pinch out operation (also simply referred to as “pinch out”) as a third example of the gesture operation.
- Pinch out is an operation of moving two fingertips away on the input surface 34.
- Pinch out is also called “pinch open”.
- FIG. 7 illustrates the case where both two fingers are dragged.
- alternatively, as a fourth example illustrated in FIG. 8, it is also possible to pinch out by fixing one fingertip on the input surface 34 (in other words, one fingertip maintains the touch state) and dragging only the other fingertip. The method in FIG. 7 is referred to as the “two-point movement type”, and the method in FIG. 8 as the “one-point movement type”.
- FIG. 9 is a conceptual diagram illustrating a pinch-in operation (also simply referred to as “pinch-in”) as a fifth example of the gesture operation.
- Pinch-in is an operation of bringing two fingertips closer on the input surface 34.
- Pinch-in is also referred to as “pinch close”.
- FIG. 9 illustrates a two-point movement type pinch-in
- FIG. 10 illustrates a one-point movement type pinch-in as a sixth example of the gesture operation.
- pinch-out and pinch-in are collectively referred to as a “pinch operation” or “pinch”, and the direction of finger movement in the pinch is referred to as the “pinch direction”.
- when the pinch direction is the direction in which the two fingers move apart, the pinch operation is particularly referred to as pinch-out.
- when the pinch direction is the direction in which the two fingers approach each other, the pinch operation is particularly called pinch-in.
- in pinch-out and pinch-in, two fingers of one hand may be used, or one finger of the right hand and one finger of the left hand may be used. Further, the positional relationship, the moving direction, and the moving distance of the two fingers are not limited to the examples in FIGS. 7 to 10. Further, in the one-point movement type pinch-out and pinch-in, the finger to be dragged is not limited to the examples of FIGS. 8 and 10. It is also possible to pinch out and pinch in using flicks instead of drags.
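- For illustration, pinch-out and pinch-in could be distinguished by comparing the two-finger distance at the start and end of the gesture, as in the following sketch (all names are hypothetical):

```python
import math

def classify_pinch(f1_start, f1_end, f2_start, f2_end):
    """Classify a two-finger gesture from the change in fingertip distance."""
    d0 = math.dist(f1_start, f2_start)
    d1 = math.dist(f1_end, f2_end)
    if d1 > d0:
        return "pinch-out"   # fingers moved apart: enlargement direction
    if d1 < d0:
        return "pinch-in"    # fingers moved closer: reduction direction
    return "none"
```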
- Each user operation is associated with a specific function. Specifically, when a user operation is detected, a process associated with the user operation is executed by the control unit 16, thereby realizing a corresponding function. In view of this point, user operations can be classified based on functions to be realized.
- a double tap performed on an icon on the display surface 32 is associated with a function for executing a program or command associated with the icon.
- the double tap functions as an execution instruction operation.
- dragging performed on display information is associated with a slide function for sliding the display information.
- the drag operation functions as a slide operation.
- the slide function and the slide operation are also referred to as a scroll function and a scroll operation.
- the slide direction and the scroll direction differ by 180 °.
- the pinch-out and pinch-in performed on display information are associated with a function that changes the size (in other words, the scale) of the information display.
- pinch-out and pinch-in function as a display size change operation (may be referred to as a “display scale change operation”). More specifically, in the example of FIG. 12, pinch out corresponds to an enlargement operation, and pinch in corresponds to a reduction operation.
- rotation drag is associated with a function that rotates the information display.
- in the illustrated example, the two-point movement type rotation drag functions as a rotation operation.
- the associated function may be changed according to the number of fingers performing the rotation drag.
- a double tap may be assigned to a folder opening operation for opening a folder associated with an icon in addition to the above execution instruction operation.
- the drag may be assigned to a slide function and a drawing function.
- the execution instruction function for the icon may be associated with double tap, long press, and flick.
- a program or the like associated with the icon can be executed by any of double tap, long press and flick.
- the slide function may be associated with both dragging and flicking.
- the rotation function may be associated with both the two-point movement type rotation drag and the one-point movement type rotation drag.
- a gesture operation associated with a screen movement deformation type function may be expressed as “a screen movement deformation type function gesture operation”.
- the screen movement deformation type function associated with the gesture operation is a function of controlling (in other words, manipulating) display information on the display surface in a control direction set according to the gesture direction.
- the screen movement deformation type function includes, for example, a slide function, a display size change function, a rotation function, and a bird's eye view display function (more specifically, an elevation angle and depression angle change function).
- the slide function can be classified as a screen movement function. Further, if the rotation function is viewed from the viewpoint of angle movement, the rotation function can be classified as a screen movement function. Further, the display size changing function and the bird's eye view display function can be classified into screen deformation functions.
- in the slide function, a slide direction (that is, a control direction) is set according to the gesture direction (for example, a drag direction or a flick direction), and the display information is slid in the set slide direction.
- in the display size changing function, when the gesture direction (for example, the pinch direction) is the enlargement direction, the control direction is set to the enlargement direction, and when the gesture direction is the reduction direction, the control direction is set to the reduction direction. The display information size is then changed in the set control direction.
- in the rotation function, the control direction is set to the right rotation direction when the gesture direction (for example, the rotation direction of the rotation drag) is the right rotation direction, and to the left rotation direction when the gesture direction is the left rotation direction. The display information is then rotated in the set control direction.
- the screen movement / deformation function may control the display information by using not only the gesture direction but also the gesture amount (for example, the length of the gesture trajectory).
- the control amount (for example, the slide amount, the display size change amount, or the rotation amount) may be set larger as the gesture amount becomes larger.
- the screen movement deformation type function may control the display information using the gesture speed in addition to or instead of the gesture amount.
- the display information control speed (for example, the slide speed, the display size change speed, or the rotation speed) may be set higher as the gesture speed becomes higher.
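- The proportional mappings described above might be realized as follows; the gain constants are illustrative assumptions.

```python
SLIDE_GAIN = 1.0    # assumed: pixels of slide per pixel of drag
SPEED_GAIN = 1.0    # assumed: slide speed per unit of gesture speed

def control_amount(gesture_amount, gain=SLIDE_GAIN):
    """Set the control amount (slide amount, size change, rotation amount)
    larger as the gesture amount becomes larger."""
    return gain * gesture_amount

def control_speed(gesture_speed, gain=SPEED_GAIN):
    """Likewise, set the display information control speed higher as the
    gesture speed becomes higher."""
    return gain * gesture_speed
```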
- the non-moving deformation type function does not use the gesture direction for realizing the function even if it is associated with the gesture operation. For example, even if a flick to an icon is associated with an execution instruction function of a specific program, the function belongs to the non-moving deformation type. For example, when dragging is used for the drawing function and the handwritten character input function, only the trajectory corresponding to the dragging is displayed, and the display information is not controlled according to the dragging direction.
- FIG. 14 illustrates a block diagram of the control unit 16.
- the display unit 12, the input unit 14, and the storage unit 18 are also illustrated for explanation.
- the control unit 16 includes an input analysis unit 40, an overall control unit 42, a first image forming unit 44, a first image holding unit 46, a second image forming unit 48, a second image holding unit 50, an image composition unit 52, a composite image holding unit 54, and a window management unit 56.
- the input analysis unit 40 analyzes the user operation detected by the input unit 14 and identifies the user operation. Specifically, the input analysis unit 40 acquires coordinate data detected along with a user operation from the input unit 14, and acquires user operation information from the coordinate data.
- the user operation information is, for example, information such as the type of user operation, the start and end points of finger movement, the trajectory from the start point to the end point, the moving direction, the moving amount, the moving speed, and the moving acceleration.
- for example, a touch operation and a gesture operation can be identified by comparing the difference between the start point and the end point of finger movement with a predetermined threshold (referred to as the “touch/gesture identification threshold”). Further, as described above, a drag and a flick can be identified from the finger moving speed at the end of the trajectory.
- pinch out and pinch in can be identified from the moving direction.
- when two drags draw a circle while maintaining their mutual distance, it can be identified that a rotation drag has been performed.
- when a drag and a one-point touch are identified at the same time, it can be identified that the pinch-out, pinch-in, or rotation drag is of the one-point movement type.
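- A rotation drag could, for example, be recognized by checking that the two-finger distance stays roughly constant while the line connecting the fingers rotates, as in this assumed sketch (thresholds are illustrative, and angle wrap-around is ignored for brevity):

```python
import math

def is_rotation_drag(f1_points, f2_points, dist_tol=0.15, min_angle=15.0):
    """Identify a rotation drag: two trajectories that keep their mutual
    distance (within dist_tol) while the connecting line rotates by at
    least min_angle degrees."""
    d0 = math.dist(f1_points[0], f2_points[0])
    d1 = math.dist(f1_points[-1], f2_points[-1])
    if d0 == 0 or abs(d1 - d0) / d0 > dist_tol:
        return False
    a0 = math.atan2(f2_points[0][1] - f1_points[0][1],
                    f2_points[0][0] - f1_points[0][0])
    a1 = math.atan2(f2_points[-1][1] - f1_points[-1][1],
                    f2_points[-1][0] - f1_points[-1][0])
    return abs(math.degrees(a1 - a0)) >= min_angle
```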
- the overall control unit 42 performs various processes in the control unit 16. For example, the overall control unit 42 associates a position on the input surface of the input unit 14 with a position on the display surface of the display unit 12. According to this, the touch position in the touch operation, the gesture locus in the gesture operation, and the like are associated on the display surface. With such association, it is possible to identify the position on the display surface where the user operation is intended. Such association can be realized by a so-called graphical user interface (GUI) technique.
- the overall control unit 42 identifies a function desired by the user, that is, a user instruction based on, for example, user operation information and function identification information.
- the function identification information is, for example, information in which an association between a user operation and a function to be executed is defined through the operation status information.
- the operation status information includes, for example, information such as the usage status of the information display device 10 (in other words, the usage mode), the operation target on which the user operation has been performed, and the types of user operations that can be accepted according to the usage status and the operation target.
- in a certain usage situation, for example, a drag is identified as instructing execution of the slide function.
- in another situation, a tap is identified as instructing execution of the display size enlargement function. For example, if no function is associated with a flick on the enlarged icon, it is determined that the flick is an invalid operation.
- the overall control unit 42 controls the display information on the display surface by controlling the first image forming unit 44, the second image forming unit 48, and the image composition unit 52.
- the display information may be changed based on the identification result of a user instruction, or based on the progress of program execution regardless of the identification result of a user instruction.
- the overall control unit 42 performs general control on the other functional units 40, 44, 46, 48, 50, 52, 54, and 56, for example, adjustment of execution timing.
- the first image forming unit 44 reads the first information 60 indicated by the instruction from the overall control unit 42 out of the storage unit 18, forms a first image from the first information 60, and stores the first image in the first image holding unit 46.
- similarly, the second image forming unit 48 reads the second information 62 indicated by the instruction from the overall control unit 42 out of the storage unit 18, forms a second image from the second information 62, and stores the second image in the second image holding unit 50.
- the image composition unit 52 reads the first image from the first image holding unit 46, reads the second image from the second image holding unit 50, and combines the first image and the second image.
- the synthesized image is stored in the synthesized image holding unit 54.
- the image composition is performed so that the first image and the second image are displayed in an overlapping manner.
- here, a case where the first image is the lower image (in other words, the lower layer) and the second image is the upper image (in other words, the upper layer) is illustrated.
- here, “up” and “down” mean up and down along the normal direction of the display surface, and the side closer to the user viewing the display surface is expressed as “up”.
- image data is superimposed based on such a concept.
- the lower image is displayed in the transparent portion of the upper image.
- the drawing portion of the upper image hides the lower image.
- by adjusting the transparency, a composite image in which the lower image shows through the upper image can be formed.
- the setting of which of the first image and the second image is the upper image may be unchangeable or may be changeable.
- the composite image stored in the composite image holding unit 54 is transferred to the display unit 12 and displayed on the display unit 12.
- the display screen changes when the composite image is updated, that is, when at least one of the first image and the second image is updated.
- the window management unit 56 manages the windows formed on the display surface under the control of the overall control unit 42. Specifically, the window management unit 56 manages information such as the window formation range (position, shape, etc.) and display attributes (whether or not the window is given a modification, the type of modification, etc.), and manages the window by controlling the image composition unit 52 based on this window management information.
- FIG. 15 illustrates a processing flow S10 up to window formation.
- in step S11, the input unit 14 receives a user operation, and in step S12, the control unit 16 identifies the input user operation.
- in step S13, the control unit 16 determines, based on the identification result in step S12, whether or not the input user operation is a window opening operation defined in advance.
- if it is not a window opening operation, the control unit 16 executes the function associated with the input user operation in step S14. Thereafter, the processing of the information display device 10 returns to step S11.
- if it is a window opening operation, a mode for changing the assignment of the functions executed by user operations inside and outside the window (hereinafter referred to as the “special operation mode”) is turned on in step S15. That is, in the special operation mode, at least some of the types of functions that can be instructed from the window inner area, which is the area inside the window, differ from the types of functions that can be instructed from the window outer area, which is the area outside the window.
- the functions assigned to the window inner area and the functions assigned to the window outer area are recorded in advance, and the recorded information (in other words, assignment information) is stored in the main storage unit of the control unit 16 (see FIG. 14) or in the storage unit 18 (see FIG. 14).
- the special operation mode is turned on. The function assignment in the special operation mode will be described in detail later.
- the window opening operation is an operation for instructing to form a window on the display surface and an operation for designating a window forming range. Examples of window opening operations are shown in FIGS.
- the long press operation of two-point touch is assigned to the window opening operation.
- the window 80 is formed in a rectangular range having the two touched points as diagonally opposite vertices. That is, the user designates the formation range of the window 80 by the two touch points. Note that the positional relationship between the two touched points is not limited to the example of FIG. 16.
- in FIG. 16, in order to make the window 80 easy to recognize in the drawing, the area inside the window 80 is shown with sand hatching. This sand hatching is given only for describing the embodiment, and does not limit the design of the window 80 or the like. Such sand hatching may also be used in the later-described drawings.
- when the control unit 16 identifies in step S12 that the two-point touch state has been maintained for a predetermined time (referred to as the “window opening instruction time”), it determines in step S13 that a window opening operation has been input. Then the control unit 16 forms the window 80 on the display surface 32 according to the range designated by the window opening operation. For example, the control unit 16 associates the two touched points with the coordinate system of the display surface 32 and adopts the two points on the display surface 32 as diagonally opposite vertices to form the rectangular window 80.
- the formation range of the window 80 may be obtained on the coordinate system of the input surface 34, and the obtained range may be associated with the coordinate system of the display surface 32.
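- As a minimal sketch of this FIG. 16 style window formation (names are hypothetical), the rectangular range can be derived from the two touched points as follows:

```python
def window_from_two_touches(p1, p2):
    """Form the rectangular window range whose diagonally opposite
    vertices are the two touched points."""
    (x1, y1), (x2, y2) = p1, p2
    left, top = min(x1, x2), min(y1, y2)
    right, bottom = max(x1, x2), max(y1, y2)
    return (left, top, right - left, bottom - top)  # x, y, width, height
```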
- in the example of FIG. 17, a combination of a one-point touch and a drag that surrounds an arbitrary range starting from the one-point touched point or its vicinity (referred to as a “surrounding drag” or “surrounding gesture”) is assigned to the window opening operation.
- for the one-point touch, any of single tap, multi-tap, and long press can be adopted.
- the end of the drag may be a flick.
- the vicinity of the one-point touch point refers to a range of a predetermined distance from the one-point touch point, for example.
- the user designates the formation range of the window 80 by the range surrounded by the surrounding drag or by the combination of the one-point touch point and the range surrounded by the surrounding drag.
- the direction of the surrounding drag is not limited to the example of FIG.
- when the control unit 16 identifies in step S12 that the one-point touch and the surrounding drag have been performed continuously, it determines in step S13 that a window opening operation has been input.
- the condition “continuously” includes a condition that the one-point touch and the surrounding drag are performed within a predetermined operation time interval, and a condition that no other operation is performed in between.
- the control unit 16 then forms the window 80 on the display surface 32 according to the range designated by the window opening operation.
- for example, the control unit 16 associates the surrounding drag locus 70 with the coordinate system of the display surface 32, converts the range surrounded by the locus 70 into a rectangular range on the coordinate system of the display surface 32 according to a predetermined conversion rule, and forms the window 80 in the converted rectangular range.
- as the conversion rule, for example, a rule of obtaining the maximum rectangle contained in the range surrounded by the locus 70 can be adopted.
- the window formation range may be determined so that the point of the one-point touch performed at the beginning of the window opening operation becomes one vertex of the window 80.
- the formation range of the window 80 may be obtained on the coordinate system of the input surface 34, and the obtained range may be associated with the coordinate system of the display surface 32.
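- The conversion rule is a design choice. As a simplified stand-in for the “maximum rectangle contained in the locus” rule mentioned above, the sketch below merely takes the axis-aligned bounding box of the locus; this is an assumption for illustration, not the rule itself.

```python
def window_from_locus(locus):
    """Convert a surrounding-drag locus [(x, y), ...] into a rectangular
    window range. NOTE: a bounding box is used here as a simplification;
    the patent mentions the maximum rectangle contained in the locus."""
    xs = [x for x, _ in locus]
    ys = [y for _, y in locus]
    return (min(xs), min(ys), max(xs) - min(xs), max(ys) - min(ys))
```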
- the window 80 can be formed at the execution position of the window opening operation.
- the window opening operation itself is a natural operation imagining the window 80 to be formed. For this reason, the window 80 can be formed intuitively and high operability can be realized.
- since the window 80 is formed at the execution position of the window opening operation, the information in the window 80 can be viewed without a large movement of the viewpoint. For this reason, the user's cognitive load can be kept small.
- window opening operation is not limited to the examples of FIGS. 16 and 17.
- Various user operations or combinations thereof can be pre-assigned as window opening operations.
- the window 80 is drawn with a simple thick frame in order to avoid complication of the drawings.
- the design of the window 80 is not limited to this.
- for example, the shadow modification exemplified in FIG. 18 or the depression modification exemplified in FIG. 19 may be employed.
- with the shadow modification, it is possible to give the impression that the window portion is positioned above its surroundings.
- with the depression modification, it is possible to give the impression that the window portion is positioned below its surroundings.
- the window 80 is not limited to a quadrangle.
- the setting of the modification of the window 80 may be fixed to one type, or may be selected according to the information displayed in the window 80. Alternatively, the user may be able to set and change it. The same applies to the shape of the window 80.
- the formation range of the window 80 set according to the window opening operation is managed by the window management unit 56 as described above. More specifically, in the example of FIG. 14, when the overall control unit 42 detects the input of a window opening operation, it determines window management information, including the formation range and display attributes of the window 80, according to the window opening operation, and records the determined information in the window management unit 56. Here, for the display attributes of the window 80, the setting values effective at that time (for example, initial setting values) are applied. Alternatively, the overall control unit 42 may record only the formation range of the window 80 in the window management unit 56, and the window management unit 56 may add the display attributes accordingly.
- the window management unit 56 controls the synthesis of the first image and the second image in the image synthesis unit 52 based on the window management information stored in itself.
- an example of the control will be described with reference to FIGS. 21 and 22. In the examples of FIGS. 21 and 22, it is assumed that the lower layer is the first image and the upper layer is the second image.
- in the example of FIG. 21, under the control of the window management unit 56, the image composition unit 52 reads the second image stored in the second image holding unit 50, excluding the portion corresponding to the formation range of the window 80.
- on the other hand, the image composition unit 52 reads the first image stored in the first image holding unit 46, including the portion corresponding to the formation range of the window 80. Then, the image composition unit 52 sets the read first image and second image as the lower layer and the upper layer, respectively, and composes both images.
- as a result, the lower-layer first image is displayed in the window inner area 82, which is the area inside the window 80, and the upper-layer second image is displayed in the window outer area 84, which is the area outside the window 80.
- the transparency of the upper layer is set to 0%.
- alternatively, the image composition unit 52 may read the second image constituting the upper layer including the portion corresponding to the formation range of the window 80. In this case, under the control of the window management unit 56, the image composition unit 52 sets the transparency of the portion corresponding to the formation range of the window 80 in the read second image to 100% and composes it with the first image.
- in the example of FIG. 22, under the control of the window management unit 56, the image composition unit 52 reads the portion of the second image stored in the second image holding unit 50 that corresponds to the formation range of the window 80. On the other hand, the image composition unit 52 reads the first image stored in the first image holding unit 46, including the portion corresponding to the formation range of the window 80. Then, the image composition unit 52 sets the read first image and second image as the lower layer and the upper layer, respectively, and composes both images.
- the second image of the upper layer is displayed in the window inner area 82, and the first image of the lower layer is displayed in the window outer area 84.
- the transparency of the upper layer is set to 0%.
- alternatively, the image composition unit 52 may read the second image constituting the upper layer including parts other than the portion corresponding to the formation range of the window 80. In this case, under the control of the window management unit 56, the image composition unit 52 sets the transparency of the read second image to 100% except for the portion corresponding to the formation range of the window 80, and composes it with the first image. A code sketch of this composition is given after the display examples below.
- under the control of the window management unit 56, the image composition unit 52 applies the modification of the window 80 as necessary, for example after composing the upper layer and the lower layer.
- in FIGS. 21 and 22, an example is shown in which a map (more specifically, a map during navigation) is displayed in the window outer area 84. Further, FIGS. 16 and 17 give an example in which the display of the window outer area does not change before and after the formation of the window 80. However, the display is not limited to these examples.
- the display in the window inner area 82 may or may not be related to the display information of the window outer area 84. Further, the window inner area 82 may be in a state where specific information is not displayed (for example, the entire area is displayed in white).
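- The following is a minimal sketch, under assumed data structures, of the layer composition of FIGS. 21 and 22: images are 2-D pixel grids of equal size, None means a fully transparent pixel, and window_on_top selects whether the upper layer shows inside the window (FIG. 22) or outside it (FIG. 21).

```python
def compose(first_img, second_img, window_rect, window_on_top=False):
    """Compose the lower-layer first image and the upper-layer second image
    so that one layer shows inside the window and the other outside it."""
    x, y, w, h = window_rect
    out = []
    for row in range(len(first_img)):
        out_row = []
        for col in range(len(first_img[0])):
            inside = x <= col < x + w and y <= row < y + h
            use_upper = inside if window_on_top else not inside
            px = second_img[row][col] if use_upper else first_img[row][col]
            if px is None:                    # 100% transparent upper pixel
                px = first_img[row][col]      # lower layer shows through
            out_row.append(px)
        out.append(out_row)
    return out
```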
- FIG. 23 illustrates a processing flow S30 after the window is formed.
- steps S31 and S32 are the same as steps S11 and S12 of FIG. That is, in step S31, the input unit 14 receives a user operation, and in step S32, the control unit 16 identifies the input user operation.
- in step S33, the control unit 16 identifies whether the user operation received in step S31 was performed on the window outer area 84, the window inner area 82, or the window 80 itself. At that time, whether or not the input position of the user operation is related to the window 80 can be identified by referring to the management information of the window management unit 56 (see FIG. 14).
- when it is identified that the user operation has been performed on the window outer area 84, in step S34 the control unit 16 executes the function associated with both that identification result and the type of the input user operation.
- likewise, when it is identified that the user operation has been performed on the window inner area 82, in step S35 the control unit 16 executes the function associated with both that identification result and the type of the input user operation.
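- Steps S33 to S35 amount to a dispatch on both the identified region and the operation type. A minimal sketch with an assumed assignment table:

```python
# Assumed assignment table for the special operation mode: the same user
# operation maps to different functions inside and outside the window.
ASSIGNMENT = {
    ("outer", "drag"):  "scroll_map",
    ("inner", "pinch"): "zoom_map",
}

def dispatch(region, operation):
    """Pick the function from the region identification result ("inner" or
    "outer") together with the type of the input user operation."""
    return ASSIGNMENT.get((region, operation), "invalid_operation")
```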
- FIG. 24 illustrates a first example of function assignment.
- FIG. 24 assumes a usage situation in which a map is displayed in the window outer area 84 and scrolling (in other words, sliding) and enlargement/reduction of the map are instructed.
- the scroll function is assigned to the window outer area 84, and the enlargement / reduction function is assigned to the window inner area 82. Therefore, the type of function assigned to the window outer area 84 is different from the type of function assigned to the window inner area 82.
- the scroll function is associated with dragging in the window outer area 84 and is not associated with any user operation in the window inner area 82. Therefore, when dragging is performed in the window outer area 84, the map displayed in the window outer area 84 is scrolled according to the drag direction and the drag amount. On the other hand, the scroll function cannot be executed from the window inner area 82.
- the enlargement/reduction function is not associated with any user operation in the window outer area 84, and is associated with a pinch operation in the window inner area 82. Accordingly, when a pinch-out is performed in the window inner area 82, the map displayed in the window outer area 84 is enlarged according to the pinch-out amount. Conversely, when a pinch-in is performed in the window inner area 82, the map displayed in the window outer area 84 is reduced according to the pinch-in amount. On the other hand, the enlargement/reduction function cannot be executed from the window outer area 84.
- a scroll function and an enlargement / reduction function are assigned to the window outer area 84, and a scroll function is assigned to the window inner area 82. Therefore, the type of function assigned to the window outer area 84 is different from the type of function assigned to the window inner area 82.
- the scroll function is associated with dragging in the window outer area 84, and with touching in the window inner area 82.
- when the window inner area 82 is touched, the map scrolls in a preset direction (for example, the most recent scroll direction).
- the enlargement / reduction function is associated with the pinch operation in the window outer area 84 and is not associated with any user operation in the window inner area 82.
- a scroll function and an enlargement / reduction function are assigned to the window outer area 84, and a scroll function having a larger scroll amount is assigned to the window inner area 82. . Therefore, the type of function assigned to the window outer area 84 is different from the type of function assigned to the window inner area 82.
- the scroll function instructed from the window outer area 84 is associated with dragging.
- the scroll function instructed from the window inner area 82 is also associated with dragging.
- however, in the window inner area 82, the ratio of the scroll amount to the drag amount is set larger. That is, for the same drag amount, a larger (in other words, longer) scroll can be executed by dragging in the window inner area 82.
- here, functions are regarded as different if the setting of the ratio of the control amount (here, the scroll amount) to the input operation amount (here, the drag amount) is different.
- the enlargement / reduction function is associated with the pinch operation in the window outer area 84 and is not associated with any user operation in the window inner area 82.
- different functions are assigned to the window inner area 82 and the window outer area 84 for the same operation of dragging.
- a plurality of functions can be instructed by dragging, which is convenient for the user.
- the flick in the window outer area 84 is assigned to the display information scroll function of the window outer area 84.
- a flick in the window inner area 82 is assigned to a function for changing the item to be focused in the display information of the window outer area 84 (for example, each time a flick is performed, a plurality of convenience stores displayed on the map are focused one after another).
- a scroll function and an enlargement / reduction function are assigned to the window outer area 84, and a scroll function is assigned to the window inner area 82. Therefore, the type of function assigned to the window outer area 84 is different from the type of function assigned to the window inner area 82.
- the scroll function and the enlargement / reduction function instructed from the window outer area 84 are associated with the drag operation and the pinch operation, respectively.
- in the window inner area 82, on the other hand, the scroll function is associated with an operation on a function icon (here, touch is exemplified).
- the control unit 16 displays a function icon (referred to as a scroll icon) for scroll operation in the window 80.
- FIG. 25 illustrates a scroll icon 110.
- in FIG. 25, the scroll icon 110 is displayed over the entire window 80, but the present invention is not limited to this example.
- the scroll icon 110 has eight operation units 110a to 110h respectively assigned to eight scroll directions in increments of 45 °.
- each of the operation units 110a to 110h is drawn with a design in which the apex of a triangle points in its scroll direction.
- note that the number of operation units (that is, the number of scroll directions), the design of the scroll icon, and the like are not limited to this example.
- in the window inner area 82, when the operation unit for the desired scroll direction among the operation units 110a to 110h of the scroll icon 110 is touched, the map is scrolled in the direction specified by the touch. Further, the scrolling continues according to the duration of the touch.
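- For illustration, the eight operation units could be mapped to scroll vectors in 45° increments as sketched below; the concrete angles and step size are assumptions.

```python
import math

# Operation units 110a-110h assigned to eight directions in 45-degree steps
OPERATION_UNIT_ANGLES = {f"110{c}": i * 45 for i, c in enumerate("abcdefgh")}

def scroll_vector(unit_id, step=10.0):
    """Return the per-tick scroll vector for a touched operation unit;
    scrolling repeats while the touch is maintained."""
    angle = math.radians(OPERATION_UNIT_ANGLES[unit_id])
    return (step * math.cos(angle), step * math.sin(angle))
```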
- the scroll icon 110 is displayed on the window 80 in the window forming step S15 (see FIG. 15).
- the icon display is managed by the window management unit 56 (see FIG. 14) under the control of the overall control unit 42 (see FIG. 14).
- specifically, the window management unit 56 instructs the second image forming unit 48 to read the image data of the scroll icon 110 from the storage unit 18, to form an image of the scroll icon 110 in a size corresponding to the size of the display surface, to draw the formed scroll icon image on a transparent plane in accordance with the display position, and to store the result in the second image holding unit 50.
- to erase the scroll icon 110, the window management unit 56 causes the second image forming unit 48 to store an image without the scroll icon image in the second image holding unit 50, for example. Further, the window management unit 56 instructs the image composition unit 52 to compose the images in the image holding units 46 and 50.
- a scroll function is assigned to the window outer area 84, and a scroll function and an enlargement / reduction function are assigned to the window inner area 82. Therefore, the type of function assigned to the window outer area 84 is different from the type of function assigned to the window inner area 82.
- the scroll function instructed from the window outer area 84 is associated with dragging. Further, the scroll function instructed from the window inner area 82 is associated with the touch of the scroll icon 110. In the window inner area 82, the enlargement / reduction function is associated with an operation on the function icon (here, touch is exemplified).
- for this purpose, in the window forming step S15 (see FIG. 15), the control unit 16 displays a function icon for the enlargement/reduction operation (referred to as an “enlargement/reduction icon” or “display size change icon”) in the window 80.
- FIG. 26 illustrates the enlargement / reduction icon 112.
- in FIG. 26, the enlargement/reduction icon 112 is displayed over the entire window 80, but the present invention is not limited to this example.
- the enlargement / reduction icon 112 includes an operation unit 112a for enlargement operation and an operation unit 112b for reduction operation.
- the enlargement / reduction icon 112 can be displayed in the same manner as the scroll icon 110 (see FIG. 25).
- the design of the enlargement / reduction icon, etc., is not limited to this example.
- FIG. 27 illustrates an icon 114 that combines the scroll icon 110 of FIG. 25 and the enlargement/reduction icon 112 of FIG. 26. In the window forming step S15 (see FIG. 15), this combined icon 114 may be displayed.
- a scroll function and an enlargement / reduction function are assigned to the window outer area 84, and a scroll function is assigned to the window inner area 82. Therefore, the type of function assigned to the window outer area 84 is different from the type of function assigned to the window inner area 82.
- the scroll function and the enlargement / reduction function instructed from the window outer area 84 are associated with the touch of the scroll icon 110 and the enlargement / reduction icon 112, respectively.
- the scroll function instructed from the window inner area 82 is associated with dragging.
- a scroll function and an enlargement / reduction function are assigned to the window outer area 84, and a high-speed scroll function is assigned to the window inner area 82. Therefore, the type of function assigned to the window outer area 84 is different from the type of function assigned to the window inner area 82.
- the scroll function and the enlargement / reduction function instructed from the window outer area 84 are associated with the touch of the scroll icon 110 and the enlargement / reduction icon 112, respectively.
- the high speed scroll function instructed from the window inner area 82 is associated with the touch of the scroll icon 110. Note that the design of the icons may be different between the normal scroll icon and the high-speed scroll icon.
- the high-speed scroll function for the window inner area 82 has a larger scroll amount per unit time than the normal scroll function for the window outer area 84. In other words, for the same touch time, a larger (in other words, longer) scroll can be executed from the window inner area 82. Thus, in the example (vii), functions are regarded as different if the control amount per unit time (here, the scroll speed) is set differently.
- representative (in other words, frequently used) functions such as the scroll function and the enlargement/reduction function are assigned to icon operations, and providing those icons in the window inner area 82 is more efficient.
- in this way, frequently used icons for the scroll function and the enlargement/reduction function can be arranged at the user's preferred position in the desired size.
- high convenience can be provided by shortening the operation time.
- since the scroll function and the enlargement/reduction function are often used for operations other than map operations, arranging these function icons in the window inner area 82 is also highly versatile.
- FIG. 28 shows a second example of function assignment.
- a map is displayed in the window outer area 84, and a usage situation is assumed in which a destination is determined while viewing the map.
- a scroll function and a destination determination function are assigned to the window outer area 84, and an enlargement / reduction function is assigned to the window inner area 82. Therefore, the type of function assigned to the window outer area 84 is different from the type of function assigned to the window inner area 82.
- the scroll function is associated with dragging in the window outer area 84.
- the destination determination function is associated with an operation of touching a point desired to be determined on the map displayed in the window outer area 84.
- the enlargement / reduction function is associated with a pinch operation in the window inner area 82.
- FIG. 29 shows a third example of function assignment.
- FIG. 29 assumes a usage situation in which a search keyword (a destination name or the like) is input.
- a character input function by handwritten character recognition is assigned to the window inner area 82, and a character input function by a software keyboard is assigned to the window outer area 84. According to this, the user can perform character input in his or her preferred manner.
- a map may be displayed on the background of the software keyboard and the window 80.
- in FIG. 29, both the software keyboard and the handwriting input window 80 are displayed, but only one of them may be displayed selectively. For example, when the character input mode is turned on by a predetermined operation, only the software keyboard is displayed. When the window opening operation is performed while the software keyboard is displayed, the window 80 providing the handwriting input area is displayed and the software keyboard is erased. When the window 80 is erased, the software keyboard is displayed again. According to this, it is possible to avoid display information being hidden by both the software keyboard and the window 80 at the same time. Further, if characters are input using the window 80, which requires a relatively small area, the proportion of display information that is hidden can be reduced.
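- The selective display described above behaves like a small state machine. Below is a hedged sketch of that toggle logic (the class and method names are hypothetical; only the visibility behavior comes from the text):

```python
class CharacterInputMode:
    """Keeps the software keyboard and the handwriting window 80 mutually
    exclusive, as in the selective-display variation described above."""

    def __init__(self) -> None:
        self.keyboard_visible = False
        self.handwriting_window_visible = False

    def turn_on_input_mode(self) -> None:
        # The predetermined operation turns the mode on: keyboard only.
        self.keyboard_visible = True
        self.handwriting_window_visible = False

    def on_window_open_operation(self) -> None:
        # Opening the window 80 swaps the keyboard for the handwriting area.
        if self.keyboard_visible:
            self.keyboard_visible = False
            self.handwriting_window_visible = True

    def on_window_erased(self) -> None:
        # Erasing the window 80 brings the keyboard back.
        if self.handwriting_window_visible:
            self.handwriting_window_visible = False
            self.keyboard_visible = True
```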
- as described above, the function allocation differs between the window inner area 82 and the window outer area 84. For example, by assigning to the window inner area 82 a user operation that is difficult to perform efficiently in the window outer area 84, a function instruction by that user operation can be performed efficiently (see example (i) in FIG. 24).
- the function assignment of the information display device 10 under various usage situations has been illustrated through the examples of FIGS. 24, 28, and 29. That is, the assignment of functions to the window inner area 82 and the window outer area 84 can be changed according to the usage situation of the information display device 10. By assigning appropriate functions according to the usage situation, in other words, by assigning functions from the viewpoint of easy understanding for the user, quick operation, and so on, the operation time for realizing the user's intention can be shortened. As a result, high convenience can be provided.
- the user operations are not limited to the examples in the drawings described above.
- the functions include rotation (see FIG. 13), centering, item selection, item determination, range designation, search, narrowing down, and the like.
- user operations and functions can be associated in various ways.
- the user operation target is not limited to a map; it may also be other graphics, photographs, files (for example, music files), character strings (for example, playlists of music files), and the like.
- when it is identified in step S33 that the user operation is a window control operation for controlling the window 80 itself, the control unit 16 controls the window 80 in step S36 according to the control content assigned to the input window control operation. According to this, as exemplified below, the position and size of the window 80 can be controlled after the window 80 is formed, and control for erasing the window 80 can be performed by a gesture operation.
- the window control operation is an operation for moving the window 80, for example.
- FIG. 30 shows a conceptual diagram of the window moving operation. According to the example of FIG. 30, by performing a drag operation in a state where a predetermined portion (for example, a frame portion of the window) of the window 80 is touched, the window 80 moves in the drag direction. According to this, since the drag operation is highly similar to an operation of moving an object on a desk in daily life, the window 80 can be moved intuitively. For this reason, high operability can be realized.
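- A sketch of this move behavior (the frame hit-test width and the names are assumptions; the text only requires that a predetermined portion such as the frame be touched):

```python
from dataclasses import dataclass

@dataclass
class Window:
    """Mutable window geometry used by the window control sketches below."""
    x: float
    y: float
    w: float
    h: float

def is_on_frame(win: Window, px: float, py: float, border: float = 8.0) -> bool:
    """True when (px, py) lies on the window frame: within `border` pixels
    of the boundary, but not deep inside the window."""
    inside_outer = (win.x - border <= px <= win.x + win.w + border
                    and win.y - border <= py <= win.y + win.h + border)
    inside_inner = (win.x + border <= px <= win.x + win.w - border
                    and win.y + border <= py <= win.y + win.h - border)
    return inside_outer and not inside_inner

def drag_window(win: Window, dx: float, dy: float) -> None:
    """FIG. 30: with the frame touched, the window follows the drag delta."""
    win.x += dx
    win.y += dy
```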
- the window control operation is an operation for changing the size of the window 80, for example.
- FIGS. 31 to 33 show conceptual diagrams of such a window size changing operation.
- the size of the window 80 is changed by a one-point moving type pinch operation performed while a predetermined portion of the window 80 (for example, a frame portion of the window) is touched. If the operation is a pinch-out, the window 80 is enlarged (see FIGS. 31 to 33); if it is a pinch-in, the window 80 is reduced. That is, whether to enlarge or reduce is instructed by the pinch direction.
- the window size changing operation has high similarity and continuity with the display size changing operation for the display information inside and outside the window. As a result, the user is prevented from getting lost in the operation, and the operation time is shortened. For this reason, high operability can be realized.
- the enlargement direction and the reduction direction, that is, the deformation direction, are indicated by the pinch direction.
- when pinching out is performed in the left direction, the window 80 extends to the left.
- when pinching out is performed in the downward direction, the window 80 extends downward.
- when pinching out is performed diagonally down and to the left, the window 80 extends in both the left and downward directions. According to this, the deformation direction of the window 80 can be instructed intuitively and easily.
- FIGS. 31 and 32 illustrate the case where the starting point of the moving finger in the one-point moving type pinch operation is on the frame portion of the window 80, but the operation is not limited to this example. That is, as shown in FIG. 33, the starting point of the finger movement may be placed inside the window 80. This is possible because the fixed finger in the one-point moving type pinch operation is on the frame portion of the window 80, which allows the window size changing operation to be distinguished from operations on the display information in the window 80.
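- Reusing the `Window` type from the move sketch above, the following illustrates one possible geometry for the one-point moving pinch resize (an assumption about how the edges follow the finger, not the embodiment's exact rule): the edge nearest the moving finger's start point follows that finger, so a pinch-out enlarges and a pinch-in reduces, in the pinch direction.

```python
def resize_by_pinch(win: "Window",
                    start: tuple[float, float],
                    end: tuple[float, float]) -> None:
    """One-point moving pinch: one finger stays fixed on the frame while the
    other moves from `start` to `end`. Clamping to a minimum size is omitted."""
    (x0, y0), (x1, y1) = start, end
    dx, dy = x1 - x0, y1 - y0
    # Horizontal component: move whichever vertical edge the finger started
    # closer to, so the window deforms in the pinch direction.
    if abs(x0 - win.x) <= abs(x0 - (win.x + win.w)):
        win.x += dx          # left edge follows the finger
        win.w -= dx
    else:
        win.w += dx          # right edge follows the finger
    # Vertical component: the same rule for the horizontal edges.
    if abs(y0 - win.y) <= abs(y0 - (win.y + win.h)):
        win.y += dy          # top edge follows the finger
        win.h -= dy
    else:
        win.h += dy          # bottom edge follows the finger
```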
- the window control operation is, for example, an operation for deleting the window 80, in other words, an operation for ending the display of the window 80.
- FIGS. 34 to 36 show conceptual diagrams of such window erasing operations.
- according to the example of FIG. 34, the window 80 is erased by a flick operation. Since the flick operation is highly similar to the everyday action of flicking objects off a desk and out of view, the window 80 can be erased intuitively. For this reason, high operability can be realized.
- the flick direction is not limited to the example of FIG. 34.
- in the example of FIG. 34, a predetermined portion of the window 80 (for example, a frame portion of the window) is set as the start point of the flick.
- however, the start point of the flick for the window erasing operation may also be inside the window 80.
- according to the example of FIG. 35, the window 80 is erased when a drag operation is performed so as to enter the window 80 from outside and exit on a side different from the entry side, in other words, so as to cut across the window 80. Since such a drag operation is highly similar to the everyday action of striking through unnecessary portions of a document, the window 80 can be erased intuitively. For this reason, high operability can be realized.
- the drag direction is not limited to the example of FIG. 35. Other gestures, specifically flicks, may also be employed for the window erasing operation in the example of FIG. 35.
- according to the example of FIG. 36, the window 80 is erased by performing a drag operation so as to sandwich the window 80. Such a drag operation is highly similar to the everyday action of closing a room's window and shutting the outside scenery from view, so the window 80 can be erased intuitively. For this reason, high operability can be realized.
- in FIG. 36, a two-point moving type pinch-in is illustrated as the drag operation that sandwiches the window 80, but a one-point moving type pinch-in may also be employed. Further, the pinch-in direction is not limited to the example of FIG. 36. The end of the drag may also be a flick.
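- The three erase gestures can be told apart with simple geometry. A hedged sketch follows (the straight-line path approximation and helper names are assumptions), again reusing the `Window` type and `is_on_frame` test from the move sketch:

```python
def contains(win: "Window", px: float, py: float) -> bool:
    return win.x <= px <= win.x + win.w and win.y <= py <= win.y + win.h

def is_erase_flick(win: "Window", start: tuple[float, float]) -> bool:
    """FIG. 34 style: a flick whose start point is on the window frame."""
    return is_on_frame(win, *start)

def cuts_across(win: "Window", start: tuple[float, float],
                end: tuple[float, float], samples: int = 20) -> bool:
    """FIG. 35 style: the drag starts and ends outside the window while its
    (approximated) path passes through it."""
    if contains(win, *start) or contains(win, *end):
        return False
    return any(
        contains(win,
                 start[0] + t / samples * (end[0] - start[0]),
                 start[1] + t / samples * (end[1] - start[1]))
        for t in range(samples + 1)
    )

def sandwiches(win: "Window", p1: tuple[float, float],
               p2: tuple[float, float]) -> bool:
    """FIG. 36 style: the two pinch-in fingers start on opposite sides of the
    window (shown here only for the horizontal case)."""
    left, right = win.x, win.x + win.w
    return (p1[0] < left and p2[0] > right) or (p2[0] < left and p1[0] > right)
```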
- FIGS. 30 to 34 illustrate the case where the first display information outside the window 80 corresponds to the upper layer, but the window control operations are also applicable when the first display information corresponds to the lower layer (see FIG. 22).
- after step S36, the control unit 16 determines in step S37 whether or not step S36 was deletion control of the window 80. If step S36 was not deletion control of the window 80, the processing of the information display device 10 returns to step S31. If step S36 was deletion control of the window 80, the information display device 10 ends the processing flow S30 of FIG. 23 and returns to the processing flow S10 of FIG. 15. Note that the control unit 16 turns off the special operation mode when the processing flow S30 ends.
- FIG. 37 illustrates a processing flow S50 related to automatic deletion of the window 80.
- in step S51, the control unit 16 determines whether or not a state in which no user operation is input to the window inner area 82 has continued for a predetermined operation waiting time. If it is determined that such a no-operation state has continued, the control unit 16 erases the window 80 from the display surface in step S52. After step S52, the processing of the information display device 10 returns to the processing flow S10 (see FIG. 15), that is, to the state before the window 80 is formed. Note that the control unit 16 turns off the special operation mode upon completion of the processing flow S50.
- the processing flow S50 is executed in parallel with the processing flow S30 while the window 80 is displayed. Specifically, step S51 is repeated until the operation waiting time elapses, and step S52 is executed as an interrupt process when the operation waiting time expires.
- according to this, the user does not need to perform a window erasing operation. Therefore, the user does not need to interrupt, for example, an operation on the window outer area 84 in order to perform the window erasing operation.
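- A minimal sketch of this automatic deletion, assuming a timer-per-window implementation (the text specifies only the behavior of steps S51 and S52, not a threading model):

```python
import threading

class AutoEraseTimer:
    """Erases the window when no inner-area operation arrives within the
    operation waiting time (steps S51/S52 of FIG. 37)."""

    def __init__(self, wait_seconds: float, erase_window) -> None:
        self.wait_seconds = wait_seconds
        self.erase_window = erase_window   # callback performing step S52
        self._timer = None

    def start(self) -> None:
        """Begin waiting when the window 80 is formed."""
        self._restart()

    def on_inner_area_operation(self) -> None:
        """Any user operation on the window inner area restarts the wait (S51)."""
        self._restart()

    def _restart(self) -> None:
        if self._timer is not None:
            self._timer.cancel()
        self._timer = threading.Timer(self.wait_seconds, self.erase_window)
        self._timer.daemon = True
        self._timer.start()
```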
- the shape of the window 80 may be set according to the depression angle of the bird's eye view.
- the second display information in the window 80 may also be displayed by a bird's eye view expression of the same depression angle. According to this, by adopting the same expression format, the continuity between the first display information and the second display information is increased, and the user's cognitive load can be reduced.
- various methods of bird's-eye view expression are known, and such known methods can be used here. For example, a method of converting an image drawn as a top view or a front view into a bird's-eye view representation is employed. The generation of the bird's-eye view image may be performed by the overall control unit 42, or by the first image forming unit 44 and the second image forming unit 48.
- the bird's-eye view expression can be applied not only to figures but also to characters and the like.
- in the bird's-eye view conversion, a viewpoint is set for the original image, and the original image is deformed so that parallel lines running in the vertical direction of the original image converge toward the viewpoint.
- in FIGS. 39 and 40, windows 80 are drawn at various positions for convenience, in order to facilitate understanding of the bird's-eye view conversion.
- FIGS. 39 and 40 have different viewpoint positions, and the broken-line rectangle in FIG. 40 corresponds to the window 80 in FIG. 39.
- the rectangular range designated by the user through the window opening operation (see FIGS. 16 and 17) is converted into a substantially trapezoidal shape when the control unit 16 converts it into the bird's-eye view expression.
- the user does not need to specify the formation range of the window 80 while being aware of the bird's-eye view expression of the first display information. That is, if a normal rectangular range is designated, the control unit 16 automatically forms a window 80 that matches the bird's-eye view.
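- The following sketch shows, under a deliberately simplified perspective model (the convergence formula is an assumption; any known bird's-eye conversion could be substituted), why the user's rectangle maps to a substantially trapezoidal window: points on far rows are pulled toward the center line, so vertical parallel lines converge toward the viewpoint.

```python
def to_birds_eye(x: float, y: float, width: float, height: float,
                 strength: float = 0.5) -> tuple[float, float]:
    """Map a point of the flat image (origin at top-left, far edge at y = 0)
    to bird's-eye coordinates. `strength` in (0, 1) controls how strongly
    vertical lines converge; the y coordinate is left unchanged here."""
    cx = width / 2.0
    depth = 1.0 - y / height        # 1.0 at the far (top) edge, 0.0 near
    scale = 1.0 - strength * depth  # far rows are squeezed toward the center
    return cx + (x - cx) * scale, y

def rect_to_trapezoid(x0: float, y0: float, x1: float, y1: float,
                      width: float, height: float) -> list[tuple[float, float]]:
    """The four corners of the rectangular user-specified range map to a
    trapezoid, which becomes the outline of the window 80."""
    return [to_birds_eye(px, py, width, height)
            for (px, py) in ((x0, y0), (x1, y0), (x1, y1), (x0, y1))]

if __name__ == "__main__":
    # A centered rectangle on an 800x600 surface: the far (top) edge comes
    # out shorter than the near (bottom) edge.
    print(rect_to_trapezoid(200, 100, 600, 400, 800, 600))
```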
- the trapezoidal window 80 may be further deformed; for example, the sides of the trapezoid may be curved.
- FIG. 42 shows a further example of the shape of the window 80.
- in FIG. 42, the first display information is visualized as a map, and a window 80 is formed in accordance with a partition or the like in the map.
- the control unit 16 converts the range specified by the window opening operation according to the following first rule and second rule, and forms the window 80 in the converted range.
- the content of the first rule is that the periphery of the range specified by the window opening operation (the drag trajectory 70 in FIG. 42) is deformed so as to coincide with the boundaries of partitions in the map and with the periphery of the map display area (the entire display surface 32 in the example of FIG. 42).
- in other words, the user-specified range is expanded like an inflating balloon, and the periphery of the user-specified range is made to coincide with partition boundaries in the map.
- the partition boundaries are roads, rivers, administrative partitions, and the like. The reason why the first rule considers the periphery of the map display area in addition to the partition boundaries in the map is to prevent the window 80 from exceeding the display area as a result of the expansion of the user-specified range.
- the content of the second rule is that the user-specified range converted in accordance with the first rule must include the originally specified range at a predetermined ratio or more.
- that is, the second rule sets an upper limit in order to prevent the window 80 from becoming excessively larger than the range specified by the user.
- basically, the window 80 is formed so as to accommodate the entire initially user-specified range.
- however, the window 80 may partially retract from the initially user-specified range. For example, in order to find a partition boundary that satisfies the second rule, a partition boundary at a position retracted from the partition boundary first selected as a candidate may be selected instead.
- the adoption of the first and second rules described above eliminates the need for the user to trace complicated partition boundaries. That is, if an ordinary rectangular range is designated, the control unit 16 automatically forms a window 80 that matches the partition boundaries. In addition, a large section obtained by combining a plurality of adjacent partitions can be specified easily.
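- One possible reading of the two rules is sketched below (the partition data model, the overlap ordering, and both thresholds are assumptions; the text states the rules, not an algorithm): whole partitions overlapping the user-specified range are accumulated, growth stops at the second rule's upper limit, and the result is rejected if too little of the original range ends up covered.

```python
def form_window_from_partitions(user_area: float, cells: list,
                                max_growth: float = 3.0,
                                min_coverage: float = 0.8):
    """`cells`: partitions overlapping the user-specified range, ordered by
    decreasing overlap; each is a dict {'area': float, 'overlap': float},
    where 'overlap' is the area shared with the user-specified range.
    First rule: the window periphery follows partition boundaries, so whole
    cells are selected. Second rule (as read here): keep the window within
    `max_growth` times the specified area while still covering at least
    `min_coverage` of it."""
    selected, total_area, covered = [], 0.0, 0.0
    for cell in cells:
        if total_area + cell["area"] > max_growth * user_area:
            break  # retract rather than over-expand (upper limit)
        selected.append(cell)
        total_area += cell["area"]
        covered += cell["overlap"]
    if covered < min_coverage * user_area:
        return None  # no partition set satisfies the second rule
    return selected
```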
- as described above, the function to be executed is assigned differently for user operations on the window inner area 82 and user operations on the window outer area 84. Therefore, high operability, in other words, high convenience, can be provided. In addition, the various effects described above can be obtained.
- ⁇ Modification> Although the case where there is one window 80 has been illustrated above, a plurality of windows 80 may exist on the display surface 32 at the same time.
- a further window 80 may be formed inside an existing window 80 by performing a window opening operation in the window inner area 82 (window multiplexing). In this case, by treating the area inside the further window 80 as the new window inner area 82, and the area outside the further window 80 but inside the existing window 80 as the new window outer area 84, all of the above description also applies to window multiplexing.
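- Under window multiplexing, the region lookup generalizes naturally: the innermost window containing the operation point supplies the current window inner area 82, and its parent's interior becomes the window outer area 84. A sketch (reusing the `Rect` type from the function-assignment sketch; the depth-labelled return value is an illustration):

```python
def resolve_region(nested_windows: list, x: float, y: float) -> str:
    """`nested_windows` is ordered outermost-first; each element offers a
    contains(x, y) test. Returns 'outer' when the point is outside every
    window, otherwise the nesting depth of the innermost containing window."""
    region = "outer"
    for depth, win in enumerate(nested_windows):
        if win.contains(x, y):
            region = f"inner(depth={depth})"
        else:
            break  # windows are nested, so containment cannot resume deeper
    return region
```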
- in the above, a contact-type touch panel is exemplified as the input unit 14, but a non-contact type (also referred to as a three-dimensional (3D) type) touch panel may be employed instead.
- in the non-contact type, the detectable region of the sensor group (in other words, the input region that can accept user input) is provided as a three-dimensional space above the input surface, and the position of a finger in that three-dimensional space projected onto the input surface is detected.
- Some non-contact types can detect the distance from the input surface to the finger. According to this method, the finger position can be detected as a three-dimensional position, and further, the approach and retreat of the finger can also be detected.
- various systems have been developed as non-contact type touch panels. For example, the projected capacitive system, which is one type of capacitive system, is known.
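- A small sketch of the non-contact detection interface described above (the data type and threshold are assumptions): the sensed 3D finger position yields both its projection onto the input surface and, where supported, a hover distance from which approach and retreat can be inferred.

```python
from typing import NamedTuple, Tuple

class Finger3D(NamedTuple):
    x: float
    y: float
    z: float  # height above the input surface

def project(finger: Finger3D) -> Tuple[Tuple[float, float], float]:
    """Projection onto the input surface plus the distance to the surface."""
    return (finger.x, finger.y), finger.z

def approach_or_retreat(z_prev: float, z_now: float, eps: float = 0.5) -> str:
    """Classify finger motion relative to the surface between two samples."""
    if z_now < z_prev - eps:
        return "approach"
    if z_now > z_prev + eps:
        return "retreat"
    return "hover"
```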
- in the above, the finger is exemplified as the indicator used by the user for input, but a body part other than the finger may also be used as the indicator.
- a tool such as a touch pen (also referred to as a stylus pen) may be used as an indicator.
- a so-called motion sensing technology may be used for the input unit 14.
- Various methods have been developed as motion sensing technology. For example, a method is known in which a user's movement is detected by a user holding or wearing a controller equipped with an acceleration sensor or the like.
- a method of extracting feature points such as fingers from an image captured by a camera and detecting the user's movement from the extraction result is also known.
- An intuitive operation environment is also provided by the input unit 14 using the motion sensing technology.
- although the integrated input/display unit 20 is illustrated above, the display unit 12 and the input unit 14 may be arranged separately. Even in that case, an intuitive operation environment is provided by configuring the input unit 14 with a touch panel or the like.
Abstract
The present invention relates to an information display device comprising a display unit having a display surface, an input unit that receives user operations, and a control unit. If a user operation is a window opening operation that instructs the formation of a window on the display surface and indicates the formation range of that window, the control unit forms a window corresponding to the formation range indicated by the window opening operation. The control unit associates different functions to be executed for user operations on the window inner region, which is the region inside the window, and for user operations on the window outer region, which is the region outside the window.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2014541844A JP5921703B2 (ja) | 2012-10-16 | 2012-10-16 | 情報表示装置および情報表示装置における操作制御方法 |
| PCT/JP2012/076677 WO2014061095A1 (fr) | 2012-10-16 | 2012-10-16 | Dispositif d'affichage d'informations et procédé de commande d'opérations dans un dispositif d'affichage d'informations |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2012/076677 WO2014061095A1 (fr) | 2012-10-16 | 2012-10-16 | Dispositif d'affichage d'informations et procédé de commande d'opérations dans un dispositif d'affichage d'informations |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2014061095A1 true WO2014061095A1 (fr) | 2014-04-24 |
Family
ID=50487684
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2012/076677 Ceased WO2014061095A1 (fr) | 2012-10-16 | 2012-10-16 | Dispositif d'affichage d'informations et procédé de commande d'opérations dans un dispositif d'affichage d'informations |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP5921703B2 (fr) |
| WO (1) | WO2014061095A1 (fr) |
Family Cites Families (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH1195912A (ja) * | 1997-09-22 | 1999-04-09 | Sanyo Electric Co Ltd | 座標入力装置、座標入力方法及び座標入力プログラムを記録したコンピュータ読み取り可能な記録媒体 |
| JPH11119901A (ja) * | 1997-10-16 | 1999-04-30 | Seiko Denshi Kiki Kk | 座標読取システム、その座標変換方法、インタフェース装置および入力方法 |
| JP4244075B2 (ja) * | 1998-03-12 | 2009-03-25 | 株式会社リコー | 画像表示装置 |
| JP2007011953A (ja) * | 2005-07-04 | 2007-01-18 | Genki Kk | 情報処理装置、コマンド入力方法及びプログラム |
| JP2008012199A (ja) * | 2006-07-10 | 2008-01-24 | Aruze Corp | ゲーム装置及びゲーム装置の画像表示制御方法 |
| JP2009282939A (ja) * | 2008-05-26 | 2009-12-03 | Wacom Co Ltd | グラフィックタブレットを関連ディスプレイにマッピングする装置、方法、およびコンピュータ可読媒体 |
| JP2010134625A (ja) * | 2008-12-03 | 2010-06-17 | Sharp Corp | 電子機器、表示制御方法、およびプログラム |
| JP5304707B2 (ja) * | 2010-03-31 | 2013-10-02 | アイシン・エィ・ダブリュ株式会社 | 表示装置、及び、プログラム |
| JP5718042B2 (ja) * | 2010-12-24 | 2015-05-13 | 株式会社ソニー・コンピュータエンタテインメント | タッチ入力処理装置、情報処理装置およびタッチ入力制御方法 |
- 2012-10-16 JP JP2014541844A patent/JP5921703B2/ja not_active Expired - Fee Related
- 2012-10-16 WO PCT/JP2012/076677 patent/WO2014061095A1/fr not_active Ceased
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2002149338A (ja) * | 2000-11-15 | 2002-05-24 | Sony Corp | 情報処理装置および情報処理方法、並びにプログラム格納媒体 |
| WO2010092993A1 (fr) * | 2009-02-13 | 2010-08-19 | 株式会社 東芝 | Dispositif de traitement d'informations |
| JP2011252970A (ja) * | 2010-05-31 | 2011-12-15 | Brother Ind Ltd | 画像表示装置、画像表示方法、及び画像表示プログラム |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2016035706A (ja) * | 2014-08-04 | 2016-03-17 | パナソニックIpマネジメント株式会社 | 表示装置、表示制御方法、及び表示制御プログラム |
| JP2016035705A (ja) * | 2014-08-04 | 2016-03-17 | パナソニックIpマネジメント株式会社 | 表示装置、表示制御方法、及び表示制御プログラム |
| JP2016062393A (ja) * | 2014-09-19 | 2016-04-25 | コニカミノルタ株式会社 | 操作画面表示装置および表示プログラム |
| US10885366B2 (en) | 2018-02-20 | 2021-01-05 | Fujitsu Limited | Input information management apparatus and input information management method |
| JP2022547667A (ja) * | 2019-12-30 | 2022-11-15 | 華為技術有限公司 | ヒューマンコンピュータインタラクション方法、装置、及びシステム |
| JP7413513B2 (ja) | 2019-12-30 | 2024-01-15 | 華為技術有限公司 | ヒューマンコンピュータインタラクション方法、装置、及びシステム |
Also Published As
| Publication number | Publication date |
|---|---|
| JP5921703B2 (ja) | 2016-05-24 |
| JPWO2014061095A1 (ja) | 2016-09-05 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP5738495B2 (ja) | 情報表示装置および表示情報操作方法 | |
| JP5738494B2 (ja) | 情報表示装置および表示情報操作方法 | |
| US20240152223A1 (en) | Systems and Methods for Interacting with Multiple Applications that are Simultaneously Displayed on an Electronic Device with a Touch-Sensitive Display | |
| CN106662964B (zh) | 应用窗口的动态联合划分器 | |
| JP5684291B2 (ja) | オンおよびオフスクリーン・ジェスチャーの組み合わせ | |
| CN106537317B (zh) | 应用窗口的自适应大小调整和定位 | |
| US10318146B2 (en) | Control area for a touch screen | |
| JP5883400B2 (ja) | オンスクリーン入力を作るためのオフスクリーン・ジェスチャー | |
| JP6585876B2 (ja) | マルチディスプレイ装置およびその制御方法 | |
| CN106537318B (zh) | 应用窗口的辅助呈现 | |
| US9250729B2 (en) | Method for manipulating a plurality of non-selected graphical user elements | |
| US20100321319A1 (en) | Method for displaying and updating a view of a graphical scene in response to commands via a touch-sensitive device | |
| EP3590034B1 (fr) | Système et méthode pour interagir avec plusieurs applications affichées simultanément sur un appareil électronique avec écran tactile | |
| JP6303772B2 (ja) | 入力制御装置、制御方法および制御プログラム | |
| JP2009025920A (ja) | 情報処理装置及びその制御方法、コンピュータプログラム | |
| KR20130099186A (ko) | 표시 장치, 유저 인터페이스 방법, 및 프로그램 | |
| KR101586559B1 (ko) | 정보 처리 장치 및 정보 처리 방법 | |
| JP5921703B2 (ja) | 情報表示装置および情報表示装置における操作制御方法 | |
| JP6000367B2 (ja) | 情報表示装置および情報表示方法 | |
| US20240184443A1 (en) | Display control method, electronic device, and readable storage medium | |
| US10402071B1 (en) | Simultaneous zoom in windows on a touch sensitive device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12886816; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2014541844; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 12886816; Country of ref document: EP; Kind code of ref document: A1 |