US9557815B2 - Electronic device - Google Patents
- Publication number
- US9557815B2 (application Ser. No. 13/950,751)
- Authority
- US
- United States
- Prior art keywords
- user
- area
- reference area
- key input
- electronic device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
Definitions
- the present disclosure relates to an electronic device which permits touch operation by a user.
- a so-called software keyboard is often used, which allows text input to be made via a touch on a keyboard that is displayed on the screen.
- because a software keyboard does not present a contoured surface as actual keyboards do, there is a problem in that a reference area on the keyboard (i.e., a home position, such as the positions of “F” and “J” on a QWERTY keyboard, or the position of “5” on a numeric keypad) is difficult to recognize.
- Patent Document 1 discloses a technique of presenting tactile sensations, via different vibrations depending on whether a home position key is touched or any key other than the home position key is touched, thus allowing the user to recognize the home position.
- the present disclosure provides an electronic device which can easily perform an input operation that is intended by a user.
- An electronic device comprises: a display section for displaying an operation area including a reference area and a non-reference area; a touch panel provided on a display surface side of the display section; an informing section for informing a user of an operation status of the user; a control section for controlling an operation of the informing section; and a distinction section for distinguishing between a first operation where the user searches for the reference area on the touch panel and a second operation which is a key input operation to the operation area, wherein the control section controls the operation of the informing section based on a result of distinction by the distinction section.
- a program is a program for causing an electronic device to execute an operation of informing a user of an operation status of the user on a touch panel, the program causing the electronic device to execute: a step of distinguishing between a first operation where the user searches for a reference area on the touch panel and a second operation which is a key input operation to the operation area; and a step of controlling an operation of informing the user based on a result of distinction.
- an operation where a user searches for a reference area on a touch panel is distinguished from a key input operation to an operation area. This prevents an inadvertent key input from being made when a key that is not intended for input is touched. Moreover, by controlling the operation of an informing section based on the result of distinction, the user is able to easily recognize the current operation status.
- FIG. 1 is a perspective view showing the appearance of an electronic device according to an embodiment.
- FIG. 2 is a block diagram showing the construction of an electronic device according to an embodiment.
- FIG. 3 is a cross-sectional view of an electronic device according to an embodiment.
- FIG. 4 is a perspective view showing the appearance of a vibration section according to an embodiment.
- FIG. 5 is a diagram showing an exemplary displayed image of an electronic device according to an embodiment.
- FIG. 6 is a flowchart showing processing by an electronic device according to an embodiment.
- FIGS. 7A and 7B are diagrams showing vibration patterns according to an embodiment.
- FIG. 8 is a diagram showing an exemplary displayed image on an electronic device according to an embodiment.
- FIG. 9 is a flowchart showing processing by an electronic device according to an embodiment.
- FIG. 10 is a flowchart showing processing by an electronic device according to an embodiment.
- FIG. 11 is a flowchart showing processing by an electronic device according to an embodiment.
- FIG. 12 is a diagram showing an exemplary displayed image on an electronic device according to an embodiment.
- FIG. 13 is a flowchart showing processing by an electronic device according to an embodiment.
- in Patent Document 1, depending on whether the home position key is touched or any key other than the home position key is touched, tactile sensations are presented through different vibrations.
- since any text input on a software keyboard is made via a touch, even when a wrong key (one that was not intended for the text input) is touched, the character which is designated for that wrong key will be input.
- a motion of lightly swiping across the keyboard surface with a finger may be made in search of a home position; however, even such a swiping motion by itself may result in an inadvertent text input.
- Patent Document 1 fails to disclose anything about making a distinction between a finger-swiping motion in search of a home position and a motion of striking the keyboard for making an input. Therefore, the construction of Patent Document 1 may allow an input operation to inadvertently occur during a motion of searching for a home position, whereby an undesired input may be performed.
- the present disclosure provides an electronic device which can easily perform an input operation that is intended by a user.
- FIG. 1 is a perspective view showing the appearance of the electronic device 10 .
- FIG. 2 is a block diagram showing the construction of the electronic device 10 .
- the electronic device 10 mainly includes: a display section 12 ; a touch panel 11 covering the display section 12 ; a vibration section 13 for vibrating the touch panel 11 ; a vibration control section 33 for controlling the vibration of the vibration section 13 ; and an acceleration sensor 16 for measuring any acceleration of the electronic device 10 that may occur in response to a touch operation by a user.
- the display section 12 displays characters, numbers, figures, keyboards, and the like.
- the display section 12 includes an operation area 45 which is a region that is manipulable by the user.
- the operation area 45 displays an image of something which accepts an input made by the user, e.g., a keyboard. By performing a touch operation at an arbitrary position on the keyboard which is displayed in the display section 12 , the user is able to make a text input, etc.
- the operation area 45 further includes a reference area(s) and a non-reference area(s), which will be described in detail later.
- a display device utilizing any known technique may be used, e.g., liquid crystal, organic EL, electronic paper, or plasma.
- the display control section 32 controls the content displayed by the display section 12 based on a control signal which is generated by the microcomputer 20 .
- the touch panel 11 is disposed on the display surface of the display section 12 , so as to at least cover the operation area 45 .
- the user is able to manipulate the electronic device 10 through a touch operation on the touch panel 11 with a finger, a pen, or the like.
- the touch panel 11 is able to detect a touched position of the user.
- the information of the user's touched position is sent to the microcomputer 20 , via a touch panel control section 31 . By using the information of the user's touched position, the microcomputer 20 performs various processes described later.
- the touch panel 11 may be a touch panel of, for example, an electrostatic type, a resistive membrane type, an optical type, an ultrasonic-electromagnetic type, etc.
- the vibration section 13 vibrates the touch panel 11 .
- the vibration section 13 is an example of a tactile presentation section which provides the user with tactile sensations.
- the vibration control section 33 controls the vibration pattern of the vibration section 13 . The construction of the vibration section 13 and the detailed vibration patterns will be described later.
- a camera 15 which is mounted on the electronic device 10 is controlled by a camera control section 35 .
- An external communications section 36 is a communications means which enables communications over the Internet, communications with a personal computer, and so on, for example.
- the acceleration sensor 16 measures an acceleration of the electronic device 10 that occurs in response to a touch operation by the user. Specifically, when the user performs an operation of swiping across the touch panel surface with a finger, little impact is caused by the user's operation on the electronic device 10 , so that only a small acceleration is measured by the acceleration sensor 16 . On the other hand, when the user inputs a character by touching on an arbitrary character displayed in the operation area 45 , the user's finger undergoes a motion of tapping on the touch panel 11 . In such a motion, the user's operation causes a substantial impact on the electronic device 10 , thus resulting in a substantial acceleration being measured by the acceleration sensor 16 .
- the acceleration sensor 16 is able to measure the acceleration occurring in response to such an operation of the user.
- the acceleration which has been measured by the acceleration sensor 16 is sent to the microcomputer 20 .
- based on the magnitude of this acceleration, the microcomputer 20 sends a control signal concerning a vibration pattern to the vibration control section 33.
- the vibration control section 33 vibrates the vibration section 13 in a vibration pattern which is varied in accordance with the magnitude of acceleration.
- the acceleration sensor 16 is an example of a distinction section for distinguishing a first operation of the user searching for a reference area on the touch panel 11 from a second operation which is an actual key input operation made to the operation area 45 .
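The distinction drawn by the acceleration sensor 16 amounts to a threshold test on the acceleration measured perpendicular to the display surface. The sketch below illustrates that idea; the threshold value, units, and function names are assumptions for illustration, not figures from this disclosure.

```python
# Minimal sketch of the distinction section's logic: a swipe barely
# jolts the device (small acceleration), while a keystroke taps the
# panel and produces a larger acceleration spike.

TAP_THRESHOLD = 2.0  # assumed acceleration threshold, m/s^2 (illustrative)

def classify_touch(acceleration):
    """Classify one touch by the acceleration it caused on the device.

    Accelerations above the threshold are treated as the second
    operation (a key input); accelerations at or below it are treated
    as the first operation (searching for the reference area).
    """
    if acceleration > TAP_THRESHOLD:
        return "keystroke"  # second operation: key input to the operation area
    return "swipe"          # first operation: searching for the reference area
```

Note that, as in the text, an acceleration exactly equal to the threshold is classified as a swipe.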
- the electronic device 10 includes a loudspeaker 17 for generating audio and an input/output section 37 of any of various types which is capable of handling input/output from or to various electronic devices.
- FIG. 3 is a cross-sectional view of the electronic device 10 .
- the electronic device 10 of the present embodiment is constructed so that the display section 12 , the vibration section 13 , and the circuit board 19 are accommodated in the housing 14 .
- the microcomputer 20 , a RAM 39 , a ROM 38 , various control sections, a power supply, and the like are disposed on the circuit board 19 .
- the vibration section 13, which is mounted on the touch panel 11, vibrates the touch panel 11 so as to allow the user to experience tactile sensations.
- the touch panel 11 is disposed on the housing 14 via spacers 18, the spacers 18 suppressing transmission of the vibration of the touch panel 11 to the housing 14.
- the spacers 18 may be cushioning members of silicone rubber, urethane rubber, or the like, for example.
- the display section 12 is placed within the housing 14 , and the touch panel 11 covers the display section 12 .
- the touch panel 11 , the vibration section 13 , and the display section 12 are electrically connected to a circuit board.
- FIG. 4 is a perspective view of the vibration section 13 according to the present embodiment.
- the vibration section 13 includes piezoelectric elements 21 , a shim 22 , and bases 23 .
- the piezoelectric elements 21 are adhesively bonded to both sides of the shim 22.
- Both ends of the shim 22 are connected to the bases 23 , thus realizing a so-called simple beam construction.
- the bases 23 are connected to the touch panel 11 .
- the piezoelectric elements 21 are pieces of a piezoelectric ceramic such as lead zirconate titanate or a piezoelectric single crystal such as lithium niobate. With a voltage being applied from the vibration control section 33 , the piezoelectric elements 21 expand or contract. By controlling them so that one of the piezoelectric elements 21 , attached on both sides of the shim 22 , expands while the other shrinks, flexural vibrations are caused in the shim.
- the shim 22 is a spring member of e.g. phosphor bronze.
- the vibration of the shim 22 causes the touch panel 11 to also vibrate, whereby the user operating on the touch panel is able to detect the vibration of the touch panel.
- the bases 23 are made of a metal such as aluminum or brass, or a plastic such as PET or PP.
- the frequency, amplitude, and period of the vibration are controlled by the vibration control section 33 .
- the frequency of vibration is about 100 to 400 Hz.
- the piezoelectric elements 21 may be attached directly onto the touch panel 11 .
- a thin-film piezoelectric member may be formed on the touch panel 11 by a method such as sputtering, so as to be used as the vibration section 13 .
- the piezoelectric elements 21 may be attached on that cover member.
- a vibration motor may be used.
- FIG. 5 is an exemplary displayed image on the electronic device 10 , representing a numeric keypad.
- by touching the numeric keypad displayed in the display section 12 of the electronic device 10, the user enters a number, which is then displayed in an input area 46.
- in the display section 12 are displayed a home position key 41, ordinary keys 42, an ENTER key 43, an END key 44, and the input area 46.
- An operation area (effective area) 45 defines a region containing the home position key 41 , the ordinary keys 42 , the ENTER key 43 , and the END key 44 .
- the home position key 41 is the “5” key, which is a key in the central position of the numeric keypad.
- the home position key 41 is exemplary of a reference area, whereas the ordinary keys 42 are exemplary of non-reference areas.
- the ordinary keys 42 are any numeric keys other than the home position key 41 . By striking the home position key 41 or an ordinary key 42 , the user is able to input a number in the input area 46 .
- the ENTER key 43 is a key for finalizing what has been input by the user. After a number is input, if the user strikes the ENTER key 43 , the number which has been input in the input area 46 is sent as information to the microcomputer 20 . This ends the input operation, after which the electronic device 10 will follow instructions from the microcomputer 20 .
- the END key 44 is a key for ending the input operation. At any time during the number inputting or after the number inputting, the user is able to end the input operation by striking the END key 44 , without allowing numeric information to be sent to the microcomputer 20 .
- FIG. 6 is a flowchart showing a flow of processes of text inputting in the present embodiment, where “S” stands for “step”.
- the microcomputer 20 determines whether the user's touch constitutes a swiping operation or a keystroke operation, based on a result from the distinction section.
- the distinction section is exemplified as the acceleration sensor 16 , which senses any acceleration in a perpendicular direction to the display surface of the display section 12 . If the acceleration exceeds a certain threshold value, the microcomputer 20 recognizes a keystroke operation; if it is equal to or less than the threshold value, the microcomputer 20 recognizes a swiping operation.
- if a swiping operation is recognized, control proceeds to S 17.
- at S 17, information of the position on the touch panel 11 which is being swiped on by the user is acquired from the touch panel control section 31, and the microcomputer 20 determines whether the position is on the home position key 41 or not.
- if the swiped position is on the home position key 41, control proceeds to S 18.
- the vibration control section 33 controls the vibration section 13 to present tactile sensation A to the user. By feeling tactile sensation A through the touching finger, the user is able to know that the finger is on the home position key 41 .
- control returns to S 11 to wait for a touch operation to occur.
- if a keystroke operation is recognized, control proceeds to S 13.
- at S 13, the microcomputer 20 determines whether the position of the user's keystroke is within the operation area 45. If it is determined that the user has struck anywhere other than the operation area 45, control returns to S 11 to again wait until a touch occurs.
- if the keystroke is within the operation area 45, control proceeds to S 14.
- the vibration control section 33 controls the vibration section 13 to present tactile sensation B to the user. By feeling tactile sensation B through the touching finger, the user is able to know that a text input or a termination process has been made.
- control proceeds to S 16 .
- at S 16, a text input operation is performed, causing the number indicated at the struck position to be displayed in the input area 46, and control returns to S 11 to wait for the next touch operation.
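The FIG. 6 flow above can be condensed into a small dispatch routine. This is an illustrative sketch under simplified assumptions: the key names are hypothetical, the ENTER/END branch is folded into the keystroke path, and each call handles one touch event already classified as a swipe or a keystroke by the distinction step.

```python
# Condensed sketch of the FIG. 6 processing for the numeric keypad.
# Returns a pair (tactile sensation to present, key to input), using
# None where no vibration or no input occurs.

HOME_KEY = "5"  # the reference area of the numeric keypad
OPERATION_AREA = set("0123456789") | {"ENTER", "END"}

def handle_touch(kind, key):
    """Process one touch event.

    kind: "swipe" or "keystroke" (result of the distinction step);
    key:  the key under the touched position, or None if the touch
          is outside the operation area.
    """
    if kind == "swipe":
        # S17/S18: vibrate only while the finger rests on the home position.
        if key == HOME_KEY:
            return ("A", None)   # tactile sensation A, no input is made
        return (None, None)      # keep waiting (back to S11)
    # keystroke path (S13 onward)
    if key not in OPERATION_AREA:
        return (None, None)      # struck outside the operation area
    return ("B", key)            # S14/S16: tactile sensation B plus key input
```

The key point mirrored from the text: a swipe never produces an input, even directly over a numeric key.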
- FIGS. 7A and 7B are schematic illustrations of exemplary vibration patterns according to Embodiment 1.
- based on an instruction from the microcomputer 20, the vibration control section 33 applies a voltage of a waveform as shown in FIG. 7A to the vibration section 13 so as to vibrate the touch panel 11, thereby presenting tactile sensation A to the user.
- the voltage for presenting tactile sensation A may be a sine wave of 150 Hz, 70 Vrms, 2 cycles. In this case, there is about a 5 μm amplitude on the touch panel 11.
- the vibration control section 33 applies a voltage as shown in FIG. 7B to the vibration section 13 so as to vibrate the touch panel 11 , thereby presenting tactile sensation B to the user.
- the voltage for presenting tactile sensation B may be a sine wave of 300 Hz, 100 Vrms, 4 cycles.
- the frequency, voltage, and number of cycles are only exemplary; another waveform such as a rectangular wave or a sawtooth wave, an intermittent waveform, a waveform with gradually changing frequency or amplitude, etc., may also be used.
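The two sine bursts described above can be sketched numerically as follows. The stated RMS voltages are converted to peak amplitudes; the sample rate is an assumption for illustration and is not part of the disclosure.

```python
import math

SAMPLE_RATE = 48000  # Hz, assumed sample rate for the sketch

def sine_burst(freq_hz, vrms, cycles, sample_rate=SAMPLE_RATE):
    """Return voltage samples for a sine burst of the given frequency,
    RMS voltage, and number of cycles (peak voltage = Vrms * sqrt(2))."""
    peak = vrms * math.sqrt(2.0)
    n = int(sample_rate * cycles / freq_hz)  # samples spanning the burst
    return [peak * math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]

# Tactile sensation A: 150 Hz, 70 Vrms, 2 cycles (FIG. 7A)
wave_a = sine_burst(150, 70, 2)
# Tactile sensation B: 300 Hz, 100 Vrms, 4 cycles (FIG. 7B)
wave_b = sine_burst(300, 100, 4)
```

A rectangular or sawtooth burst, or a burst with gradually changing amplitude, could be generated the same way by swapping the per-sample function.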
- although the vibration pattern for presenting tactile sensation A and the vibration pattern for presenting tactile sensation B are described above as distinct vibration patterns, this is not a limitation.
- the vibration patterns of tactile sensation A and tactile sensation B may be identical.
- Embodiment 2 Since the electronic device of Embodiment 2 is similar in construction to the electronic device of Embodiment 1 described above, any repetitive description will be omitted, and the differences from Embodiment 1 will be described.
- FIG. 8 shows an exemplary displayed image on the electronic device 10 , where QWERTY keys are displayed in the display section 12 .
- by touching the QWERTY keys indicated in the display section 12 of the electronic device 10, the user inputs characters to the input area 46, where they become displayed.
- in the display section 12 are displayed home position keys 41, ordinary keys 42, an ENTER key 43, an END key 44, and the input area 46.
- the operation area 45 defines a region containing the home position keys 41 , the ordinary keys 42 , the ENTER key 43 , and the END key 44 .
- the home position keys 41 are two keys near the middle of the QWERTY keys, i.e., the “F” key and the “J” key.
- the microcomputer 20 enables an input mode. As a result, text inputting via the user's strokes is enabled, whereby text input is received.
- the microcomputer 20 determines that the user does not recognize the home position keys 41 , and disables the input mode. As a result, text inputting via the user's strokes is disabled, whereby text input is no longer received.
- the ordinary keys 42 are any character keys other than the home position keys 41 . By striking a home position key 41 or an ordinary key 42 , the user is able to input a character to the input area 46 .
- the ENTER key 43 is a key for finalization.
- a character which is input to the input area 46 is sent as information to the microcomputer 20 .
- the input operation is ended, and the electronic device 10 follows instructions from the microcomputer 20 .
- the END key 44 is a key which ends an input operation. At any time during the text inputting or after the text inputting, by striking the END key 44, the user is able to end the input operation, without allowing text information to be sent to the microcomputer 20.
- FIG. 9 is a flowchart showing a text input process according to the present embodiment, where “S” stands for “step”.
- the microcomputer 20 determines whether the user's touch constitutes a swiping operation or a keystroke operation based on a result from the distinction section.
- the distinction section is exemplified as the acceleration sensor 16 , which senses any acceleration in a perpendicular direction to the display surface of the display section 12 . If the acceleration exceeds a certain threshold value, the microcomputer 20 recognizes a keystroke operation; if it is equal to or less than the threshold value, the microcomputer 20 recognizes a swiping operation.
- if a swiping operation is recognized, control proceeds to S 30.
- at S 30, information of the position on the touch panel 11 which is being swiped on by the user is acquired from the touch panel control section 31, and it is determined whether the position is on either home position key 41.
- if the swiped position is on a home position key 41, control proceeds to S 31.
- the vibration control section 33 controls the vibration section 13 to present tactile sensation A to the user. By feeling tactile sensation A through the touching finger, the user is able to know that the finger is on a home position key 41 .
- an input mode enablement determination process is performed at S 32 , and control returns to S 21 . The input mode enablement determination process will be described later.
- control returns to S 21 to wait for a touch operation to be recognized.
- if a keystroke operation is recognized, control proceeds to S 23.
- at S 23, based on information from the touch panel control section 31, it is determined whether the position of the user's keystroke is within the operation area 45. If it is determined that the user has struck anywhere other than the operation area 45, control returns to S 21 to again wait until a touch occurs.
- if the keystroke is within the operation area 45, control proceeds to S 24.
- at S 24, based on information from the touch panel control section 31, it is determined whether the position of the user's keystroke is on the END key 44 or the ENTER key 43.
- the vibration control section 33 controls the vibration section 13 to present tactile sensation C to the user. By feeling tactile sensation C through the touching finger, the user is able to know that a termination process or a finalization of text input has been performed. Thereafter, if the position of the user's keystroke is on the END key 44 , the process is ended. If the position of the user's keystroke is on the ENTER key 43 , input information in the input area 46 is sent to the microcomputer 20 , and thereafter the process is ended.
- control returns to S 21 to wait for a next touch.
- the input mode is disabled, thus making any key input (text input) invalid. In this manner, no text input will be made even if a finger touches an ordinary key 42 before reaching a home position; therefore, the user is prevented from making any unintended text input.
- the input mode is enabled, by disabling text input during any period in which a swiping operation is being made, the user is prevented from making any unintended text input even if he or she makes a swiping operation of moving a finger back to a home position in the middle of a character input operation.
- FIG. 10 is a flowchart showing the input mode enablement determination process to be performed at S 32 .
- at S 42, it is determined whether all home position keys 41 are being touched by the user.
- the microcomputer 20 may determine that both index fingers of the user have recognized “F” and “J” of the QWERTY keys. If it is determined at S 42 that the user is touching all home position keys 41 (Yes from S 42), control proceeds to S 43. If it is determined at S 42 that not all home position keys 41 are being touched (No from S 42), the input mode enablement determination process is ended.
- at S 43, it is determined whether the duration for which the user's touched positions have remained on the home position keys 41 is equal to or longer than a predetermined time.
- the predetermined time is approximately 0.1 s to 2.0 s. If the user has been touching the home position keys 41 for the predetermined time or longer, the microcomputer 20 determines that the user has recognized the home position keys 41 . In order to determine whether the user's touches have remained on the home position keys 41 for the predetermined time or longer, the duration for which the home position keys 41 have been touched may be stored to the RAM 39 at S 30 .
- the input mode may become enabled when the user lifts the fingers off the home position keys 41 after having touched the home position keys 41 for the predetermined time or longer. This prevents the touched keys from being inadvertently input at the moment when the input mode switches from disabled to enabled.
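The enablement check of FIG. 10 can be sketched as a simple predicate over the set of currently touched keys and a dwell timer. The key names and the injected clock values are illustrative assumptions, and the optional lift-off refinement mentioned above is not modeled here.

```python
# Sketch of the input-mode enablement determination (S42/S43).
# The dwell time is chosen from the range stated in the text
# (approximately 0.1 s to 2.0 s).

HOME_KEYS = {"F", "J"}  # the reference areas of the QWERTY layout
DWELL_TIME = 0.5        # seconds, an assumed value within 0.1-2.0 s

def input_mode_enabled(touched_keys, dwell_started_at, now):
    """Return True when every home position key is being touched (S42)
    and the touches have been held for at least DWELL_TIME seconds (S43)."""
    if not HOME_KEYS <= set(touched_keys):  # S42: all home keys touched?
        return False
    return (now - dwell_started_at) >= DWELL_TIME  # S43: held long enough?
```

Requiring both index fingers to dwell on “F” and “J” is what distinguishes a deliberate home-position acquisition from a finger merely passing over those keys.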
- FIG. 11 is a flowchart showing the input mode disablement determination process at S 29 ( FIG. 9 ).
- the duration for which the user has not been touching the operation area 45 is stored in the RAM 39 .
- the predetermined time is approximately 1 s to 10 s. If the user has not been touching the operation area 45 for the predetermined time or longer, the microcomputer 20 determines that the user does not recognize the positions of the home position keys 41 . In order to be able to determine that the user has not been touching the operation area 45 for the predetermined time or longer, the duration for which the user has not been touching the operation area 45 may be stored to the RAM 39 at S 21 .
- the input mode is disabled at S 53 , and the fact that the input mode is disabled is stored to the RAM 39 . Then, the keyboard color is changed at S 54 to visually indicate that the input mode is now disabled, and the input mode disablement determination process is ended.
- the change in input mode may be indicated through an auditory or tactile sensation or the like.
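The disablement side of FIG. 11 is, symmetrically, an idle timer on the operation area. The following sketch assumes an injected clock and a string return value standing in for the visual keyboard-color change; both are illustrative, and the timeout is chosen from the stated range.

```python
# Sketch of the input-mode disablement determination (FIG. 11):
# if the operation area has not been touched for a set time, the
# input mode is disabled and the change is indicated to the user.

IDLE_TIMEOUT = 5.0  # seconds, an assumed value within the stated 1-10 s range

class InputMode:
    def __init__(self):
        self.enabled = True
        self.last_touch_at = 0.0

    def touch(self, now):
        """Record a touch inside the operation area (resets the idle timer)."""
        self.last_touch_at = now

    def check_disable(self, now):
        """Disable the input mode after IDLE_TIMEOUT seconds of no touches
        (S53), returning a token for the visual indication (S54)."""
        if self.enabled and (now - self.last_touch_at) >= IDLE_TIMEOUT:
            self.enabled = False
            return "keyboard-color-changed"  # visual cue of the disabled mode
        return None
```

Once disabled, the mode stays off until the enablement determination of FIG. 10 succeeds again.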
- the electronic device 10 of Embodiment 3 differs from the electronic device 10 of Embodiment 1 in that vibration is generated even while a user is touching a button that is not in the home position (which exemplifies a non-reference area) in order to search for the home position (which exemplifies a reference area). Since the electronic device 10 of Embodiment 3 is otherwise similar in construction and processing to the electronic device 10 of Embodiment 1, any repetitive description will be omitted, and the differences from the electronic device 10 of Embodiment 1 will be described.
- FIG. 13 is a flowchart showing processing by the electronic device 10 of Embodiment 3.
- if the swiped position is not on the home position key 41, control proceeds to S 19.
- the vibration control section 33 controls the vibration section 13 to present tactile sensation C to the user.
- tactile sensation A is presented similarly to Embodiment 1.
- when the user makes a swiping operation in order to search for a home position, he or she is able to feel different tactile sensations depending on whether the home position or anywhere other than the home position is touched. As a result, the user is able to know whether he or she has touched the home position or not. Moreover, tactile sensation C, presented when the user touches anywhere other than the home position, allows the user to realize that he or she is still searching for the home position.
- the vibration pattern of tactile sensation C may be set as appropriate.
- the cycle, amplitude, or the like of the vibration may be altered, which will allow the user to feel a stronger or weaker vibration.
- the vibration patterns for tactile sensation A, tactile sensation B, and tactile sensation C may take any of the following combinations.
- the vibration pattern of tactile sensation A, the vibration pattern of tactile sensation B, and the vibration pattern of tactile sensation C may all be an identical vibration pattern. This allows the user to know which operation he or she is making.
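The Embodiment 3 choice of sensation can be sketched as one selection function: during a swipe, sensation A marks the home position and sensation C marks everything else in the operation area, while a keystroke inside the operation area gets sensation B. The function and parameter names are illustrative assumptions.

```python
# Sketch of the tactile-sensation selection in the FIG. 13 flow.

def pick_sensation(kind, on_home, in_operation_area=True):
    """Return the tactile sensation ('A', 'B', or 'C') to present for
    one touch, or None when no vibration is produced."""
    if not in_operation_area:
        return None              # touches outside the operation area are silent
    if kind == "swipe":
        return "A" if on_home else "C"  # S18 vs S19: searching feedback
    return "B"                   # keystroke feedback (as in Embodiment 1)
```

Because sensation C answers every swipe inside the operation area, the user receives continuous confirmation that the search gesture itself is being recognized.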
- an electronic device 10 includes a display section 12 , a touch panel 11 , a vibration section 13 , a vibration control section 33 , and an acceleration sensor 16 .
- the display section 12 displays an operation area 45, which contains a home position key 41 as an example of a reference area and ordinary keys 42 as examples of non-reference areas.
- the touch panel 11 is disposed so as to at least cover the operation area 45 .
- the vibration section 13 which is an example of an informing section, informs the user of an operation status of the user.
- the vibration control section 33 which is an example of a control section, controls the vibration section 13 .
- the acceleration sensor 16 which is an example of a distinction section, distinguishes between a first operation where the user searches for a reference area on the touch panel 11 and a second operation which is a key input operation made to the operation area 45 .
- the vibration control section 33 controls the vibration section 13 .
- a user's operation is distinguished by the acceleration sensor 16 , and a corresponding vibration is presented to the user. This allows the user to know which operation he or she has made, or whether his or her operation has been recognized by the electronic device 10 . As a result, the user is able to easily make an input to the electronic device 10 .
- the electronic device 10 includes a display section 12 , a touch panel 11 , a touch panel control section 31 , and a microcomputer 20 , which is an example of a control section.
- the display section 12 displays an operation area 45 , which contains home position keys 41 as examples of reference areas and ordinary keys 42 as examples of non-reference areas.
- the touch panel 11 is disposed so as to at least cover the operation area 45 .
- the touch panel control section 31 is able to distinguish whether the user has touched the home position keys 41 (exemplifying reference areas) or the ordinary keys 42 (exemplifying non-reference areas).
- the microcomputer 20 enables user input to the touch panel 11 .
- the above-described operation of the electronic device 10 may be implemented in hardware or software.
- a program for realizing such a control operation may be stored in an internal memory of the microcomputer 20 or a ROM 38 .
- such a computer program may be installed in the electronic device 10 from a storage medium (an optical disk, a semiconductor memory, etc.) on which it is recorded, or downloaded via telecommunication lines such as the Internet.
- although Embodiments 1 to 3 have been illustrated as embodiments of the present disclosure, the present disclosure is not limited thereto. For example, the following embodiments also fall within the present disclosure.
- the informing section is not limited to the vibration section 13 .
- the informing section may be a loudspeaker 17 which provides information to the user in the form of audio.
- the informing section may be of a construction such that information is provided to the user in the form of light.
- Such a construction can be realized by the display control section 32 controlling the display section 12 , for example.
- the informing section may be of a construction such that information is provided to the user in the form of heat or an electric shock.
- FIG. 12 shows an exemplary displayed image on the electronic device 10 , which indicates operation keys of a car navigation system, a car radio, or the like.
- by touching the operation keys displayed in the display section 12 of the electronic device 10 , the user is able to manipulate the car navigation system, the car radio, or the like.
- in the display section 12 , a home position key 41 , ordinary keys 42 , an ENTER key 43 , and an END key 44 are indicated.
- the operation area defines a region containing the home position key 41 , the ordinary keys 42 , the ENTER key 43 , and the END key 44 .
- the user may search for the home position key 41 through a swiping operation, infer the positions of various operation keys based on relative positions therefrom, and select an ordinary key through a keystroke operation, whereby the user can play, stop, or turn up or down the sound volume of music or video information, or change channels, etc.
- the distinction section is not limited to the acceleration sensor 16 .
- when the vibration section 13 is a vibrator incorporating the piezoelectric elements 21 , the vibration of the piezoelectric elements 21 may be converted into a voltage through piezoelectric effects, this voltage being supplied to the vibration control section 33 .
- the user's keystroke operation is converted by the vibration section 13 into a voltage, which is then output to the vibration control section 33 .
- the microcomputer 20 is able to recognize that the user has made a keystroke operation.
- the determination as to whether the user's operation is a keystroke operation or a swiping operation may also be made by the touch panel control section 31 .
- when the touch panel 11 is a resistive membrane type, the area of contact between the upper and lower layers can be known from a resistance value; from the rate of temporal change in this area of contact, a swiping operation can be distinguished from a keystroke operation, or vice versa.
- when the touch panel 11 is an electrostatic type, the area of contact between the user's finger and the touch panel 11 , or whether the user's finger has approached the touch panel 11 or not, can be recognized as a change in electrostatic capacity.
- a swiping operation can be distinguished from a keystroke operation, or vice versa. Specifically, a small change in electrostatic capacity occurs in a swiping operation, whereas a large change in electrostatic capacity occurs in a keystroke operation.
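The distinction described in the bullets above, whether by the rate of change in contact area (resistive type) or by the magnitude of the capacity change (electrostatic type), reduces to a simple threshold test. The sampling interval and threshold below are assumed values chosen for illustration, not figures from the patent.

```python
def classify_touch(contact_areas, dt, rate_threshold=500.0):
    """Distinguish a swipe from a keystroke by the temporal change in the
    contact area (or, analogously, in the electrostatic capacity).

    A keystroke produces a large, abrupt change between consecutive
    samples; a swipe produces only a small, gradual one.
    contact_areas: sampled contact-area values; dt: sample interval (s);
    rate_threshold: assumed boundary rate (area units per second).
    """
    if len(contact_areas) < 2:
        raise ValueError("need at least two samples")
    # peak rate of change between consecutive samples
    peak_rate = max(abs(b - a) for a, b in zip(contact_areas, contact_areas[1:])) / dt
    return "keystroke" if peak_rate >= rate_threshold else "swipe"
```

A finger landing on the panel makes the contact area jump from zero to its full value within one or two samples, while a sliding finger keeps a nearly constant contact area, which is why a single rate threshold suffices in this sketch.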
- although a tablet-type personal digital assistant is illustrated as an example electronic device, the electronic device according to the embodiments is not limited thereto.
- the embodiments are applicable to any electronic device having a touch panel, e.g., a mobile phone, a PDA, a game machine, a car navigation system, or an ATM.
- although Embodiments 1 to 3 each illustrate a touch panel that covers the entire display surface of the display section 12 , this is not a limitation.
- a touch panel function may be provided only in a central portion of the display surface, while the peripheral portion may not be covered by anything that confers a touch panel function.
- the touch panel may at least cover the input operation area of the display section.
- although the touch panel 11 and the display section 12 are illustrated as separate members in Embodiments 1 to 3, this is not a limitation.
- the touch panel 11 may be adhesively bonded to the display section 12 .
- the display section 12 may have a function of detecting touched positions. In other words, it suffices if touched positions can be detected.
- although Embodiments 1 to 3 illustrate that the touch panel 11 is vibrated, this is not a limitation.
- the cover glass may be vibrated. In other words, it suffices if any member that is touched by the user is vibrated.
- an electronic device 10 comprises: a display section 12 for displaying an operation area 45 including a reference area 41 and a non-reference area 42 ; a touch panel 11 provided on a display surface side of the display section 12 ; an informing section 13 for informing a user of an operation status of the user; a control section 20 for controlling an operation of the informing section 13 ; and a distinction section 16 for distinguishing between a first operation where the user searches for the reference area 41 on the touch panel 11 and a second operation which is a key input operation to the operation area 45 , wherein the control section 20 controls the operation of the informing section 13 based on a result of distinction by the distinction section 16 .
- the informing section 13 is a tactile presentation section for presenting a tactile sensation to the user.
- the tactile presentation section is a vibration section for vibrating the touch panel 11 .
- the first operation is an operation where a touched position of the user on the touch panel 11 gradually changes; and the second operation is a keystroke operation on the touch panel 11 .
- the control section 20 varies the operation of the vibration section depending on whether the user is touching the non-reference area 42 or the reference area 41 in the first operation.
- the control section 20 disables key input to the operation area 45 during the first operation.
- the control section 20 enables key input to the operation area 45 , so that key input to the operation area 45 is no longer disabled.
- the control section 20 enables key input to the operation area 45 when the touch on the reference area 41 ceases, so that key input to the operation area 45 is no longer disabled.
- when the user has not touched the operation area 45 for a predetermined time or longer, the control section 20 disables key input to the operation area 45 , so that key input to the operation area 45 is no longer enabled.
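The enable/disable behaviour in the preceding bullets can be modelled as a small state machine. The event names and the timeout value below are assumptions made for the sketch; the patent only specifies the resulting enabled/disabled states.

```python
class KeyInputGate:
    """Tracks whether key input to the operation area is enabled.

    Sketch of the gating behaviour described above, under assumed events:
    - key input is disabled while the user searches for the reference area
      (the first operation);
    - key input is enabled when the touch on the reference area ceases;
    - key input is disabled again after no touch for a predetermined time.
    """

    def __init__(self, timeout=5.0):
        self.enabled = False
        self.timeout = timeout          # assumed "predetermined time", in seconds
        self.last_touch_time = 0.0

    def on_search_started(self, t):
        # first operation in progress: swiping must not register keystrokes
        self.enabled = False
        self.last_touch_time = t

    def on_reference_touch_released(self, t):
        # the touch on the reference area ceased: allow key input
        self.enabled = True
        self.last_touch_time = t

    def on_tick(self, t):
        # periodic check: disable after the idle timeout elapses
        if t - self.last_touch_time >= self.timeout:
            self.enabled = False
```

The point of the gate is that a searching finger can sweep across ordinary keys without triggering them; only after the user has located the reference area and lifted the finger does the operation area accept keystrokes.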
- the control section 20 varies a vibration pattern of the vibration section depending on whether the operation is the first operation or the second operation.
- the control section 20 controls the vibration section to vibrate in a first vibration pattern when the user touches the reference area 41 in the first operation, and controls the vibration section to vibrate in a second vibration pattern, different from the first vibration pattern, when the user touches the operation area 45 in the second operation.
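The two bullets above map each combination of operation and touched area to a distinct vibration. A minimal lookup, with hypothetical pattern names standing in for whatever waveforms the vibration section can produce:

```python
def vibration_pattern(operation, area):
    """Select an assumed vibration pattern from the operation and the
    touched area. 'first' is the search operation, 'second' the key input
    operation; pattern names here are placeholders, not from the patent.
    """
    if operation == "first":
        # distinct feedback while searching, so the user can tell the
        # reference area from a non-reference area by touch alone
        return "first_pattern" if area == "reference" else "search_pattern"
    if operation == "second":
        # confirmation that the keystroke was recognized
        return "second_pattern"
    return None  # no touch, no feedback
```

Because the pattern differs between the two operations, the user can feel both *which* operation the device recognized and *where* the finger is, without looking at the screen.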
- a program according to an embodiment is a program for causing an electronic device 10 to execute an operation of informing a user of the user's operation status on a touch panel 11 , the program causing the electronic device 10 to execute: a step of distinguishing between a first operation where the user searches for a reference area 41 on the touch panel 11 and a second operation which is a key input operation to an operation area 45 ; and a step of controlling an operation of informing the user based on a result of the distinction.
- the present disclosure is useful for any electronic device that is capable of touch operation by the user.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims (11)
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2011-022377 | 2011-02-04 | ||
| JP2011022377 | 2011-02-04 | ||
| PCT/JP2012/000039 WO2012105158A1 (en) | 2011-02-04 | 2012-01-05 | Electronic device |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2012/000039 Continuation WO2012105158A1 (en) | 2011-02-04 | 2012-01-05 | Electronic device |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20130307804A1 US20130307804A1 (en) | 2013-11-21 |
| US9557815B2 true US9557815B2 (en) | 2017-01-31 |
Family
ID=46602388
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/950,751 Active 2032-07-20 US9557815B2 (en) | 2011-02-04 | 2013-07-25 | Electronic device |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US9557815B2 (en) |
| JP (1) | JP5496337B2 (en) |
| WO (1) | WO2012105158A1 (en) |
Families Citing this family (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP5723832B2 (en) * | 2012-05-31 | 2015-05-27 | 京セラドキュメントソリューションズ株式会社 | Input device |
| US9268424B2 (en) * | 2012-07-18 | 2016-02-23 | Sony Corporation | Mobile client device, operation method, recording medium, and operation system |
| US8896524B2 (en) * | 2012-08-24 | 2014-11-25 | Immersion Corporation | Context-dependent haptic confirmation system |
| JP2014061753A (en) * | 2012-09-20 | 2014-04-10 | Yuhshin Co Ltd | On-vehicle equipment operating device |
| US9547430B2 (en) * | 2012-10-10 | 2017-01-17 | Microsoft Technology Licensing, Llc | Provision of haptic feedback for localization and data input |
| JP5913771B2 (en) * | 2013-04-01 | 2016-04-27 | レノボ・シンガポール・プライベート・リミテッド | Touch display input system and input panel display method |
| US10684774B2 (en) * | 2014-09-09 | 2020-06-16 | Touchtype Limited | Systems and methods for multiuse of keys for virtual keyboard and generating animation associated with a key |
| JP6483379B2 (en) * | 2014-09-10 | 2019-03-13 | 三菱電機株式会社 | Tactile sensation control system and tactile sensation control method |
| JP6097268B2 (en) * | 2014-12-02 | 2017-03-15 | レノボ・シンガポール・プライベート・リミテッド | Input device, software keyboard display method thereof, and computer-executable program |
| JP5891324B2 (en) * | 2015-03-25 | 2016-03-22 | 京セラドキュメントソリューションズ株式会社 | Input device |
| JP7142196B2 (en) * | 2016-12-27 | 2022-09-27 | パナソニックIpマネジメント株式会社 | ELECTRONIC DEVICE, TABLET TERMINAL, INPUT CONTROL METHOD, AND PROGRAM |
| AT520031A1 (en) * | 2017-06-07 | 2018-12-15 | Caretec Int Gmbh | Apparatus and methods of machine writing and virtual reading of volatile tactile characters and acoustic sounds |
| JP7321857B2 (en) * | 2019-09-18 | 2023-08-07 | 株式会社東芝 | optical imaging device |
| WO2022110117A1 (en) * | 2020-11-30 | 2022-06-02 | 京东方科技集团股份有限公司 | Display panel and driving method therefor, and display apparatus |
Citations (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20020149561A1 (en) | 2000-08-08 | 2002-10-17 | Masaaki Fukumoto | Electronic apparatus vibration generator, vibratory informing method and method for controlling information |
| JP2004110388A (en) | 2002-09-18 | 2004-04-08 | Sharp Corp | Device with touch panel |
| US20050085215A1 (en) * | 2003-10-21 | 2005-04-21 | Nokia Corporation | Method and related apparatus for emergency calling in a touch screen mobile phone from a touch screen and keypad lock active state |
| JP2005129072A (en) | 2004-11-05 | 2005-05-19 | Oki Electric Ind Co Ltd | Automatic transaction device and input unit |
| JP2005186847A (en) | 2003-12-26 | 2005-07-14 | Alpine Electronics Inc | Input control device and input accepting method |
| US20070091070A1 (en) * | 2005-10-20 | 2007-04-26 | Microsoft Corporation | Keyboard with integrated key and touchpad |
| US20070236474A1 (en) | 2006-04-10 | 2007-10-11 | Immersion Corporation | Touch Panel with a Haptically Generated Reference Key |
| JP2008033739A (en) | 2006-07-31 | 2008-02-14 | Sony Corp | Touch screen interaction method and apparatus based on force feedback and pressure measurement |
| US20090006991A1 (en) * | 2007-06-29 | 2009-01-01 | Nokia Corporation | Unlocking a touch screen device |
| US20100167693A1 (en) * | 2006-02-08 | 2010-07-01 | Eiko Yamada | Mobile terminal, mobile terminal control method, mobile terminal control program, and recording medium |
| US20100253652A1 (en) | 2009-04-03 | 2010-10-07 | Fuminori Homma | Information processing apparatus, notification method, and program |
| US20110167375A1 (en) * | 2010-01-06 | 2011-07-07 | Kocienda Kenneth L | Apparatus and Method for Conditionally Enabling or Disabling Soft Buttons |
Family Cites Families (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP4484255B2 (en) * | 1996-06-11 | 2010-06-16 | 株式会社日立製作所 | Information processing apparatus having touch panel and information processing method |
| JP3987182B2 (en) * | 1998-01-26 | 2007-10-03 | Idec株式会社 | Information display device and operation input device |
| JP2001356861A (en) * | 2000-06-13 | 2001-12-26 | Olympus Optical Co Ltd | Tactile sense presenting device and method |
| JP3673191B2 (en) * | 2001-06-27 | 2005-07-20 | 沖電気工業株式会社 | Automatic transaction equipment |
| JP3888099B2 (en) * | 2001-08-17 | 2007-02-28 | 富士ゼロックス株式会社 | Touch panel device |
| JP4500485B2 (en) * | 2002-08-28 | 2010-07-14 | 株式会社日立製作所 | Display device with touch panel |
| JP4358846B2 (en) * | 2006-08-15 | 2009-11-04 | 株式会社エヌ・ティ・ティ・ドコモ | Mobile terminal device and operation support method thereof |
| JP4897596B2 (en) * | 2007-07-12 | 2012-03-14 | ソニー株式会社 | INPUT DEVICE, STORAGE MEDIUM, INFORMATION INPUT METHOD, AND ELECTRONIC DEVICE |
| US9041660B2 (en) * | 2008-12-09 | 2015-05-26 | Microsoft Technology Licensing, Llc | Soft keyboard control |
- 2012
- 2012-01-05 WO PCT/JP2012/000039 patent/WO2012105158A1/en not_active Ceased
- 2012-01-05 JP JP2012528169A patent/JP5496337B2/en not_active Expired - Fee Related
- 2013
- 2013-07-25 US US13/950,751 patent/US9557815B2/en active Active
Patent Citations (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20020149561A1 (en) | 2000-08-08 | 2002-10-17 | Masaaki Fukumoto | Electronic apparatus vibration generator, vibratory informing method and method for controlling information |
| JP2004110388A (en) | 2002-09-18 | 2004-04-08 | Sharp Corp | Device with touch panel |
| US20050085215A1 (en) * | 2003-10-21 | 2005-04-21 | Nokia Corporation | Method and related apparatus for emergency calling in a touch screen mobile phone from a touch screen and keypad lock active state |
| US7345679B2 (en) | 2003-12-26 | 2008-03-18 | Alpine Electronics, Inc. | Input control apparatus and input accepting method |
| JP2005186847A (en) | 2003-12-26 | 2005-07-14 | Alpine Electronics Inc | Input control device and input accepting method |
| JP2005129072A (en) | 2004-11-05 | 2005-05-19 | Oki Electric Ind Co Ltd | Automatic transaction device and input unit |
| US20070091070A1 (en) * | 2005-10-20 | 2007-04-26 | Microsoft Corporation | Keyboard with integrated key and touchpad |
| US20100167693A1 (en) * | 2006-02-08 | 2010-07-01 | Eiko Yamada | Mobile terminal, mobile terminal control method, mobile terminal control program, and recording medium |
| US20070236474A1 (en) | 2006-04-10 | 2007-10-11 | Immersion Corporation | Touch Panel with a Haptically Generated Reference Key |
| JP2009533762A (en) | 2006-04-10 | 2009-09-17 | イマーション コーポレーション | Touch panel with tactilely generated reference keys |
| JP2008033739A (en) | 2006-07-31 | 2008-02-14 | Sony Corp | Touch screen interaction method and apparatus based on force feedback and pressure measurement |
| US7952566B2 (en) | 2006-07-31 | 2011-05-31 | Sony Corporation | Apparatus and method for touch screen interaction based on tactile feedback and pressure measurement |
| US20090006991A1 (en) * | 2007-06-29 | 2009-01-01 | Nokia Corporation | Unlocking a touch screen device |
| US20100253652A1 (en) | 2009-04-03 | 2010-10-07 | Fuminori Homma | Information processing apparatus, notification method, and program |
| JP2010244253A (en) | 2009-04-03 | 2010-10-28 | Sony Corp | Information processing apparatus, notification method, and program |
| US20110167375A1 (en) * | 2010-01-06 | 2011-07-07 | Kocienda Kenneth L | Apparatus and Method for Conditionally Enabling or Disabling Soft Buttons |
Non-Patent Citations (3)
| Title |
|---|
| Form PCT/ISA/237 for corresponding International Application No. PCT/JP2012/000039 dated Apr. 10, 2012 and partial English translation. |
| International Search Report for corresponding International Application No. PCT/JP2012/000039 mailed Apr. 10, 2012. |
| Notice of Reasons for Rejection for corresponding Japanese Patent Application No. 2012-528169 mailed Aug. 6, 2013. |
Also Published As
| Publication number | Publication date |
|---|---|
| JP5496337B2 (en) | 2014-05-21 |
| JPWO2012105158A1 (en) | 2014-07-03 |
| US20130307804A1 (en) | 2013-11-21 |
| WO2012105158A1 (en) | 2012-08-09 |
Similar Documents
| Publication | Title |
|---|---|
| US9557815B2 (en) | Electronic device |
| JP5654114B2 (en) | Electronic device with touch sensor |
| JP5738413B2 (en) | Electronics |
| US10795492B2 (en) | Input device and method for controlling input device |
| EP2383631A1 (en) | Hand-held mobile device and method for operating the hand-held mobile device |
| JP5718475B2 (en) | Tactile presentation device |
| JP2011048832A (en) | Input device |
| JP2012032891A (en) | Input device |
| JP6246640B2 (en) | Electronics |
| JP2011048696A (en) | Input device |
| JPWO2013046670A1 (en) | Tactile presentation device |
| WO2011077612A1 (en) | Tactile indication device |
| JP5539788B2 (en) | Tactile presentation device |
| JP5497893B2 (en) | Tactile sensation presentation apparatus and control method |
| JP5723832B2 (en) | Input device |
| JP5243379B2 (en) | Input device |
| JP2011048855A (en) | Input apparatus and control method of input apparatus |
| JP5587596B2 (en) | Tactile presentation device |
| JP6120898B2 (en) | Electronic device and control method of electronic device |
| US9110508B2 (en) | Electronic device having a vibrating section for multiple touches |
| JP5292244B2 (en) | Input device |
| JP2011095925A (en) | Input device |
| WO2012108184A1 (en) | Electronic device and computer program |
| JP2011095926A (en) | Input device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: PANASONIC CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ADACHI, YUSUKE;KOGA, AKIRA;INATA, MASAHIRO;AND OTHERS;SIGNING DATES FROM 20130621 TO 20130624;REEL/FRAME:032126/0985 |
|
| AS | Assignment |
Owner name: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD., JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PANASONIC CORPORATION;REEL/FRAME:034194/0143 Effective date: 20141110 Owner name: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LT Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PANASONIC CORPORATION;REEL/FRAME:034194/0143 Effective date: 20141110 |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
| MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
| AS | Assignment |
Owner name: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD., JAPAN Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ERRONEOUSLY FILED APPLICATION NUMBERS 13/384239, 13/498734, 14/116681 AND 14/301144 PREVIOUSLY RECORDED ON REEL 034194 FRAME 0143. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:PANASONIC CORPORATION;REEL/FRAME:056788/0362 Effective date: 20141110 |
|
| AS | Assignment |
Owner name: PANASONIC AUTOMOTIVE SYSTEMS CO., LTD., JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD.;REEL/FRAME:066703/0209 Effective date: 20240207 |
|
| MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |