
WO2015068615A1 - Touch input device (Dispositif d'entrée tactile) - Google Patents

Touch input device

Info

Publication number
WO2015068615A1
WO2015068615A1 (PCT/JP2014/078691)
Authority
WO
WIPO (PCT)
Prior art keywords
touch
unit
input device
push
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2014/078691
Other languages
English (en)
Japanese (ja)
Inventor
北田宏明
加納英和
井上貴文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Murata Manufacturing Co Ltd
Original Assignee
Murata Manufacturing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Murata Manufacturing Co., Ltd.
Priority to JP2015546612A (granted as JP5971430B2)
Publication of WO2015068615A1
Anticipated expiration
Current legal status: Ceased


Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/044 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • G06F 3/0445 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means using two or more layers of sensing electrodes, e.g. using two layers of electrodes separated by a dielectric layer
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/041 Indexing scheme relating to G06F 3/041 - G06F 3/045
    • G06F 2203/04105 Pressure sensors for measuring the pressure or force exerted on the touch surface without providing the touch position
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/041 Indexing scheme relating to G06F 3/041 - G06F 3/045
    • G06F 2203/04106 Multi-sensing digitiser, i.e. digitiser using at least two different sensing technologies simultaneously or alternatively, e.g. for detecting pen and finger, for saving power or for improving position detection
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/044 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • G06F 3/0446 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means using a grid-like structure of electrodes in at least two directions, e.g. using row and column electrodes

Definitions

  • The present invention relates to a touch input device that accepts a touch operation.
  • Multifunctional mobile terminals in widespread use today also function as touch input devices.
  • The user of such a multifunctional mobile terminal rotates, enlarges, or reduces an image by moving a finger on the screen.
  • In some cases, image rotation and enlargement/reduction are performed simultaneously.
  • One example of such a touch input device is the image editing device described in Patent Document 1.
  • In that device, a rotation function for rotating an image and an enlargement/reduction function for enlarging/reducing an image can be switched according to the mode of the touch operation.
  • When the image is rotated and enlarged/reduced simultaneously, both changes can be made in a single operation. However, in this case, when only rotation is intended, a shift in the finger position may unintentionally enlarge or reduce the image together with the rotation. On the other hand, when only image rotation or only enlargement/reduction is performed, rotation or enlargement/reduction can be carried out without erroneous operation, but the two cannot be performed simultaneously.
  • An object of the present invention is to provide a touch-type input device capable of switching, with good operability, between a mode in which only image rotation is performed, a mode in which only enlargement/reduction is performed, and a mode in which rotation and enlargement/reduction are performed simultaneously.
  • The touch-type input device of the present invention includes an operation surface, a push detection unit, a position detection unit, and an operation reception unit.
  • The push detection unit detects a push-in operation on the operation surface.
  • The position detection unit detects a plurality of positions where a touch operation is performed on the operation surface.
  • The operation reception unit accepts at least one of a rotation operation and a pinch operation based on the change in the positions and the push-in operation.
  • With this configuration, the mode in which only image rotation is performed, the mode in which only enlargement/reduction is performed, and the mode in which rotation and enlargement/reduction are performed simultaneously can be switched based on the change in the positions and the push-in amount. When switching between modes, it is not necessary to perform a separate switching operation before performing a rotation operation or a pinch operation, and the operations assigned to each mode are not complicated. For this reason, the modes can be switched with good operability.
  • The touch input device of the present invention is preferably configured as follows.
  • The operation reception unit accepts either the rotation operation or the pinch operation when a push-in operation is detected and the positions change.
  • The operation reception unit accepts both the rotation operation and the pinch operation when the push-in operation is not detected and the positions change.
  • The touch input device of the present invention is also preferably configured as follows.
  • The operation reception unit calculates a change amount in the distance direction and a change amount in the rotation direction of the positions when the push-in operation is detected and the positions change.
  • The operation reception unit accepts the rotation operation when the change amount in the rotation direction is larger than the change amount in the distance direction, and accepts the pinch operation when the change amount in the distance direction is larger than the change amount in the rotation direction.
  • The push detection unit detects a push-in amount when the operation surface is pushed in.
  • The push-in operation can be detected by detecting the push-in amount.
  • The touch input device of the present invention preferably includes an image display unit that displays an image on a screen, and the screen is preferably the operation surface.
  • The push detection unit preferably has a piezoelectric film formed from a chiral polymer.
  • The chiral polymer is preferably polylactic acid.
  • The polylactic acid is preferably L-type polylactic acid.
  • The body temperature of the user's finger may be transmitted to the piezoelectric film and affect detection by the piezoelectric film.
  • Since polylactic acid has no pyroelectricity, however, the push-in amount can be detected accurately by the piezoelectric film.
  • According to the present invention, a mode in which only image rotation is performed, a mode in which only enlargement/reduction is performed, and a mode in which rotation and enlargement/reduction are performed simultaneously can be switched with good operability.
  • FIG. 1 is an external perspective view of a touch input device according to an embodiment.
  • FIG. 2 is a cross-sectional view of the touch input device according to the present embodiment taken along line A-A. FIGS. 3(A), 3(B), 4(A), and 4(B) are diagrams showing operation examples of the touch input device according to the present embodiment. FIG. 5 is a block diagram showing the configuration of the touch input device according to the present embodiment. FIG. 6 is a flowchart showing the processing performed by the motion calculation unit according to the present embodiment.
  • FIG. 1 is an external perspective view of the touch input device 10.
  • the touch input device 10 includes a touch panel 11 and a substantially rectangular parallelepiped casing 50.
  • the touch panel 11 has a substantially rectangular screen (operation surface) S1.
  • The screen S1 forms a part of the main surface (front surface) of the touch input device 10.
  • the screen S1 is a display unit that displays an image and an input unit that receives a user operation.
  • the touch panel 11 receives a user operation by detecting a touch position and a pressing amount on the screen S1.
  • An operation of touching the operation surface with a finger, a dedicated pen, or the like, and an operation of moving the finger or pen while keeping it in contact with the operation surface, are referred to as a touch operation, and the position touched by the touch operation is referred to as a touch position.
  • An amount indicating how far the operation surface is pushed in with a finger, a dedicated pen, or the like is referred to as a push-in amount.
  • FIG. 2 is a cross-sectional view of the touch input device 10 taken along the line AA.
  • In the following description, the width direction (lateral direction) of the touch input device 10 is referred to as the X direction, the length direction (vertical direction) as the Y direction, and the thickness direction as the Z direction.
  • In this example, the length of the touch input device 10 in the X direction is shorter than its length in the Y direction.
  • the touch input device 10 includes a push sensor unit 20, a position sensor unit 30, a display panel unit 40, and an arithmetic circuit module 60.
  • The push sensor unit 20, the position sensor unit 30, and the display panel unit 40 have a flat plate shape, and are arranged in the housing 50 so that their main surfaces are parallel to the screen S1.
  • the arithmetic circuit module 60 has a substantially flat plate shape and is disposed in the housing 50.
  • a protective cover 51 is provided on the main surface (front surface) of the touch input device 10.
  • the screen S1 corresponds to one main surface of the protective cover 51.
  • the display panel unit 40 includes a flat liquid crystal panel 401, a front polarizing plate 402, a back polarizing plate 403, and a backlight 404.
  • When a drive voltage is applied from the outside, the alignment state of the liquid crystal changes so as to form a predetermined image pattern.
  • the front polarizing plate 402 and the back polarizing plate 403 are disposed so as to sandwich the liquid crystal panel 401.
  • the backlight 404 is disposed on the opposite side of the liquid crystal panel 401 with the back polarizing plate 403 interposed therebetween.
  • A push sensor unit 20 and a position sensor unit 30 are provided between the front polarizing plate 402 and the protective cover 51.
  • The position sensor unit 30 is located on the protective cover 51 side, and the push sensor unit 20 is located on the front polarizing plate 402 side.
  • The push sensor unit 20, the position sensor unit 30, and the display panel unit 40 are bonded together by an adhesive 501.
  • the position sensor unit 30 includes a flat insulating substrate 301.
  • the insulating substrate 301 is made of a material that has translucency and does not have birefringence.
  • a plurality of electrodes 302 are formed on one plane of the insulating substrate 301.
  • a plurality of electrodes 303 are formed on the other plane.
  • The plurality of electrodes 302 are long and narrow in the Y direction, and are arranged at intervals along the X direction.
  • The plurality of electrodes 303 are long and narrow in the X direction, and are arranged at intervals along the Y direction. That is, the plurality of electrodes 302 and 303 are disposed so as to intersect at approximately 90° when viewed from the Z direction.
  • the plurality of electrodes 302 and 303 are made of a light-transmitting material.
  • Each of the electrodes 302 and 303 of the position sensor unit 30 forms a pair, and detects a change in capacitance when the user's finger approaches the screen S1.
  • The pair of electrodes 302 and 303 that has detected a change in capacitance uses one of the two electrodes as a reference potential and outputs a capacitance detection signal corresponding to the change in capacitance.
  • the output electrostatic capacitance detection signal is input to the arithmetic circuit module 60 via a wiring (not shown).
  • the touch position can be detected from the combination of the electrodes 302 and 303 that have detected the capacitance change.
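  • As a rough illustration of how such a lookup could work, the sketch below (Python, not part of this disclosure) assumes the capacitance changes are available as a 2-D array indexed by the crossing electrodes; the threshold value, the electrode pitch, and the function name are illustrative assumptions.

```python
import numpy as np

def detect_touch_positions(cap_delta, threshold=5.0, pitch_mm=4.0):
    """Estimate touch positions from a grid of capacitance changes.

    cap_delta[i, j] is the capacitance change at the crossing of electrode i
    (electrodes 302, long in the Y direction and arranged along X) and
    electrode j (electrodes 303, long in the X direction and arranged along Y).
    The threshold and electrode pitch are assumed values used only here.
    """
    positions = []
    for i, j in zip(*np.nonzero(cap_delta > threshold)):
        # Keep only local maxima so that one finger yields one position.
        window = cap_delta[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
        if cap_delta[i, j] == window.max():
            positions.append((i * pitch_mm, j * pitch_mm))
    return positions
```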
  • the indentation sensor unit 20 includes a flat film-like piezoelectric film 201. Electrodes 202 and 203 are formed on both flat plate surfaces of the piezoelectric film 201. The electrodes 202 and 203 are formed on substantially the entire flat plate surface of the piezoelectric film 201.
  • the piezoelectric film 201 is a film formed from a chiral polymer.
  • As the chiral polymer, polylactic acid (PLA), in particular L-type polylactic acid (PLLA), is used.
  • PLLA is uniaxially stretched. Since the chiral polymer has higher transparency than PVDF, the image displayed on the display panel unit 40 is easily visible by forming the piezoelectric film 201 from the chiral polymer.
  • PLLA, which is a chiral polymer, has a helical structure in its main chain.
  • PLLA has piezoelectricity when uniaxially stretched and molecules are oriented.
  • the uniaxially stretched PLLA generates electric charges when the flat plate surface of the piezoelectric film 201 is pressed.
  • the amount of electric charge generated depends on the amount of displacement when the flat plate surface is displaced in the direction orthogonal to the flat plate surface by pressing. That is, the amount of generated charge depends on the amount of pushing.
  • Since PLLA develops piezoelectricity through molecular orientation treatment such as stretching, it does not require poling treatment, unlike other polymers such as PVDF and unlike piezoelectric ceramics. That is, the piezoelectricity of PLLA, which is not a ferroelectric, is not expressed by the polarization of ions as in ferroelectrics such as PVDF and PZT, but derives from the helical structure characteristic of its molecules. For this reason, the pyroelectricity that occurs in other, ferroelectric piezoelectric materials does not occur in PLLA.
  • For the electrodes 202 and 203, it is preferable to use either inorganic electrodes such as ITO, ZnO, silver nanowires, carbon nanotubes, or graphene, or organic electrodes mainly composed of polythiophene, polyaniline, or the like. By using these materials, a highly translucent conductor pattern can be formed.
  • Through the electrodes 202 and 203, the electric charge generated by the piezoelectric film 201 can be acquired as a potential difference, and a push-amount detection signal having a voltage value corresponding to the push-in amount can be output to the outside.
  • the push-in amount detection signal is output to the arithmetic circuit module 60 via a wiring (not shown).
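  • The text states only that the detection-signal voltage corresponds to the push-in amount; as a minimal sketch, a linear calibration can be assumed (the gain value below is hypothetical).

```python
def push_amount_from_signal(voltage_v, volts_per_mm=2.5):
    """Estimate the push-in amount (in mm) from the push-amount detection
    signal.  The gain volts_per_mm is an assumed calibration constant; the
    text only says the voltage value corresponds to the push-in amount."""
    return voltage_v / volts_per_mm
```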
  • The arithmetic circuit module 60 is arranged on the back side of the display panel unit 40. Specifically, a mounting board (not shown) is disposed in the space on the back side of the display panel unit 40 in the housing 50, and the arithmetic circuit module 60 is mounted on this mounting board.
  • FIG. 3A is a diagram showing a rotation operation.
  • In the rotation operation, first, the screen S1 is touched with the thumb and forefinger. Next, the index finger is moved in the circumferential (rotation) direction on the screen S1 with the thumb as an axis. That is, in the rotation operation, the second touch position moves in the circumferential direction around the first touch position.
  • the image can be rotated by a rotation operation.
  • FIG. 3B shows a pinch operation.
  • In the pinch operation, first, the screen S1 is touched with the thumb and forefinger. Next, these fingers are moved on the screen S1 so as to change the distance between the thumb and the index finger. That is, in the pinch operation, the distance between the first touch position and the second touch position changes.
  • An image can be enlarged or reduced by a pinch operation.
  • FIG. 4A is a diagram showing a composite operation.
  • In the composite operation, the rotation operation and the pinch operation are performed simultaneously. Specifically, first, the screen S1 is touched with the thumb and forefinger. Next, with the thumb as an axis, the index finger is moved in the circumferential direction and, at the same time, in the radial (distance) direction. That is, in the composite operation, the second touch position moves both in the circumferential direction and in the radial direction around the first touch position. In a predetermined mode, image rotation and enlargement/reduction can be performed simultaneously by a composite operation. Details will be described later.
  • FIG. 4(B) is a diagram showing a push-in operation. In the push-in operation, the screen is pressed with at least one of the fingers touching it. That is, at least one of the touch positions is pushed in.
  • The display change mode can be switched by the push-in operation. Details will be described later.
  • FIG. 5 is a block diagram showing the configuration of the touch input device 10.
  • the touch input device 10 includes the push sensor unit 20, the position sensor unit 30, the display panel unit 40, and the arithmetic circuit module 60 as described above.
  • the push sensor unit 20 outputs a push amount detection signal as described above.
  • the position sensor unit 30 outputs a capacitance detection signal as described above.
  • the display panel unit 40 displays an image on the screen S1 (see FIG. 1).
  • the arithmetic circuit module 60 controls the push sensor unit 20, the position sensor unit 30, and the display panel unit 40, and changes the display of an image based on a user operation.
  • the arithmetic circuit module 60 includes an indentation processing unit 61, a position processing unit 62, an operation calculation unit 63, and a display processing unit 64. Note that physically, the arithmetic circuit module 60 includes a CPU, a RAM, a ROM, a bus connecting them, and the like.
  • the push-in processing unit 61 controls the push-in sensor unit 20 and detects the presence / absence of a push-in operation. Specifically, the following processing is performed.
  • a push amount detection signal is acquired from the push sensor unit 20. When the push amount detection signal is equal to or greater than the threshold value, it is determined that the push operation has been performed. When the pressing amount detection signal is less than the threshold value, it is determined that the pressing operation has not been performed.
  • the presence / absence of a push operation is transmitted to the motion calculation unit 63.
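  • A minimal sketch of this threshold decision follows; the threshold value and the function name are assumptions made only for illustration.

```python
PUSH_THRESHOLD_V = 0.8  # assumed value; no concrete threshold is given in the text

def push_operation_detected(push_signal_v, threshold_v=PUSH_THRESHOLD_V):
    """Decision made by the push-in processing unit 61 as described above:
    the push-in operation is judged to have been performed when the
    push-amount detection signal is at or above the threshold."""
    return push_signal_v >= threshold_v
```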
  • the indentation sensor unit 20 and the indentation processing unit 61 constitute an indentation detection unit of the present invention.
  • the position processing unit 62 controls the position sensor unit 30 to detect a touch position. Specifically, the following processing is performed. A capacitance detection signal is acquired from the position sensor unit 30. The touch position is calculated from the capacitance detection signal. The touch position is transmitted to the motion calculation unit 63 in response to a request from the motion calculation unit 63.
  • the position sensor unit 30 and the position processing unit 62 constitute a position detection unit of the present invention.
  • The motion calculation unit 63 determines whether at least one of the rotation operation and the pinch operation has been performed based on the push-in operation and the change in the touch positions. Then, the motion calculation unit 63 instructs the display processing unit 64 to change the image display based on the determination. That is, the motion calculation unit 63 accepts at least one of a rotation operation and a pinch operation based on a push-in operation and a change in touch position.
  • the motion calculation unit 63 corresponds to the operation reception unit of the present invention. Specifically, the motion calculation unit 63 performs, for example, the process illustrated in FIG. FIG. 6 is a flowchart illustrating processing performed by the motion calculation unit 63.
  • First, the touch positions are acquired from the position processing unit 62 (S101).
  • The acquired touch positions are stored (S103).
  • After a unit time, the touch positions are acquired again (S101), and processing S105 is then performed.
  • In this way, the two touch positions A0 and B0 before the unit time and the current two touch positions A1 and B1 are obtained.
  • That is, the touch position A0 moves to the touch position A1 after the unit time, and the touch position B0 moves to the touch position B1.
  • In process S105, the presence or absence of a push-in operation is acquired from the push-in processing unit 61.
  • When the push-in operation is detected (S106: Yes), process S107 is performed; when the push-in operation is not detected (S106: No), process S113 is performed.
  • In processes S107 to S112, as will be described later, one of the rotation operation and the pinch operation is accepted.
  • In processes S113 and S114, as will be described later, both the rotation operation and the pinch operation are accepted.
  • Here, the movement amount η_r of the rotation operation and the movement amount η_p of the pinch operation will be described.
  • The movement amount η_r is related to the distance moved when one touch position moves in the circumferential direction around the other touch position.
  • The movement amount η_p is related to the amount of change in the distance between the touch positions.
  • How the movement amounts η_r and η_p are defined is set arbitrarily according to the operations required of the touch input device. In accordance with the movement amount η_r, the angle θ and the reference point used when rotating the image are set. In accordance with the movement amount η_p, the magnification α and the reference point used when enlarging or reducing the image are set.
  • The movement amount η_r corresponds to the "change amount in the rotation direction" of the present invention, and the movement amount η_p corresponds to the "change amount in the distance direction" of the present invention.
  • For example, the movement amounts η_r and η_p may be set as follows.
  • The difference between the distance r_A0B0 and the distance r_A1B1 is defined as the movement amount η_p.
  • Here, the distance r_ij represents the distance between touch position i and touch position j, as shown in FIG. 7.
  • FIG. 7 is a diagram illustrating the distance between the touch positions.
  • The ratio between the distance r_A0B0 and the distance r_A1B1 is set as the enlargement/reduction magnification α.
  • The touch position A0 or B0 is set as the reference point.
  • After the image is rotated or enlarged/reduced about the reference point, the image is translated by the distance r_B1B0 in the direction from the touch position B0 toward the touch position B1.
  • For example, when the touch position A0 is selected as the reference point and r_A1A0 ≈ 0, the image can follow the movement of the touch positions.
  • The movement amounts η_r and η_p are calculated based on the touch positions A0, B0, A1, and B1.
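  • A sketch of this calculation is given below. The formulas for η_p and α follow the definitions above (difference and ratio of r_A0B0 and r_A1B1); the formulas used for η_r and θ, namely the change in direction of the segment A-B and the corresponding arc length, are illustrative assumptions, since the text only states that η_r relates to circumferential movement.

```python
import math

def movement_amounts(a0, b0, a1, b1):
    """Compute eta_p, eta_r, the magnification alpha and the angle theta from
    the two touch positions before (a0, b0) and after (a1, b1) a unit time.
    eta_p and alpha follow the text; eta_r and theta are assumed formulas."""
    r0 = math.dist(a0, b0)          # r_A0B0
    r1 = math.dist(a1, b1)          # r_A1B1

    eta_p = abs(r1 - r0)            # change in finger spacing
    alpha = r1 / r0 if r0 > 0 else 1.0

    ang0 = math.atan2(b0[1] - a0[1], b0[0] - a0[0])
    ang1 = math.atan2(b1[1] - a1[1], b1[0] - a1[0])
    # Wrap the angle difference into (-pi, pi].
    theta = math.atan2(math.sin(ang1 - ang0), math.cos(ang1 - ang0))
    eta_r = abs(theta) * r0         # circumferential travel of B around A

    return eta_p, eta_r, alpha, theta
```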
  • When the push-in operation is detected and the movement amount η_p is larger than the movement amount η_r (S107: Yes), the magnification α and the reference point are calculated (S109).
  • The motion calculation unit 63 then instructs the display processing unit 64 to enlarge or reduce the image, and sends the magnification α and the reference point to the display processing unit 64 (S110). That is, when a push-in operation is detected and the movement amount η_p is larger than the movement amount η_r, only the pinch operation is accepted.
  • When the movement amount η_r is larger than the movement amount η_p (S107: No), the angle θ and the reference point are calculated, and the motion calculation unit 63 instructs the display processing unit 64 to rotate the image, sending the angle θ and the reference point to the display processing unit 64 (S112). That is, when a push-in operation is detected and the movement amount η_r is larger than the movement amount η_p, only the rotation operation is accepted.
  • When neither condition holds, the motion calculation unit 63 determines that neither the pinch operation nor the rotation operation has been performed, and ends the process.
  • When the push-in operation is not detected (S106: No), the magnification α, the angle θ, and the reference point are calculated (S113).
  • The motion calculation unit 63 then instructs the display processing unit 64 to enlarge/reduce and rotate the image, and sends the magnification α, the angle θ, and the reference point to the display processing unit 64 (S114). That is, when the push-in operation is not detected, both the rotation operation and the pinch operation are accepted.
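  • Putting steps S105 to S114 together, the mode decision could be sketched as follows; the returned dictionary is an assumed interface to the display processing unit, not something defined in this description.

```python
def accept_operation(push_detected, eta_p, eta_r, alpha, theta, reference_point):
    """Select the accepted operation(s) as in S106-S114: with a push-in only
    the dominant operation is accepted, without a push-in both are."""
    if push_detected:
        if eta_p > eta_r:    # S107: Yes -> pinch operation only (S109, S110)
            return {"scale": alpha, "rotate": None, "ref": reference_point}
        if eta_r > eta_p:    # S107: No  -> rotation operation only (S112)
            return {"scale": None, "rotate": theta, "ref": reference_point}
        return None          # neither operation is accepted
    # S106: No -> composite: rotate and enlarge/reduce simultaneously (S113, S114)
    return {"scale": alpha, "rotate": theta, "ref": reference_point}
```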
  • the motion calculation unit 63 starts the processing S101 again after a unit time except when an interrupt is received. Thereby, the display of an image can be continuously changed according to a user's operation.
  • the display processing unit 64 controls the display panel unit 40 and displays an image on the display panel unit 40, that is, the screen S1. Further, the display processing unit 64 changes the display of the image based on an instruction from the motion calculation unit 63. When the display processing unit 64 is instructed to rotate the image, the display processing unit 64 displays the image rotated by the angle ⁇ about the reference point on the display panel unit 40. When the display processing unit 64 is instructed to enlarge or reduce the image, the display processing unit 64 displays the image enlarged or reduced by the magnification ⁇ around the reference point on the display panel unit 40.
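  • For completeness, the geometric transformation described for the display processing unit 64 (rotation by the angle θ and enlargement/reduction by the magnification α about the reference point) can be sketched for a single pixel coordinate as below; the actual rendering path is not specified in this description.

```python
import math

def transform_point(p, ref, alpha=1.0, theta=0.0):
    """Rotate a point p by theta and scale it by alpha about the reference
    point ref, i.e. the per-coordinate form of the image transformation."""
    x, y = p[0] - ref[0], p[1] - ref[1]
    c, s = math.cos(theta), math.sin(theta)
    xr, yr = x * c - y * s, x * s + y * c              # rotation by theta
    return (ref[0] + alpha * xr, ref[1] + alpha * yr)  # scaling by alpha
```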
  • the display panel unit 40 and the display processing unit 64 correspond to the image display unit of the present invention.
  • Assume, for example, that the user performs a rotation operation with two fingers while pressing the screen S1 (see FIG. 1) with at least one of the two fingers.
  • At this time, the interval between the two fingers may change slightly because the axis finger deviates from the center of the arc.
  • In this case, only the process of rotating the image is performed, and the rotated image is displayed on the screen S1.
  • Next, assume that the user performs a pinch operation with two fingers while pressing the screen S1 with at least one of the two fingers.
  • At this time, one finger may move slightly in the circumferential direction around the other finger as an axis. In this case, only the process of enlarging or reducing the image is performed, and the enlarged or reduced image is displayed on the screen S1.
  • When the user performs a composite operation without pressing the screen S1, both rotation and enlargement/reduction of the image are performed, and the rotated and enlarged/reduced image is displayed on the screen S1.
  • When the user performs a composite operation while pressing the screen S1, only one of the image rotation and enlargement/reduction processes is performed, according to the movement amounts as described above.
  • Each mode is assigned a combination of push-in operation and touch operation. Each mode is switched by performing an operation assigned to each mode. For this reason, when switching the mode, it is not necessary to perform a switching operation for switching the mode before performing the rotate operation or the pinch operation. Further, since it is not necessary to assign a touch operation different from the rotate operation and the pinch operation to each mode, the operation assigned to each mode does not become complicated.
  • In the present embodiment, the touch input device includes a touch panel as an input/output unit, but the present invention is not limited to this.
  • The touch-type input device of the present invention may have a configuration in which the input unit and the output unit are separate, such as a display and a touchpad.
  • In the present embodiment, when the push-in operation is detected, a mode in which only one of image rotation and enlargement/reduction is performed is selected, but the present invention is not limited to this.
  • Conversely, when the push-in operation is detected, a mode in which rotation and enlargement/reduction are performed simultaneously may be selected, and when the push-in operation is not detected, a mode in which only one of rotation and enlargement/reduction is performed may be selected.
  • DESCRIPTION OF REFERENCE SYMBOLS: S1 ... screen (operation surface); 10 ... touch-type input device; 11 ... touch panel; 20 ... push sensor unit (push detection unit); 30 ... position sensor unit (position detection unit); 40 ... display panel unit; 64 ... display processing unit (image display unit); 201 ... piezoelectric film; 202, 203, 302, 303 ... electrodes; 301 ... insulating substrate; 401 ... liquid crystal panel; 402 ... front polarizing plate; 403 ... back polarizing plate; 404 ... backlight; 501 ... adhesive

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The touch input device (10) of the present invention comprises a push sensor unit (20), a position sensor unit (30), a display panel unit (40), and an arithmetic circuit module (60). The arithmetic circuit module (60) comprises a push processing unit (61), a position processing unit (62), a motion calculation unit (63), and a display processing unit (64). The push sensor unit (20) and the push processing unit (61) detect push-in operations performed on an operation surface. The position sensor unit (30) and the position processing unit (62) detect a plurality of positions on the operation surface at which touch operations have been performed. The motion calculation unit (63) accepts a rotation operation and/or a pinch operation on the basis of the aforementioned push-in operations and the changes in the aforementioned positions.
PCT/JP2014/078691 2013-11-05 2014-10-29 Touch input device (Ceased) WO2015068615A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015546612A JP5971430B2 (ja) 2013-11-05 2014-10-29 Touch input device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013228986 2013-11-05
JP2013-228986 2013-11-05

Publications (1)

Publication Number Publication Date
WO2015068615A1 (fr) 2015-05-14

Family

ID=53041397

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/078691 Ceased WO2015068615A1 (fr) 2013-11-05 2014-10-29 Touch input device

Country Status (2)

Country Link
JP (1) JP5971430B2 (fr)
WO (1) WO2015068615A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2017026364A1 (ja) * 2015-08-07 2018-07-12 Murata Manufacturing Co., Ltd. Display device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001290585A (ja) * 2000-01-31 2001-10-19 Canon Inc 位置情報処理装置及びその方法及びそのプログラム、操作装置及びその方法及びそのプログラム
JP2006034754A (ja) * 2004-07-29 2006-02-09 Nintendo Co Ltd タッチパネルを用いたゲーム装置およびゲームプログラム
JP2011134316A (ja) * 2009-11-26 2011-07-07 Asahi Kasei Electronics Co Ltd タッチパネル装置及びタッチパネルのタッチ入力点間距離検出方法
JP2012043266A (ja) * 2010-08-20 2012-03-01 Sony Corp 情報処理装置、プログラム及び表示制御方法
JP2012511191A (ja) * 2008-10-28 2012-05-17 サーク・コーポレーション 多重接触エリア回転ジェスチャ認識方法
JP2012517061A (ja) * 2009-02-04 2012-07-26 キーレス システムズ リミテッド データ入力システム
WO2013021835A1 (fr) * 2011-08-11 2013-02-14 株式会社村田製作所 Panneau tactile


Also Published As

Publication number Publication date
JP5971430B2 (ja) 2016-08-17
JPWO2015068615A1 (ja) 2017-03-09

Similar Documents

Publication Publication Date Title
JP6065950B2 (ja) Touch sensor
WO2015046289A1 (fr) Touch input device
WO2015068709A1 (fr) Display device and program
US10007386B2 Input device and program
JP6292344B2 (ja) Touch input device
CN104919406B (zh) Touch input device
JP6237890B2 (ja) Display device and program
US11307702B2 Operation detection device and display device
JP6037046B2 (ja) Touch input device and portable display device
JP6015866B2 (ja) Display device for portable terminal
JP5971430B2 (ja) Touch input device
JP5954502B2 (ja) Input device and program
WO2015064488A1 (fr) Touch input device
JP5983892B2 (ja) Display device
JP6540905B2 (ja) Press detection device
JP6079895B2 (ja) Touch input device
WO2015068619A1 (fr) Display device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14860974

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2015546612

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14860974

Country of ref document: EP

Kind code of ref document: A1