WO2009035184A1 - Method for implementing a touch pad using a tactile sensor - Google Patents
Method for implementing a touch pad using a tactile sensor
- Publication number
- WO2009035184A1 WO2009035184A1 PCT/KR2007/005672 KR2007005672W WO2009035184A1 WO 2009035184 A1 WO2009035184 A1 WO 2009035184A1 KR 2007005672 W KR2007005672 W KR 2007005672W WO 2009035184 A1 WO2009035184 A1 WO 2009035184A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- tactile sensor
- center point
- touching
- force
- mouse cursor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0414—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0414—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
- G06F3/04144—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position using an array of force sensing means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
Definitions
- The present invention relates to a method for implementing a touch pad that uses a tactile sensor including multiple force sensors to provide a mouse function, which allows free X- and Y-direction movements and rotations of a cursor on a screen, and a character input and font face change function, so that the touch pad can be used as an interface device for increasingly slim mobile appliances.
- Computer systems employ various types of input units to perform input operations. These operations generally include moving a cursor and making selections on a display screen, and also provide functions such as page turning, scrolling, panning, and zooming.
- Well-known input units include a button, a switch, a mouse, a trackball, etc.
- Buttons and switches are generally mechanical, so they offer only limited control for moving the cursor or making selections.
- A button or switch typically only moves the cursor in a specific direction using a key such as an arrow key, or makes a specific selection using a key such as an enter, delete, or number key.
- With a mouse, the input pointer is moved according to the relative movement of the mouse.
- With a trackball, the input pointer is moved according to the relative movement of the trackball.
- Text messages that users of mobile appliances send using keypads generally contain letters in a uniform font face that does not convey the users' emotion.
- What is needed, therefore, is a touch pad that has both a mouse function, which allows users to freely move the cursor through force-based detection of touch states, and an input device function, which allows users to input letters and to change the font face.
- The present invention has been made in view of the above problems, and it is an object of the present invention to provide a method for implementing a touch pad that uses a tactile sensor including multiple force sensors to provide a mouse function, which allows free X- and Y-direction movements and rotations of a cursor on a screen, and a character input and font face change function, so that the touch pad can be used as an interface device for increasingly slim mobile appliances.
- The above and other objects can be accomplished by the provision of a method for implementing a touch pad algorithm for mobile appliances that processes a touch input using a tactile sensor including multiple force sensors, wherein the touch pad algorithm performs a mouse function, which calculates the center point and magnitude of the touching force on the tactile sensor and uses a moving distance and direction of the mouse cursor obtained from the calculated center point, and a character input function, which uses touching of the tactile sensor.
- The implementation of the algorithm for the moving distance and direction of a mouse cursor using the center point of force in the method for implementing the touch pad using the tactile sensor includes the steps of: calculating a center point (X_C^i, Y_C^i) of the touched area when touching of the tactile sensor is detected after a specific time (i) elapses; terminating the algorithm when touching of the tactile sensor is not detected after a specific time (i+1) elapses, and calculating a center point (X_C^(i+1), Y_C^(i+1)) of the touched area when touching is detected after the specific time (i+1) elapses; and calculating the moving distance and direction of the mouse cursor using the calculated center points (X_C^i, Y_C^i) and (X_C^(i+1), Y_C^(i+1)) of the touched areas.
- The moving distance and direction of the mouse cursor may be based on the displacement between the two calculated center points.
- In another aspect, the implementation of the algorithm for the moving distance and direction of a mouse cursor using the center point of force in the method for implementing the touch pad using the tactile sensor includes the steps of: calculating a center point (X_C^i, Y_C^i) of the touched area of the tactile sensor after a specific time (i) elapses; calculating a center point (x_i, y_i) of the touched area changed based on momentum, in consideration of the calculated center point of the touched area and the distribution of the force; terminating the algorithm when touching of the tactile sensor is not detected after a specific time (i+1) elapses, and calculating a center point (X_C^(i+1), Y_C^(i+1)) of the touched area of the tactile sensor when touching is detected after the specific time (i+1) elapses; calculating a center point (x_(i+1), y_(i+1)) of the touched area changed based on momentum, in consideration of the calculated center point and the distribution of the force; and calculating the moving distance and direction of the mouse cursor using the momentum-based center points.
- Touching can be detected as a click if the detected force magnitude corresponds to an impulse signal or if the Z-axis magnitude detected by each sensor is equal to or greater than a reference level.
- A click detection region and a scroll region are set in the tactile sensor such that, when touching of the click detection region is detected, a corresponding file is opened or closed and, when touching of the scroll region is detected, it is determined whether or not subsequent touching has occurred and a scroll function is then performed according to the determination.
- The character input function using touching of the tactile sensor provides a function to change the thickness or color of a character according to the magnitude of the force pressing the tactile sensor and to change the font face according to the speed of movement in contact with the tactile sensor.
- FIG. 1 illustrates the concept of a touch pad using a tactile sensor according to the invention when it is mounted on a mobile device to allow the user to freely move a cursor as if using a conventional PC mouse;
- FIG. 2 is a flow chart illustrating sequential processes of a method for implementing a touch pad using a tactile sensor according to an aspect of the invention;
- FIGS. 4 and 5 are graphs showing the relation of a cursor moving distance to the magnitude of force in a method for implementing a touch pad using a tactile sensor according to an aspect of the invention;
- FIG. 6 is a flow chart illustrating sequential processes of a method for implementing a touch pad using a tactile sensor according to another aspect of the invention;
- FIGS. 7 and 8 illustrate the concept of a modification of the method for implementing a touch pad using a tactile sensor according to another aspect of the invention;
- FIGS. 9 and 10 are graphs showing the relation of a cursor moving distance to the magnitude of force in a method for implementing a touch pad using a tactile sensor according to an aspect of the invention;
- FIGS. 11 and 12 illustrate the concept of modifications of the method for implementing a touch pad using a tactile sensor according to various aspects of the invention;
- FIG. 13 illustrates an example application of the method for implementing a touch pad using a tactile sensor according to the invention; and
- FIG. 14 illustrates a photograph of a manufactured touch pad applied to a method for implementing a touch pad algorithm using a tactile sensor according to the invention.
- The method for implementing a touch pad using a tactile sensor provides an algorithm that processes a touch input using a tactile sensor including multiple force sensors. The method calculates the center point and magnitude of the touching force on the tactile sensor and implements an algorithm for the moving distance and direction of a mouse cursor using the calculated center point in order to provide a mouse function, and implements a touch input information algorithm that provides a character input function through the touching trace and force on the tactile sensor.
- FIG. 1 illustrates the concept of a touch pad using a tactile sensor according to the invention when it is mounted on a mobile device to allow the user to freely move a cursor as if using a conventional PC mouse.
- The tactile sensor-based touch pad according to the invention can perform character recognition and transmission of a font face containing the user's emotion using the touching trace and force.
- FIG. 2 is a flow chart illustrating sequential processes of a method for implementing a touch pad using a tactile sensor according to an aspect of the invention.
- The implementation of the algorithm for the moving distance and direction of a mouse cursor using the center point of force in the method for implementing the touch pad using the tactile sensor includes the steps of: calculating a center point (X_C^i, Y_C^i) of the touched area when touching of the tactile sensor is detected after a specific time (i) elapses; terminating the algorithm when touching of the tactile sensor is not detected after a specific time (i+1) elapses, and calculating a center point (X_C^(i+1), Y_C^(i+1)) of the touched area when touching is detected after the specific time (i+1) elapses; and calculating the moving distance and direction of the mouse cursor using the calculated center points (X_C^i, Y_C^i) and (X_C^(i+1), Y_C^(i+1)) of the touched areas.
- The moving distance and direction of the mouse cursor are obtained from the calculated center points (X_C^i, Y_C^i) and (X_C^(i+1), Y_C^(i+1)) of the touched areas, and may be based on the displacement between these two center points.
- The moving distance of the mouse cursor on the screen can be determined from the relation of the moving distance of the cursor to the magnitude of the force, as shown in FIG. 4.
- As shown in FIG. 5, it is possible not only to move the mouse cursor quickly on the screen but also to move it finely, thereby achieving smooth movement of the mouse cursor.
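- The following is a minimal illustrative sketch, not the patent's exact formulas, of how such a center point and cursor update could be computed, assuming the tactile sensor is read as a 2-D array of force values; the function names, the gain parameter, and the use of a Euclidean distance with a displacement angle are assumptions made for illustration.

```python
import math

def center_of_force(force):
    """Force-weighted center point (X_C, Y_C) of the touched area.
    `force` is a 2-D grid of readings from the force sensor array."""
    total = sum(sum(row) for row in force)
    if total == 0:
        return None  # no touch detected at this sampling instant
    xc = sum(f * x for row in force for x, f in enumerate(row)) / total
    yc = sum(f * y for y, row in enumerate(force) for f in row) / total
    return xc, yc

def cursor_update(prev_center, curr_center, gain=1.0):
    """Moving distance and direction of the mouse cursor between the
    sampling instants i and i+1, taken from the two center points."""
    dx = curr_center[0] - prev_center[0]
    dy = curr_center[1] - prev_center[1]
    distance = math.hypot(dx, dy) * gain   # gain is an assumed scale factor
    direction = math.atan2(dy, dx)         # angle of the displacement vector
    return distance, direction
```

- A larger gain (or a gain that grows with the total force, in the spirit of the relations plotted in FIGS. 4 and 5) would make the cursor move farther for the same finger displacement.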
- FIG. 6 is a flow chart illustrating sequential processes of a method for implementing a touch pad using a tactile sensor according to another aspect of the invention.
- The implementation of the algorithm for the moving distance and direction of a mouse cursor using the center point of force in the method for implementing the touch pad using the tactile sensor according to this aspect of the invention includes the steps of: calculating a center point (X_C^i, Y_C^i) of the touched area of the tactile sensor after a specific time (i) elapses; calculating a center point (x_i, y_i) of the touched area changed based on momentum, in consideration of the calculated center point of the touched area and the distribution of the force; terminating the algorithm when touching of the tactile sensor is not detected after a specific time (i+1) elapses, and calculating a center point (X_C^(i+1), Y_C^(i+1)) of the touched area of the tactile sensor when touching is detected after the specific time (i+1) elapses; calculating a center point (x_(i+1), y_(i+1)) of the touched area changed based on momentum, in consideration of the calculated center point and the distribution of the force; and calculating the moving distance and direction of the mouse cursor using the momentum-based center points (x_i, y_i) and (x_(i+1), y_(i+1)).
- As illustrated in FIGS. 7 and 8, the touch center point is first calculated, and the momentum-based center point is then obtained in consideration of the distribution of force over the touched area.
- The moving distance of the mouse cursor can be obtained from the relation of the moving distance of the cursor to the magnitude of each coordinate in consideration of momentum, as shown in FIG. 9.
- Using the relation of the moving speed of the cursor, in units of pixels, to the magnitude of the force at the coordinates in consideration of momentum, as shown in FIG. 10, it is possible not only to move the mouse cursor quickly on the screen but also to move it finely, since movement between pixels is detected, thereby achieving smooth movement of the mouse cursor.
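- One way to realize a momentum-style center point is to low-pass filter the raw center of force so that the reported point keeps some inertia; the sketch below is only one possible reading of the momentum-based calculation described above, and the blending factor is an assumed tuning constant.

```python
def momentum_center(prev_smoothed, raw_center, alpha=0.7):
    """Blend the newly calculated center of force with the previously
    smoothed value so that the reported center point changes gradually.
    `alpha` (assumed) controls how strongly the new measurement pulls
    the momentum-based center point toward the raw centroid."""
    if prev_smoothed is None:
        return raw_center
    x = alpha * raw_center[0] + (1.0 - alpha) * prev_smoothed[0]
    y = alpha * raw_center[1] + (1.0 - alpha) * prev_smoothed[1]
    return (x, y)
```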
- Touching is detected as a click if the magnitude of force detected by the touching corresponds to an impulse signal or if the Z-axis magnitude detected by each sensor is equal to or greater than a reference level.
- A click detection region and a scroll region may also be set in the interface device using the tactile sensor, as shown in FIG. 12, such that, when touching of the click detection region is detected, a corresponding file is opened or closed and, when touching of the scroll region is detected, it is determined whether or not subsequent touching has occurred and a scroll function is performed according to the corresponding moving direction of the mouse cursor.
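- A hedged sketch of the click and region handling described above: the thresholds, region rectangles, and return labels are assumptions, since the description only specifies an impulse-like force or a Z-axis force above a reference level, plus separate click and scroll regions.

```python
CLICK_FORCE_THRESHOLD = 5.0   # assumed reference level for the Z-axis force
CLICK_MAX_DURATION = 0.15     # assumed duration (s) below which a press counts as an impulse

def is_click(peak_force, press_duration):
    """Treat a touch as a click when the force is an impulse-like press
    or exceeds the reference level (both thresholds are assumptions)."""
    return press_duration <= CLICK_MAX_DURATION or peak_force >= CLICK_FORCE_THRESHOLD

def classify_region(xc, yc, click_region, scroll_region):
    """Map a center point to the click-detection or scroll region.
    Regions are given as (x_min, y_min, x_max, y_max) rectangles."""
    def inside(r):
        x0, y0, x1, y1 = r
        return x0 <= xc <= x1 and y0 <= yc <= y1
    if inside(click_region):
        return "click"   # e.g., open or close the corresponding file
    if inside(scroll_region):
        return "scroll"  # check for subsequent touches and scroll accordingly
    return "move"        # ordinary cursor movement
```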
- Character recognition can be performed based on the touching trace, and the font face can be changed based on both the magnitude of force pressing the touch pad including a tactile sensor and the speed of movement in contact with the touch pad, as shown in FIG. 13.
- The thickness or color of each character can be changed according to the magnitude of force pressing the touch pad including the tactile sensor, and the font face can be changed according to the speed of movement in contact with the touch pad.
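- A small sketch of how force and speed could be mapped to character styling; the thresholds and the specific styles are assumptions, since the description only states that thickness and color follow the pressing force while the font face follows the speed of movement.

```python
def stroke_style(force, speed, max_force=10.0, fast_speed=200.0):
    """Map touch force and movement speed to character styling.
    `max_force` and `fast_speed` are assumed calibration constants."""
    level = min(force, max_force) / max_force
    thickness = 1 + int(4 * level)                  # stroke width from 1 to 5
    color = "red" if level > 0.8 else "black"       # heavier press -> accent color
    font_face = "italic" if speed > fast_speed else "regular"
    return {"thickness": thickness, "color": color, "font": font_face}
```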
- FIG. 14 illustrates a manufactured touch pad for mobile appliances using a tactile sensor with a 10x10 force sensor array.
- The present invention implements a touch pad for mobile appliances that uses a tactile sensor including multiple force sensors to detect the force applied to the interface device, providing a mouse function that allows free X- and Y-direction movements and rotations of a cursor on a screen as well as a character input and font face change function.
- The touch pad according to the invention can be used as an interface device for slim mobile appliances such as mobile phones; the tactile sensor can thus replace an existing mouse or joystick so that it can be applied to a GUI environment.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Position Input By Displaying (AREA)
Abstract
The present invention relates to a method for implementing a touch pad. The method uses a tactile sensor comprising multiple force sensors to provide a mouse function allowing free X- and Y-direction movements and rotations of a cursor on a screen, as well as character input and a font face change function. The method for implementing the touch pad provides a method for implementing an algorithm that processes a touch input using a tactile sensor comprising multiple force sensors. The purpose of this method is to implement a touch input information algorithm that calculates the center point and magnitude of the touching force on the tactile sensor in order to provide a mouse function using the moving distance and direction of a cursor calculated from the center point, to provide a character input function using pressing of the tactile sensor, and to provide a function to change the font face by detecting the user's pressing force.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR1020070093864A KR100936046B1 (ko) | 2007-09-14 | 2007-09-14 | Method for implementing a touch pad having a mouse function using a tactile sensor |
| KR10-2007-0093864 | 2007-09-14 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2009035184A1 true WO2009035184A1 (fr) | 2009-03-19 |
Family
ID=40452169
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2007/005672 Ceased WO2009035184A1 (fr) | 2007-09-14 | 2007-11-12 | Method for implementing a touch pad using a tactile sensor |
Country Status (2)
| Country | Link |
|---|---|
| KR (1) | KR100936046B1 (fr) |
| WO (1) | WO2009035184A1 (fr) |
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2011025845A1 (fr) * | 2009-08-27 | 2011-03-03 | Symbol Technologies, Inc. | Methods and apparatus for pressure-based manipulation of content on a touch screen |
| WO2012078654A1 (fr) * | 2010-12-07 | 2012-06-14 | Google Inc. | Editing operation based on physical cues employing force |
| US8963874B2 (en) | 2010-07-31 | 2015-02-24 | Symbol Technologies, Inc. | Touch screen rendering system and method of operation thereof |
| US8988191B2 (en) | 2009-08-27 | 2015-03-24 | Symbol Technologies, Inc. | Systems and methods for pressure-based authentication of an input on a touch screen |
| US9018030B2 (en) | 2008-03-20 | 2015-04-28 | Symbol Technologies, Inc. | Transparent force sensor and method of fabrication |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9501098B2 (en) | 2011-09-19 | 2016-11-22 | Samsung Electronics Co., Ltd. | Interface controlling apparatus and method using force |
| US9519350B2 (en) | 2011-09-19 | 2016-12-13 | Samsung Electronics Co., Ltd. | Interface controlling apparatus and method using force |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6388655B1 (en) * | 1999-11-08 | 2002-05-14 | Wing-Keung Leung | Method of touch control of an input device and such a device |
| US20020093491A1 (en) * | 1992-06-08 | 2002-07-18 | David W. Gillespie | Object position detector with edge motion feature and gesture recognition |
| KR20060084945A (ko) * | 2005-01-21 | 2006-07-26 | LG Electronics Inc. | Method for generating a brush effect based on a touch pad |
| KR20070079858A (ko) * | 2006-02-03 | 2007-08-08 | 유주영 | Method for implementing a drag function using a touch pad |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7663607B2 (en) * | 2004-05-06 | 2010-02-16 | Apple Inc. | Multipoint touchscreen |
-
2007
- 2007-09-14 KR KR1020070093864A patent/KR100936046B1/ko not_active Expired - Fee Related
- 2007-11-12 WO PCT/KR2007/005672 patent/WO2009035184A1/fr not_active Ceased
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20020093491A1 (en) * | 1992-06-08 | 2002-07-18 | David W. Gillespie | Object position detector with edge motion feature and gesture recognition |
| US6388655B1 (en) * | 1999-11-08 | 2002-05-14 | Wing-Keung Leung | Method of touch control of an input device and such a device |
| KR20060084945A (ko) * | 2005-01-21 | 2006-07-26 | LG Electronics Inc. | Method for generating a brush effect based on a touch pad |
| KR20070079858A (ko) * | 2006-02-03 | 2007-08-08 | 유주영 | Method for implementing a drag function using a touch pad |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9018030B2 (en) | 2008-03-20 | 2015-04-28 | Symbol Technologies, Inc. | Transparent force sensor and method of fabrication |
| WO2011025845A1 (fr) * | 2009-08-27 | 2011-03-03 | Symbol Technologies, Inc. | Methods and apparatus for pressure-based manipulation of content on a touch screen |
| US8363020B2 (en) | 2009-08-27 | 2013-01-29 | Symbol Technologies, Inc. | Methods and apparatus for pressure-based manipulation of content on a touch screen |
| US8988191B2 (en) | 2009-08-27 | 2015-03-24 | Symbol Technologies, Inc. | Systems and methods for pressure-based authentication of an input on a touch screen |
| US8963874B2 (en) | 2010-07-31 | 2015-02-24 | Symbol Technologies, Inc. | Touch screen rendering system and method of operation thereof |
| WO2012078654A1 (fr) * | 2010-12-07 | 2012-06-14 | Google Inc. | Editing operation based on physical cues employing force |
Also Published As
| Publication number | Publication date |
|---|---|
| KR100936046B1 (ko) | 2010-01-08 |
| KR20090028344A (ko) | 2009-03-18 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| KR100950234B1 | Method for implementing a mouse algorithm using a pressure sensor | |
| US9223411B2 (en) | User interface with parallax animation | |
| EP1840713A2 | Input device with wheel and four-way input method in a portable terminal | |
| US8629837B2 (en) | Method and device for controlling information display output and input device | |
| EP1873618A2 | Keypad touch user interface method and mobile terminal using the same | |
| US20070298849A1 (en) | Keypad touch user interface method and a mobile terminal using the same | |
| WO2009035184A1 | Method for implementing a touch pad using a tactile sensor | |
| WO2009002787A2 | Swipe gestures for touch screen keyboards | |
| SI20774A | 3D sensitive pad | |
| GB2510333A (en) | Emulating pressure sensitivity on multi-touch devices | |
| EP1727028B1 | Dual-position control device and method for controlling an indication on a screen of an electronic device | |
| US10126843B2 (en) | Touch control method and electronic device | |
| TWI413916B | Touch sensor track point and method | |
| EP3008556B1 | Disambiguation of indirect input | |
| JP5524937B2 | Input device including a touch pad, and portable computer | |
| US9285836B2 (en) | Portable electronic device including touch-sensitive display | |
| CN101458585A | Detection method for a touch pad | |
| JP2012141650A | Mobile terminal | |
| CN104423657A | Information processing method and electronic device | |
| CA2761454C | Portable electronic device with touch-sensitive display | |
| EP2407866B1 | Portable electronic device and method of determining a location of a touch | |
| KR101844651B1 | Mouse input device and input method for a mobile device using 3D touch in a mobile cloud computing client environment | |
| US9720513B2 (en) | Apparatus and method for receiving a key input | |
| JP5992380B2 | Pointing device, notebook personal computer, and operation method | |
| CN113110792B | Gesture operation method and device for implementing copy and paste | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 07833979 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 07833979 Country of ref document: EP Kind code of ref document: A1 |