CN101699387B - Touchless interaction system and method - Google Patents
Touchless interaction system and method
- Publication number
- CN101699387B (application CN200910163965.1A)
- Authority
- CN
- China
- Prior art keywords
- display
- control circuit
- contact
- sensing region
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/044—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04101—2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04805—Virtual magnifying lens, i.e. window or frame movable on top of displayed information to enlarge it for better reading or selection
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
The present invention relates to touchless interaction systems and methods. A non-contact display system enables a user to interact with a displayed image by moving a finger or pointer toward a selected portion of the image. The image can be dynamically magnified or translated in response to the detected movement. The mode of operation can be switched manually between contact-type and non-contact operation for added flexibility.
Description
Technical Field
The present invention relates to systems and methods for navigating small-screen displays. More particularly, it relates to such systems and methods that incorporate non-contact sensors for tracking the trajectory of a user's finger as it points toward a virtual keyboard.
Background
Various types of small-screen displays can be found on products such as mobile phones, personal digital assistants (PDAs), mobile computers, and imagers. Increasingly, users must navigate small-screen displays to browse the web on a phone or to view photos on a PDA.
Small touch screens are commonly used to support interaction with programs running on portable devices such as PDAs and mobile phones. Small touch screens are also finding their way into household products, for example the Honeywell TH8321U1006 thermostat, the Honeywell 6271V security panel, and various personal health-monitoring devices. They have been used for years in package delivery, retail warehousing, and refinery field operations.
Known navigation systems use zoom and pan controls, or fisheye viewers. In some circumstances, however, these controls are inconvenient and inefficient. For example, it is difficult to use fisheye navigation in combination with a touch screen. It is likewise difficult to zoom and pan graphics with a mouse or touch screen. Navigating with a stylus on a small screen can be even more difficult for the user.
There is therefore a continuing need for alternative, natural, and easier-to-use means of controlling the scale at which graphics, both large-scale and small-scale, are navigated.
It would also be desirable to be able to use such methods when navigating large-scale graphics such as maps or building floor plans.
Summary of the Invention
A display system includes: a multi-dimensional display device on whose surface an image can be presented; a plurality of non-contact sensors located adjacent to that surface; and control circuitry coupled to the sensors and to the display device, the circuitry responding to signals from the sensors to determine the trajectory of a pointing member moving toward a region of the surface.
Brief Description of the Drawings
FIG. 1 illustrates a touchless interaction system in accordance with the invention;
FIG. 2 is a block diagram of some of the software elements of the system of FIG. 1;
FIG. 3 is a flow chart illustrating a method of interaction;
FIGS. 4A and 4B illustrate two different applications of the system of FIG. 1;
FIG. 5 is a touchless-mode input screen, including building name, street number and address, city, zip code, first alarm, floor number, and detector;
FIG. 6 is an intermediate touchless-mode screen, including building name, street number and address, city, zip code, first alarm, floor number, and detector;
FIG. 7 is a touchless-mode regional display screen with an exit button, including building name, street number and address, city, zip code, first alarm, floor number, and detector;
FIGS. 8A, 8B, and 8C illustrate various aspects of the non-contact sensors of the system of FIG. 1;
FIGS. 9A, 9B, and 9C illustrate one form of touchless navigation of a virtual keyboard, in which, as shown in FIG. 9A, a block of the visible keyboard is selected and magnified according to the position of the finger once the finger is close enough to the touch screen; and
FIGS. 10A, 10B, and 10C illustrate another form of touchless navigation of a virtual keyboard, in which, as shown in FIG. 10A, the virtual keyboard is continuously magnified according to the position of the finger.
Detailed Description
While embodiments of this invention can take many different forms, specific embodiments thereof are shown in the drawings and are described herein in detail with the understanding that the present disclosure is to be considered as an exemplification of the principles of the invention, as well as the best mode of practicing it, and is not intended to limit the invention to the specific embodiments illustrated.
Embodiments of the invention include a touchless, or non-contact, interface that senses the position of a user's finger or hand in three dimensions. In a disclosed embodiment, a plurality of capacitive sensors can be arranged around the edge of a display device. The trajectory of a finger or hand pointing toward a point on a virtual keyboard presented on the device can be tracked. This enables the associated system to predict the point on the display screen that the user intends to select before the finger actually touches the screen.
In accordance with the invention, Z-axis finger position data can be used to control the zoom ratio or zoom extent on, for example, a map display. Alternatively, a fisheye view on the map display can be controlled by this touchless pointing method. Several fisheye parameters, such as the zoom ratio, the zoom extent, the zoom shape (rectangle, rounded rectangle, ellipse, and so on), and the proportion of the distorted border around the fisheye, can be modified during this process.
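By way of illustration, one simple way to realize such a mapping is a clamped linear function from finger height to magnification. The sketch below is only an assumed implementation: the sensing range, zoom limits, function names, and fisheye parameter values are placeholders and are not taken from the disclosure.

```python
def zoom_from_z(z_mm: float,
                z_far: float = 120.0,   # assumed outer edge of the sensing volume, in mm
                z_near: float = 10.0,   # assumed distance at which maximum zoom is reached
                zoom_min: float = 1.0,
                zoom_max: float = 4.0) -> float:
    """Map finger height above the screen (Z axis) to a zoom ratio.

    The closer the finger is to the surface, the larger the zoom ratio,
    clamped to the range [zoom_min, zoom_max].
    """
    t = (z_far - z_mm) / (z_far - z_near)   # 0.0 when far, 1.0 when near
    t = max(0.0, min(1.0, t))
    return zoom_min + t * (zoom_max - zoom_min)


def fisheye_params(z_mm: float) -> dict:
    """Derive illustrative fisheye parameters (ratio, extent, shape) from Z."""
    ratio = zoom_from_z(z_mm)
    return {
        "zoom_ratio": ratio,
        "zoom_extent_px": int(80 * ratio),   # lens radius grows with the zoom ratio
        "shape": "rounded_rectangle",        # could also be rectangle or ellipse
        "distorted_border_fraction": 0.25,   # share of the lens used for the falloff edge
    }
```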
As the user moves his or her finger, the graphical content of the display is updated accordingly. The finger movement can likewise control zoom/pan operations or the fisheye effect on a map display. This process is efficient and intuitive for the user.
The same approach can also be used to control, and to interact with, a virtual keyboard on a small-screen display. It overcomes the long-standing problem associated with small virtual keyboards, namely that the keys are always much smaller than a human fingertip.
Accurate interaction requires magnification of only the target region of the keyboard (for example, a small subset of the keys). The touchless interface uses z-axis data related to the hand position to infer the desired target region on the keyboard and automatically zooms or magnifies that region of the virtual keyboard.
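As a rough sketch of that idea, the code below infers a group of target keys from the projected (x, y) finger position and scales it by a factor derived from the finger height. The key layout, grouping radius, and scale limits are hypothetical values chosen only for illustration.

```python
from dataclasses import dataclass


@dataclass
class Key:
    label: str
    x: float  # key-center coordinates in screen pixels
    y: float


def target_keys(keys: list[Key], finger_x: float, finger_y: float,
                radius_px: float = 60.0) -> list[Key]:
    """Return the subset of keys near the projected finger position."""
    return [k for k in keys
            if (k.x - finger_x) ** 2 + (k.y - finger_y) ** 2 <= radius_px ** 2]


def magnification(z_mm: float, z_far: float = 120.0, max_scale: float = 3.0) -> float:
    """Grow the target region as the finger descends toward the screen."""
    t = max(0.0, min(1.0, 1.0 - z_mm / z_far))
    return 1.0 + t * (max_scale - 1.0)


# Example: enlarge the keys under an approaching finger at height 35 mm.
keyboard = [Key("q", 20, 200), Key("w", 60, 200), Key("e", 100, 200)]
zoomed = [(k.label, magnification(z_mm=35.0)) for k in target_keys(keyboard, 55, 205)]
```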
In some applications, input signals from the touchless device could disturb interactions that are not intended to be touchless. In one aspect of the invention, several different methods can be used to stop and start touchless interaction intuitively and quickly. In one embodiment, the user's right hand can be used for pointing and for controlling the zoom or fisheye control, while the left hand operates a button that starts or stops touchless navigation. In this process, the left hand can also be used to change fisheye or zoom parameters on the fly while the right hand performs pointing and dragging actions, providing a highly effective two-handed interaction.
FIG. 1 illustrates a touchless, or non-contact, interaction system 100. The system 100 includes a programmable processing unit 151 connected via a data bus 152 to a touch-screen input buffer 153, a touchless input buffer 154, a display buffer 155, a buffer 156, and a storage unit or register 157. A touch screen 158 is coupled to the processor through the touch-screen buffer 153. A touchless sensing device 159, for example a plurality of capacitance-based non-contact sensors, is coupled to the processor through the touchless input buffer 154.
A graphics display 160 is coupled to the processor through the display buffer 155. The display device 160 includes a display screen on which various images are presented. The touchless sensors 159 are located around the periphery of the display screen of the display device 160, as discussed in more detail below.
An input/output device 161 is coupled to the processor through the input/output buffer 156. The input/output device can include any combination of devices that allow the system to interact with external information.
The storage unit 157 contains the information and/or programs or executable software necessary for the processor 151 to implement the touchless interaction system. For example, display control software 157a can be stored in the unit 157 in computer-readable form. Other system control software can likewise be stored in the unit 157.
FIG. 2 illustrates various software modules 200 of the system 100 that are executed by the processor 151. The modules 200 are stored in the unit 157 in magnetic or optical computer-readable form. The software 200 includes a command execution module 202, a command recognizer module 204, data receivers 206, a graphics system display module 208, and a domain model 210 that provides information about the displayed region. The operation of the various modules is discussed with respect to the process 250 of FIG. 3.
As shown in the flow chart of FIG. 3, at 252, data from sensors such as the sensors 158 or 159 are loaded from the buffers 153, 154 into the respective receivers such as 206a, 206b. At 254, the data are loaded into an input buffer 204b.
At 256, a gesture analyzer 204a analyzes the data. At 258, the gesture analyzer 204a sends a system command to the command execution module 202. At 260, the command execution module changes the state of the domain objects of the model 210.
At 262, the command execution module 202 notifies the graphics system module 208 to change the state of the visual image on the display 160. At 264, the graphics system module 208 updates the image on the display unit 160. At 266, the command execution module 202 then updates the system state.
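One way to read the FIG. 2 modules and the FIG. 3 flow is as a small pipeline in which a receiver feeds a gesture analyzer, which emits commands that update the domain model and, through the command execution module, the graphics system. The sketch below mirrors that flow with hypothetical class and method names; it is a minimal illustration, not the patent's actual implementation.

```python
class DomainModel:
    """Holds the state of the displayed region (step 260)."""
    def __init__(self):
        self.zoom = 1.0
        self.center = (0.0, 0.0)


class GraphicsSystem:
    """Redraws the display when notified (steps 262-264)."""
    def update(self, model: DomainModel) -> None:
        print(f"redraw at zoom={model.zoom:.2f}, center={model.center}")


class CommandExecution:
    """Applies recognized commands to the model and notifies graphics (258-266)."""
    def __init__(self, model: DomainModel, graphics: GraphicsSystem):
        self.model, self.graphics = model, graphics

    def execute(self, command: dict) -> None:
        if command["name"] == "zoom":
            self.model.zoom = command["value"]
        elif command["name"] == "pan":
            self.model.center = command["value"]
        self.graphics.update(self.model)


class GestureAnalyzer:
    """Turns raw sensor samples into commands (steps 254-258)."""
    def analyze(self, sample: dict) -> dict:
        # Trivial placeholder rule: a closer finger produces a larger zoom value.
        return {"name": "zoom", "value": 1.0 + max(0.0, (120.0 - sample["z"]) / 60.0)}


# Step 252: a sample arrives from the touchless input buffer and flows through the pipeline.
executor = CommandExecution(DomainModel(), GraphicsSystem())
executor.execute(GestureAnalyzer().analyze({"x": 40, "y": 80, "z": 30}))
```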
FIG. 4A illustrates one embodiment of the invention, a non-contact, navigable regional display that can be used, for example, to assess alarm conditions in a building. In a second embodiment, FIG. 4B illustrates touchless navigation of a small display in accordance with the invention.
FIG. 5 illustrates a display of the type or application of the FIG. 4A embodiment in an initial display state. In step 1, as shown, the user can click a button to enter the touchless navigation state. FIG. 6 is a confirmation screen presented to the user on the display unit 160. The user can enter the touchless navigation state as shown at step 3. FIG. 7 illustrates the screen presented on the display unit 160 while in the touchless navigation state. A button is provided to exit the touchless state.
FIGS. 8A, 8B, and 8C illustrate various aspects of the non-contact sensors 159 arranged about the periphery of the screen 160a of the display unit 160. As illustrated, the sensors 159 define an outer frusto-conical sensing region 160b and an inner region 160c.
When the user's finger or a pointing device is in the outer region 160b, the regional display or map can be navigated, or scrolled, as well as zoomed. As the user's finger approaches the screen, within the regions 160b and 160c, the presented image is zoomed, for example, from one level to a more detailed level. When the user's finger enters the inner region 160c, in one embodiment, the user can only zoom in and out on the map or display. The region 160c helps the end user zoom the map or display in and out smoothly, without jitter or jumping.
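A minimal way to express this two-region behavior (scrolling and zooming allowed in the outer region 160b, zooming only in the inner region 160c) is sketched below. The geometry, region sizes, and names are assumptions made for illustration and are not specified in the disclosure.

```python
from enum import Enum


class Region(Enum):
    OUTSIDE = 0
    OUTER = 1   # corresponds to region 160b: scroll and zoom allowed
    INNER = 2   # corresponds to region 160c: zoom only, for jitter-free zooming


def classify(x: float, y: float, z: float,
             inner_height: float = 30.0,
             max_height: float = 120.0,
             base_radius: float = 60.0) -> Region:
    """Classify a finger position above the screen center at (0, 0, 0).

    Placeholder geometry: the outer region is a frustum that widens with height
    above the screen; the inner region is the low-altitude band just above it.
    """
    if z > max_height:
        return Region.OUTSIDE
    allowed_radius = base_radius * (1.0 + z / max_height)  # frustum widens away from the screen
    if (x * x + y * y) ** 0.5 > allowed_radius:
        return Region.OUTSIDE
    return Region.INNER if z <= inner_height else Region.OUTER


def allowed_operations(region: Region) -> set[str]:
    """Gate the interactions available in each region."""
    return {Region.OUTER: {"scroll", "zoom"},
            Region.INNER: {"zoom"}}.get(region, set())
```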
FIGS. 9A, 9B, and 9C illustrate the steps of one solution for navigating a virtual keyboard using the system 100. A region can be selected and enlarged so that the user can activate a key. A second region can then be selected and enlarged so that the user can activate a different key, and so on, until the desired entry is complete.
FIGS. 10A, 10B, and 10C illustrate the steps of another solution for navigating a virtual keyboard using the system 100. In the embodiment of FIGS. 10A, 10B, and 10C, any portion of the keyboard toward which the user's finger moves can be enlarged or magnified so that the user can seamlessly activate successive groups of keys. It should be understood that embodiments of the invention can be incorporated into, for example, wireless telephones, mobile computers, or imaging devices, or any electronic device likely to have a relatively small keyboard.
In summary, it should be noted that numerous variations and modifications are possible without departing from the spirit and scope of the invention. It should be understood that no limitation with respect to the specific apparatus described herein is intended or should be inferred. It is, of course, intended to cover by the appended claims all such modifications as fall within the scope of the claims.
Claims (9)
1. A display system, comprising:
a multi-dimensional display device on whose surface an image can be presented;
a plurality of non-contact sensors located adjacent to the surface; and
control circuitry coupled to the sensors and to the display device, the circuitry establishing a first multi-dimensional sensing region adjacent to the surface of the display device and a second multi-dimensional sensing region within the first multi-dimensional sensing region, the first multi-dimensional sensing region having a predetermined shape and the second multi-dimensional sensing region having a shape different from that of the first multi-dimensional sensing region, the circuitry responding to signals from the sensors to determine a trajectory of a pointing member moving toward a region of the surface and, in response thereto, altering the image on the surface;
wherein the image can be scrolled and zoomed while the pointing member is in the first region, and can only be zoomed while the pointing member is in the second region.
2. The system of claim 1, including display control circuitry coupled to the control circuitry, the display control circuitry responding to the determined trajectory to dynamically adjust a magnification parameter of the image presented on the surface.
3. The system of claim 1, wherein the sensors comprise capacitive sensors.
4. The system of claim 1, wherein the sensors are configured to define a frusto-conical region within which the pointing member can be sensed.
5. The system of claim 1, including a manually operable element coupled to the control circuitry for switching between a non-contact mode and a contact-type mode of interacting with the image on the surface.
6. The system of claim 1, wherein the control circuitry is responsive to a selected one of a contact-type mode or a non-contact mode of interacting with the image on the surface.
7. The system of claim 1, including a plurality of contact-type sensors associated with the screen of the display device.
8. The system of claim 1, including display management software stored on a computer-readable medium coupled to the control circuitry.
9. The system of claim 8, wherein the display management software, when executed by the control circuitry, presents a dynamically changing image on the surface of the display unit in response to the determined trajectory of the pointing member.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US12/166,022 | 2008-07-01 | | |
| US12/166,022 (US8443302B2) | 2008-07-01 | 2008-07-01 | Systems and methods of touchless interaction |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN101699387A (zh) | 2010-04-28 |
| CN101699387B (zh) | 2014-06-25 |
Family
ID=41172383
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN200910163965.1A (Expired - Fee Related) CN101699387B (zh) | Touchless interaction system and method | 2008-07-01 | 2009-06-30 |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US8443302B2 (zh) |
| EP (1) | EP2144147A3 (zh) |
| CN (1) | CN101699387B (zh) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10838504B2 (en) | 2016-06-08 | 2020-11-17 | Stephen H. Lewis | Glass mouse |
Families Citing this family (34)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2011033519A1 (en) * | 2009-09-21 | 2011-03-24 | Extreme Reality Ltd. | Methods circuits apparatus and systems for human machine interfacing with an electronic appliance |
| US20100202656A1 (en) * | 2009-02-09 | 2010-08-12 | Bhiksha Raj Ramakrishnan | Ultrasonic Doppler System and Method for Gesture Recognition |
| US10180746B1 (en) | 2009-02-26 | 2019-01-15 | Amazon Technologies, Inc. | Hardware enabled interpolating sensor and display |
| US9740341B1 (en) | 2009-02-26 | 2017-08-22 | Amazon Technologies, Inc. | Capacitive sensing with interpolating force-sensitive resistor array |
| DE102009032069A1 (de) * | 2009-07-07 | 2011-01-13 | Volkswagen Aktiengesellschaft | Verfahren und Vorrichtung zum Bereitstellen einer Benutzerschnittstelle in einem Fahrzeug |
| US9785272B1 (en) | 2009-07-31 | 2017-10-10 | Amazon Technologies, Inc. | Touch distinction |
| US9740340B1 (en) | 2009-07-31 | 2017-08-22 | Amazon Technologies, Inc. | Visually consistent arrays including conductive mesh |
| US8810524B1 (en) | 2009-11-20 | 2014-08-19 | Amazon Technologies, Inc. | Two-sided touch sensor |
| US20120095575A1 (en) * | 2010-10-14 | 2012-04-19 | Cedes Safety & Automation Ag | Time of flight (tof) human machine interface (hmi) |
| US8421752B2 (en) * | 2011-01-27 | 2013-04-16 | Research In Motion Limited | Portable electronic device and method therefor |
| CN103384872B (zh) * | 2011-02-22 | 2016-10-12 | 惠普发展公司,有限责任合伙企业 | 便于用户输入的方法和计算系统 |
| CN102693063B (zh) * | 2011-03-23 | 2015-04-29 | 联想(北京)有限公司 | 操作控制方法、装置及电子设备 |
| EP4303801A3 (en) | 2011-04-22 | 2024-02-21 | Pepsico Inc | Beverage dispensing system with social media capabilities |
| US9218704B2 (en) | 2011-11-01 | 2015-12-22 | Pepsico, Inc. | Dispensing system and user interface |
| US9594499B2 (en) * | 2012-02-21 | 2017-03-14 | Nokia Technologies Oy | Method and apparatus for hover-based spatial searches on mobile maps |
| CN103312861A (zh) * | 2012-03-06 | 2013-09-18 | 联想(北京)有限公司 | 控制方法、系统以及具有该系统的设备 |
| US8654076B2 (en) | 2012-03-15 | 2014-02-18 | Nokia Corporation | Touch screen hover input handling |
| US9310895B2 (en) | 2012-10-12 | 2016-04-12 | Microsoft Technology Licensing, Llc | Touchless input |
| US20140152566A1 (en) * | 2012-12-05 | 2014-06-05 | Brent A. Safer | Apparatus and methods for image/sensory processing to control computer operations |
| US9511988B2 (en) * | 2012-12-27 | 2016-12-06 | Lancer Corporation | Touch screen for a beverage dispensing system |
| US10304130B2 (en) | 2013-02-22 | 2019-05-28 | Cantor Futures Exchange, L.P. | Systems and methods of detecting manipulations on a binary options exchange |
| CN104102440B (zh) * | 2013-04-08 | 2018-05-25 | 华为技术有限公司 | 在用户界面压缩、解压缩文件的方法和装置 |
| WO2015022498A1 (en) * | 2013-08-15 | 2015-02-19 | Elliptic Laboratories As | Touchless user interfaces |
| PL2933743T3 (pl) * | 2014-04-18 | 2019-10-31 | Univ Rennes | Metoda opisywania dyfuzji molekularnej z zestawu ważonych dyfuzyjnie sygnałów obrazów rezonansu magnetycznego i urządzenia do wykonywania takiej metody |
| USD749115S1 (en) * | 2015-02-20 | 2016-02-09 | Translate Abroad, Inc. | Mobile device with graphical user interface |
| US11340710B2 (en) | 2016-06-08 | 2022-05-24 | Architectronics Inc. | Virtual mouse |
| CN106940608B (zh) * | 2017-03-07 | 2020-06-16 | Oppo广东移动通信有限公司 | 一种显示屏的控制方法、显示屏及电子设备 |
| US10747429B2 (en) | 2018-08-01 | 2020-08-18 | International Business Machines Corporation | Compensating for user hand tremors when using hand-held electronic devices |
| US12014120B2 (en) | 2019-08-28 | 2024-06-18 | MFTB Holdco, Inc. | Automated tools for generating mapping information for buildings |
| US10825247B1 (en) * | 2019-11-12 | 2020-11-03 | Zillow Group, Inc. | Presenting integrated building information using three-dimensional building models |
| US11676344B2 (en) | 2019-11-12 | 2023-06-13 | MFTB Holdco, Inc. | Presenting building information using building models |
| US12333655B2 (en) | 2019-11-12 | 2025-06-17 | MFTB Holdco, Inc. | Presenting building information using video and building models |
| WO2021237348A1 (en) * | 2020-05-25 | 2021-12-02 | Nz Technologies Inc. | Retrofit touchless interfaces for contact-based input devices |
| CN117113565B (zh) * | 2023-08-22 | 2024-09-13 | 鞍钢股份有限公司 | 一种非均匀结构管式换热器设计方法 |
Family Cites Families (19)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2648558B2 (ja) * | 1993-06-29 | 1997-09-03 | インターナショナル・ビジネス・マシーンズ・コーポレイション | 情報選択装置及び情報選択方法 |
| US7466307B2 (en) * | 2002-04-11 | 2008-12-16 | Synaptics Incorporated | Closed-loop sensor on a solid-state object position detector |
| JP2004038407A (ja) | 2002-07-01 | 2004-02-05 | Arcadia:Kk | 文字入力装置およびその方法 |
| US7656393B2 (en) * | 2005-03-04 | 2010-02-02 | Apple Inc. | Electronic device having display and surrounding touch sensitive bezel for user interface and control |
| US8460103B2 (en) * | 2004-06-18 | 2013-06-11 | Igt | Gesture controlled casino gaming system |
| US7606663B2 (en) | 2003-02-26 | 2009-10-20 | Tomtom International B.V. | Navigation device and method for exchanging data between resident applications |
| DE10310794B4 (de) * | 2003-03-12 | 2012-10-18 | Hewlett-Packard Development Co., L.P. | Bedieneinrichtung und Kommunikationsgerät |
| US20040183833A1 (en) * | 2003-03-19 | 2004-09-23 | Chua Yong Tong | Keyboard error reduction method and apparatus |
| EP1596271A1 (en) * | 2004-05-11 | 2005-11-16 | Hitachi Europe S.r.l. | Method for displaying information and information display system |
| GB0412787D0 (en) * | 2004-06-09 | 2004-07-14 | Koninkl Philips Electronics Nv | Input system |
| EP1769328A2 (en) * | 2004-06-29 | 2007-04-04 | Koninklijke Philips Electronics N.V. | Zooming in 3-d touch interaction |
| US7728825B2 (en) * | 2005-03-22 | 2010-06-01 | Microsoft Corporation | Targeting in a stylus-based user interface |
| US20060250376A1 (en) * | 2005-05-03 | 2006-11-09 | Alps Electric Co., Ltd. | Display device |
| DE102006037154A1 (de) * | 2006-03-27 | 2007-10-18 | Volkswagen Ag | Navigationsgerät und Verfahren zum Betreiben eines Navigationsgeräts |
| US7903094B2 (en) * | 2006-06-23 | 2011-03-08 | Wacom Co., Ltd | Information processing apparatus, operation input method, and sensing device |
| US20080055259A1 (en) * | 2006-08-31 | 2008-03-06 | Honeywell International, Inc. | Method for dynamically adapting button size on touch screens to compensate for hand tremor |
| US8316324B2 (en) * | 2006-09-05 | 2012-11-20 | Navisense | Method and apparatus for touchless control of a device |
| JP2008197934A (ja) * | 2007-02-14 | 2008-08-28 | Calsonic Kansei Corp | 操作者判別方法 |
| US8115753B2 (en) * | 2007-04-11 | 2012-02-14 | Next Holdings Limited | Touch screen system with hover and click input methods |
2008
- 2008-07-01 US US12/166,022 patent/US8443302B2/en active Active
2009
- 2009-06-25 EP EP09163833.8A patent/EP2144147A3/en not_active Withdrawn
- 2009-06-30 CN CN200910163965.1A patent/CN101699387B/zh not_active Expired - Fee Related
Also Published As
| Publication number | Publication date |
|---|---|
| EP2144147A2 (en) | 2010-01-13 |
| US20100005427A1 (en) | 2010-01-07 |
| EP2144147A3 (en) | 2013-07-03 |
| US8443302B2 (en) | 2013-05-14 |
| CN101699387A (zh) | 2010-04-28 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN101699387B (zh) | | Touchless interaction system and method |
| KR101541928B1 (ko) | | Visual feedback display |
| KR101384857B1 (ko) | | User interface methods providing continuous zoom functionality |
| KR101424294B1 (ko) | | Computer-implemented method and computer-readable medium for performing actions in response to input and gestures received from a user of a touch-screen device |
| KR101361214B1 (ko) | | Interface apparatus and method for setting a control region on a touch screen |
| EP2657811B1 (en) | | Touch input processing device, information processing device, and touch input control method |
| EP2708996A1 (en) | | Display device, user interface method, and program |
| US9459704B2 (en) | | Method and apparatus for providing one-handed user interface in mobile device having touch screen |
| US20140380209A1 (en) | | Method for operating portable devices having a touch screen |
| US20140062875A1 (en) | | Mobile device with an inertial measurement unit to adjust state of graphical user interface or a natural language processing unit, and including a hover sensing function |
| JP5620440B2 (ja) | | Display control device, display control method, and program |
| WO2009142880A1 (en) | | Proximity sensor device and method with subregion based swipethrough data entry |
| CN103838507A (zh) | | Touch-sensing display device and driving method thereof |
| JP2004192241A (ja) | | User interface device and portable information device |
| KR20110115683A (ko) | | One-handed input method on a touch screen |
| JP2011081447A (ja) | | Information processing method and information processing apparatus |
| KR20100136289A (ko) | | Display control method for a mobile terminal |
| US9235338B1 (en) | | Pan and zoom gesture detection in a multiple touch display |
| KR101920864B1 (ko) | | Method and terminal for displaying an image using a touch screen |
| KR101403079B1 (ko) | | Screen zoom method on a touch screen and terminal using the same |
| EP3433713B1 (en) | | Selecting first digital input behavior based on presence of a second, concurrent, input |
| KR101165388B1 (ko) | | Method for controlling a screen using heterogeneous input devices and terminal device therefor |
| CN102789358A (zh) | | Image output display method and apparatus, and display device |
| KR101136327B1 (ko) | | Touch and cursor control method for a portable terminal and portable terminal applying the same |
| KR20120078816A (ко) | | Method for operating a virtual touch pointer and portable terminal supporting the same |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | C06 | Publication | |
| | PB01 | Publication | |
| | C10 | Entry into substantive examination | |
| | SE01 | Entry into force of request for substantive examination | |
| | C14 | Grant of patent or utility model | |
| | GR01 | Patent grant | |
| | CF01 | Termination of patent right due to non-payment of annual fee | |
| | CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20140625 |