
WO2001054110A1 - Vision-Based Human Computer Interface System - Google Patents

Vision-Based Human Computer Interface System

Info

Publication number
WO2001054110A1
Authority
WO
WIPO (PCT)
Prior art keywords
interface
image
user
projective
computer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2001/001583
Other languages
English (en)
Inventor
Camillo J. Taylor
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Pennsylvania
Original Assignee
University of Pennsylvania
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Pennsylvania
Priority to AU2001229572A1
Publication of WO2001054110A1
Anticipated expiration
Current legal status: Ceased

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3191Testing thereof
    • H04N9/3194Testing thereof including sensor feedback

Definitions

  • This invention relates generally to the field of vision-based human computer interfaces. Vision-based interface ideas were proposed by Krueger (Artificial Reality II).
  • the present invention embodies a novel system, method for its use, and article by which a user interacts with a vision-based human computer, in which traditional input and output devices, e.g., monitors, keyboards and mice, are replaced with augmented reality displays, projection systems and cameras.
  • User input is accomplished by projecting an image of the interface onto a flat surface, which is monitored with a video camera.
  • The relationship between the three surfaces of interest (the work surface, the virtual keyboard and the image obtained by the camera) is characterized by projective transformations of RP^2. This observation leads to a fast and accurate online calibration algorithm.
  • imaging of the interface display interactively comprises a standard personal computer system; a projector attached to a VGA output port; and an image capturing system.
  • The image capture system and interface surface interact in an augmented reality display, wherein the projective transformations are computed from projective transformations of the real projective plane RP^2, based upon a set of fiducial markings on the interface surface.
  • Imaging of the interface display interactively comprises a computer-readable signal-bearing medium; means in the medium for specifying a virtual user interface without physical instantiation; and means in the medium for characterizing the relationship between the work surface, the virtual keyboard and the projected image of the virtual keyboard by projective transformations of the real projective plane RP^2.
  • imaging of the interface display further comprises means in the medium for projecting an image, wherein said projector is attached to a VGA output port; and means for capturing said projected image.
  • The means for image capture and the interface surface interact in an augmented reality display, wherein the projective transformations are computed from projective transformations of the real projective plane RP^2, based upon a set of fiducial markings on the interface surface.
  • The projective transformations are computed from at least four distinct, non-collinear point correspondences between the frame and image buffers.
  • Interface characteristics, such as size, color, position and layout, are highly flexible and subject to reconfiguration by the user.
  • Substantially any smooth, flat surface onto which a projected image can be visualized may be used as the interface surface.
  • An advantage of the vision based interaction technique of the invention is that it requires no mechanical input devices, such as keyboards, mice or touch screens. There are no moving parts and no wires to connect to the interface surface. By avoiding a physical instantiation of the interface, a level of abstraction is gained which can be exploited in a number of ways. The system designer is given the flexibility to specify the layout and action of the user interface entirely in software, without being constrained by a fixed mechanical interface. Thus, interfaces can be customized to the requirements and capabilities of individual users.
  • The article and system are very amenable to miniaturization, thereby permitting interesting applications in the field of wearable computer systems.
  • the same article and system can be scaled up or down to very large or very small interfaces, a degree of flexibility that cannot be matched by monitor based systems, which are restricted to a fixed size.
  • FIG. 1 is a schematic diagram of the components of the projector based interface scheme.
  • FIG. 2 is a block diagram of the projector based interface system.
  • FIGs. 3A and 3B depict images of the virtual keyboard.
  • FIG. 3A depicts the frame buffer containing the image of the virtual calculator keypad that is projected onto the interface surface.
  • FIG. 3B depicts the image of the interface surface acquired with the video camera.
  • FIG. 4 is a diagram showing how projective transformations relate corresponding points on the virtual keyboard with those on the interface surface and the image buffer.
  • FIG. 5 is a block diagram of the augmented reality interface system.
  • FIGs. 6A-6F depict the images acquired by the video camera (FIGs. 6A, 6B and 6C) and the corresponding augmented reality displays produced by the system (FIGs. 6D, 6E and 6F).
  • the user is able to select one of the three shapes for display by "pressing" the corresponding button.
  • a square shape is depicted in FIGs. 6A and 6D; a cross shape is depicted in FIGs. 6B and 6E, and a triangle shape is depicted in FIGs. 6C and 6F.
  • the present invention provides systems and articles, and method for using same, by which techniques related to computer vision and augmented reality are employed to develop novel vision-based human computer interfaces with significantly greater flexibility and functionality, in which traditional input and output devices, monitors, keyboards and mice, are replaced with augmented reality displays, projection systems and cameras.
  • The invention exploits the fact that the relationship between the three surfaces of interest (the work surface, the virtual keyboard and the image obtained by the camera) can be characterized by projective transformations of RP^2.
  • This observation leads to a fast and accurate online calibration algorithm.
  • the availability of such a real-time, online calibration scheme opens the way for the use of augmented reality displays, in which the image of the interface is composited with the video imagery.
  • The calibration system is used to compensate for changes in the relationships between the camera and the interface surface of the type that occurs when either the camera or the interface surface is moved.
  • commonly available graphics accelerators are used to expedite some of the image manipulation operations required by the present interface scheme, so that real-time performance can be achieved on standard PCs.
  • the systems or articles provide flexibility to the designer allowing the layout and action of the user interface to be specified entirely in software, without being constrained by a fixed mechanical interface. This flexibility permits the interfaces to be customized to the requirements and capabilities of the individual user. Just as a graphical user interface can be programmed to present a number of different interfaces on the same computer, the present invention permits the user to arbitrarily reconfigure the interface.
  • the size, color, position and layout of the interface elements can all be changed in software to reflect individual needs and tastes. Different interfaces are employed for different tasks, in the same way that different GUI's are presented for different programs.
  • The invention permits interfaces to be individually developed for users who suffer from repetitive stress disorders, such as carpal tunnel syndrome, which may be caused or exacerbated by the inflexible arrangement of standard interface devices.
  • the scheme is very amenable to either scale-up or scale-down to very large or very small interfaces, providing a degree of flexibility that cannot be matched by monitor based systems, which are restricted to a fixed size.
  • the interface scheme is particularly amenable to miniaturization, which makes possible a variety of interesting applications in the field of wearable computer systems.
  • an image of a calculator keypad can be projected onto a screen as diagramed in FIG. 1. Then by analyzing the images of the screen acquired with a video camera, the system is able to determine what the user is indicating on the virtual keyboard, as depicted in FIG. 2, and to respond appropriately.
  • a system is outlined schematically in FIG. 3. Effectively, the projector and camera systems acting in concert form a feedback system in which user interaction is effected by occluding various parts of the projected image.
  • the system presents to the user an augmented reality display in which an image of the virtual keyboard is overlaid onto a region of the interface surface. A block diagram of the augmented reality interface system is presented in FIG. 5.
  • A miniaturized camera could be mounted on the glasses in such a way that the video imagery closely approximates the user's viewpoint. Inventories could be quickly assessed while walking through a store. Ultimately, fully functional computers with sophisticated interface capabilities will be small enough to easily carry in a pocket.
  • The projection-based Virtual Keyboard system has already been adapted and tested by the inventor's laboratory for use on a 'smart wheelchair.'
  • the interface allowed the user to make selections from virtual buttons projected onto the tray on the wheelchair.
  • the interface surface can be any substantially smooth, flat surface, preferably white or nearly white in color. If the surface can be used to clearly view a projected photograph or overhead transparency, it can also be successfully used for the images related to the present invention.
  • While variations can be managed by corresponding calibration algorithms, which may be developed by one of ordinary skill in the art without undue experimentation, less variation in the interface surface will permit more reliable projective transformations.
  • The more reliable the projected image on the interface surface, the more reliably the position of each point on the screen can be related to the coordinates of its projection in the video image.
  • the interface surface could be a simple, smooth sheet of white paper, such as the one shown in FIG. 6, comprising a pattern of fiducial marks that are readily recognized and tracked in the video imagery.
  • the user is presented with an interface to the computing system, like the one shown in FIG. 6, whenever he/she looks at the interface surface.
  • the virtual keyboard appears registered to the interface surface, creating the illusion of a keypad without the need for a physical interface.
  • One advantage of such a scheme is that the entire input/output system is contained on the headset, thereby eliminating the need for bulky keyboards, display systems, cables or wiring. Consequently, the computer itself can be miniaturized to the size of a cellular telephone.
  • modulated light as an input mechanism is that the system is essentially independent of scale.
  • the interface can be made as large or as small as needed, by simply changing the relative positions of (distance between) the projector and the interface surface.
  • the same system can be used to place the interface on a sheet of paper, on the surface of a drafting table, or on an entire wall.
  • This capacity is particularly useful and advantageous in immersive virtual reality environments, because a designer would be able to place interface elements on any convenient, suitable surface in the environment.
  • Virtual light switches can be projected onto walls, virtual telephone keypads can be projected onto table tops, and virtual displays can be projected onto desktops.
  • One of the interesting advantages presented by the embodied interface scheme is the ability to leverage the considerable bandwidth available in the video signal to implement more sophisticated interfaces than are currently possible with a traditional keyboard and mouse system.
  • the typical computer keyboard contains approximately 100 keys, but only one keycode can be transmitted to the computer at a time.
  • a single video image contains roughly a quarter of a million pixels, and all of the intensity measurements are acquired in parallel.
  • one approach to exploiting this bandwidth is by designing interfaces in which the user is presented with a large variety of symbols from which to select by occluding different combinations of regions on the interface, or even on each 'button' itself.
  • a virtual keyboard containing 10 key regions (one for each finger), wherein the user would be able to select from 1024 different symbols by covering and uncovering various, different combinations of keys.
  • This could be visualized in terms of the interface presented by the keyboard of an organ, from which a musician is able to invoke a wide range of sounds by depressing different sets of keys, which are not mutually exclusive of one another.
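As a purely illustrative sketch of this chording idea, the occlusion state of the ten key regions can be treated as a 10-bit number, giving 2^10 = 1024 distinct selectable symbols (indices 0 through 1023). The function name and key ordering below are invented for the example and are not from the patent:

```python
def chord_index(occluded):
    """Encode the occlusion state of 10 key regions as a 10-bit symbol index."""
    index = 0
    for bit, covered in enumerate(occluded):
        if covered:
            index |= 1 << bit
    return index

assert chord_index([False] * 10) == 0           # nothing covered
assert chord_index([True] + [False] * 9) == 1   # only the first key covered
assert chord_index([True] * 10) == 1023         # all ten keys covered
```

A real system would map these indices onto a vocabulary of symbols, much as the organ keyboard analogy above maps key combinations onto sounds.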
  • Such a human computer interaction system might be particularly useful to persons with a limited range of motion, since subtle variations in the pattern of occlusion on a virtual keyboard caused by small motions, can be indexed into a vocabulary of thousands of symbols.
  • vision based human computer interaction systems and articles are presented in which the user indicates his/her intention by occluding or disoccluding portions of an interface surface, as exemplified by the following prototype systems, one of which uses a standard computer projection system and another which presents an augmented reality display to the user.
  • Example 1 Projection-Based System to Display an Image of the Interface.
  • the setup for the prototype implementation of the first preferred embodiment of the vision-based interaction system comprises 3 primary elements to display an image of the interface to the user: (i) a standard personal computer system, (ii) a projector, which is attached to the VGA output port, and (iii) an image capture system.
  • FIG. 2 is a block diagram of the computational system that underlies the projector based user interface scheme.
  • FIG. 3 depicts the image of a virtual calculator keypad.
  • In FIG. 3A, the interface stored in the frame buffer is projected onto the screen.
  • In FIG. 3B, the image of the screen is acquired with a video camera. Coordinate frames of reference can be attached to the frame buffer, the image buffer and/or the interface surface in the usual manner.
  • H is computed from the four point correspondences using standard techniques, e.g., Faugeras, Three-Dimensional Computer Vision, MIT Press, 1993.
  • This matrix H has the property that p_i ~ H e_i for all i.
  • Given two sets of points, p_i and q_i, a projective transformation H can be constructed which maps p_i onto q_i by constructing the projective transformations, H_1 and H_2, that map the standard basis onto the p_i and q_i, respectively, and then composing the transformations as follows: H = H_2 H_1^-1.
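The construction just described can be sketched in NumPy. This is an illustrative reimplementation, not the patent's code: points are homogeneous 3-vectors, `basis_to_points` plays the role of H_1 and H_2 (it maps the standard projective basis e_1, e_2, e_3, e_4 = (1,1,1) onto four given points), and the example correspondences are invented:

```python
import numpy as np

def basis_to_points(pts):
    """Homography sending the standard projective basis e1, e2, e3, e4=(1,1,1)
    onto the four given homogeneous points (the H_1 / H_2 of the text)."""
    p1, p2, p3, p4 = [np.asarray(p, dtype=float) for p in pts]
    A = np.column_stack([p1, p2, p3])
    lam = np.linalg.solve(A, p4)   # scales so that p4 = l1*p1 + l2*p2 + l3*p3
    return A * lam                 # columns [l1*p1, l2*p2, l3*p3]

def homography(src, dst):
    """H mapping each src point onto the corresponding dst point, up to scale."""
    return basis_to_points(dst) @ np.linalg.inv(basis_to_points(src))

# Invented example: frame-buffer corners onto a skewed quadrilateral.
src = [(0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1)]
dst = [(0, 0, 1), (2, 0.1, 1), (2.2, 1.9, 1), (-0.1, 2, 1)]
H = homography(src, dst)
for s, d in zip(src, dst):
    m = H @ np.asarray(s, dtype=float)
    assert np.allclose(m / m[2], np.asarray(d, dtype=float))
```

The final loop checks the defining property: each source point is carried onto its correspondent once the homogeneous scale is divided out.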
  • This projective transformation can be computed without any a priori knowledge of the intrinsic parameters of the camera, or of the geometric relationships between the projector, the interface surface and the camera.
  • the system can be made to calibrate itself automatically by projecting fixed patterns on the screen, which can be recognized and localized in the imagery acquired with the video camera. If fiducial marks on the interface surface are available, one can compute the projective transformation between the image buffer and the interface surface in a similar manner.
  • This projective transformation can be used to apply a 'keystoning correction' to the frame buffer, so that the projection of the virtual keyboard is properly aligned with the interface surface. In the OpenGL graphics pipeline, this keystoning correction can be implemented quite efficiently by manipulating the projection and viewing matrices appropriately.
  • the estimate for the projective transformation between the frame buffer containing the image of the interface and the image buffer is used to apply a projective rectification to the video imagery obtained by the camera.
  • This projective rectification is accomplished, for example, quite expeditiously by utilizing the known texture mapping capabilities of modern graphics accelerators.
  • These systems can be programmed to perform arbitrary projective transformations on the image buffer at frame rate without burdening the system CPU.
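For exposition only, the projective rectification can also be written directly in NumPy as a nearest-neighbor inverse warp. The system described here instead offloads this to graphics-accelerator texture mapping; in this sketch, H is taken (by assumption) to map rectified output pixel coordinates to source image coordinates:

```python
import numpy as np

def rectify(image, H, out_shape):
    """Inverse-warp: for every pixel of the rectified output, H gives the
    corresponding (homogeneous) source coordinate in the camera image."""
    h, w = out_shape
    ys, xs = np.mgrid[0:h, 0:w]
    pts = np.stack([xs, ys, np.ones_like(xs)]).reshape(3, -1).astype(float)
    src = H @ pts
    sx = np.rint(src[0] / src[2]).astype(int)   # nearest-neighbor sampling
    sy = np.rint(src[1] / src[2]).astype(int)
    valid = (0 <= sx) & (sx < image.shape[1]) & (0 <= sy) & (sy < image.shape[0])
    out = np.zeros(out_shape, dtype=image.dtype)
    out.ravel()[valid] = image[sy[valid], sx[valid]]
    return out

img = np.arange(16, dtype=np.uint8).reshape(4, 4)
assert np.array_equal(rectify(img, np.eye(3), (4, 4)), img)  # identity H is a no-op
```

Pixels whose source coordinate falls outside the camera image are left at zero; a hardware implementation would handle this with texture clamping.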
  • the system detects the user's interaction with the surface by analyzing the differences between the image of the virtual keyboard in its frame buffer and the rectified image.
  • the system constructs a mapping between color or intensity values projected onto the interface surface and the corresponding color or intensity value that is measured by the camera system. From this mapping the system is able to determine when a particular pixel in the rectified image differs significantly from the expected color or intensity value.
  • the end result of this analysis is a binary image where pixels that differ significantly from their expected values are marked with a 1.
  • this binary image is constructed by simply computing the difference between the current image and a fixed background image.
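A minimal sketch of this thresholded difference image, with an invented threshold value, might look like:

```python
import numpy as np

def occlusion_mask(current, expected, threshold=30):
    """Binary image: 1 where the measured intensity differs significantly
    from the expected (projected) intensity, 0 elsewhere. The threshold
    value is an illustrative assumption, not a value from the text."""
    diff = np.abs(current.astype(int) - expected.astype(int))
    return (diff > threshold).astype(np.uint8)

expected = np.full((4, 4), 200, dtype=np.uint8)   # bright projected keypad
current = expected.copy()
current[1:3, 1:3] = 40                             # dark occluding fingertip
mask = occlusion_mask(current, expected)
assert mask.sum() == 4                             # the four occluded pixels
```

The same function covers the fixed-background variant: `expected` is then simply a stored background frame rather than a per-pixel predicted intensity.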
  • the system interprets the user's intent by analyzing the pattern of occlusion in the image.
  • In the Virtual Calculator system (the embodiment wherein the virtual keyboard is a calculator keypad, as shown in FIG. 3), each of the virtual keys is divided into two regions as shown. A "keypress" is detected when the central region is sufficiently occluded while the peripheral region is left untouched. This scheme allows the system to distinguish the situation in which the user is simply reaching over one key to point to another, since in that case both regions of the key being reached over will be fully or partially occluded.
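The two-region keypress test can be sketched as follows. The occlusion-fraction thresholds (70% for the central region, 10% for the peripheral region) and the rectangular region layout are assumptions for illustration, not values from the text:

```python
import numpy as np

def key_pressed(mask, central, peripheral, hi=0.7, lo=0.1):
    """A 'keypress': the central region is mostly occluded while the
    peripheral region is mostly clear (thresholds are illustrative)."""
    return mask[central].mean() >= hi and mask[peripheral].mean() <= lo

central = (slice(3, 7), slice(3, 7))       # inner pad of one virtual key
peripheral = (slice(0, 2), slice(0, 10))   # simplified outer band of the key
mask = np.zeros((10, 10), dtype=np.uint8)
mask[central] = 1
assert key_pressed(mask, central, peripheral)   # fingertip covers the center only
mask[peripheral] = 1                            # arm reaching across the key
assert not key_pressed(mask, central, peripheral)
```

The second case captures the reach-over situation described above: both regions are occluded, so no keypress is reported.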
  • the system could, as an alternative embodiment, be made to adapt to changes in the ambient lighting conditions. In another alternative, it could also employ other cues, such as shape and motion, to improve the detection of the user's hand position.
  • Example 2 Augmented Reality System for Presenting an Image of the Interface
  • A block diagram of the augmented reality system is provided in FIG. 5. This system is similar to the system described in Example 1, except that the projector has been replaced by an augmented reality display.
  • the relationship between the camera and the interface surface is allowed to change over time. This means that for every image in the video sequence the system must recompute the projective transformation between the interface surface and the image buffer.
  • One way in which this can be accomplished is by tracking the position of a set of fiducial markings on the interface surface in the video imagery, and performing the computation described above in Equations 3, 4 and 5, wherein projective transformations are computed from point correspondences. Once the projective transformation has been calculated, it is used to produce an augmented reality display where an image of the virtual keyboard is composited with the video image, so that the interface appears in the correct position on the video image.
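The compositing step can be sketched as a simple alpha blend, assuming the keyboard image has already been warped into the video frame's coordinates; the blending factor and non-zero test are invented example choices:

```python
import numpy as np

def overlay(video_frame, warped_keyboard, alpha=0.6):
    """Blend the warped keyboard image onto the video frame wherever the
    keyboard image is non-zero, so the interface appears registered
    to the interface surface in the augmented display."""
    out = video_frame.astype(float)
    region = warped_keyboard > 0
    out[region] = (1 - alpha) * out[region] + alpha * warped_keyboard[region]
    return np.rint(out).astype(video_frame.dtype)

frame = np.full((4, 4), 100, dtype=np.uint8)   # stand-in video frame
warped = np.zeros((4, 4), dtype=np.uint8)
warped[0, 0] = 255                             # one projected 'key' pixel
blended = overlay(frame, warped)
```

Treating zero-valued keyboard pixels as transparent keeps the video imagery visible everywhere outside the virtual keyboard.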
  • FIGs. 6A through F present the images obtained with the video camera.
  • FIGs. 6D-6F present the augmented reality displays provided to the user.
  • the user is given the opportunity to select one of the three shapes for display (a square in FIGs. 6A and 6D; a cross in FIGs. 6B and 6E; or a triangle in FIGs. 6C and 6F) by "pressing" the corresponding button.
  • the projective transformation is also used to apply a projective rectification to the region of the video imagery that corresponds to the interface surface. Then, the rectified image is analyzed to determine the user's interaction. Note that in this case, the task of computing a binary image, which indicates that portions of the interface are occluded, is straightforward since it simply amounts to locating dark objects against a lighter background. Such a task is well within the capability of one skilled in the art using standard modern computer vision techniques.
  • the user's intention can be inferred by analyzing the pattern of occlusion.
  • In the system shown in FIG. 6, the system effectively detected which buttons were "pressed," i.e., which shapes were selected, based on which shape was occluded, and the appropriate pattern was then displayed in the selection box, demonstrating the present vision-based human computer interaction system to be useful, reliable and effective.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention concerns a novel approach to vision-based human/machine interaction in which traditional input and output devices, e.g., monitors, keyboards, touch screens, and mice, are replaced by augmented reality displays, projection systems (projectors), and cameras. User input is accomplished by projecting an image of the interface onto a flat surface (the interface surface), which is monitored with a video camera. The relationship between the three surfaces of interest, i.e., the work surface, the virtual keyboard, and the image obtained by the camera, can be characterized by projective transformations of RP^2, which lead to a fast and accurate online calibration algorithm.
PCT/US2001/001583 2000-01-18 2001-01-18 Vision-based human computer interface system Ceased WO2001054110A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2001229572A AU2001229572A1 (en) 2000-01-18 2001-01-18 Vision-based human computer interface system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US17653400P 2000-01-18 2000-01-18
US60/176,534 2000-01-18

Publications (1)

Publication Number Publication Date
WO2001054110A1 true WO2001054110A1 (fr) 2001-07-26

Family

ID=22644738

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2001/001583 Ceased WO2001054110A1 (fr) 2000-01-18 2001-01-18 Vision-based human computer interface system

Country Status (2)

Country Link
AU (1) AU2001229572A1 (fr)
WO (1) WO2001054110A1 (fr)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003051045A1 (fr) * 2001-12-10 2003-06-19 Mitsubishi Denki Kabushiki Kaisha Method for calibrating a projector and a camera
WO2004070485A1 (fr) * 2003-02-03 2004-08-19 Siemens Aktiengesellschaft Projection of synthetic information
EP1369769A3 (fr) * 2002-06-06 2006-05-17 Siemens Corporate Research, Inc. System and method for measuring the registration accuracy of an augmented reality system
US7242388B2 (en) 2001-01-08 2007-07-10 Vkb Inc. Data input device
DE102005001417B4 (de) * 2004-01-29 2009-06-25 Heidelberger Druckmaschinen Ag Projektionsflächenabhängige Anzeige-/Bedienvorrichtung
US7755608B2 (en) 2004-01-23 2010-07-13 Hewlett-Packard Development Company, L.P. Systems and methods of interfacing with a machine
US8037414B2 (en) 2006-09-14 2011-10-11 Avaya Inc. Audible computer user interface method and apparatus
WO2012005438A3 (fr) * 2010-07-09 2012-03-01 (주)디스트릭트홀딩스 Method and system for displaying multimedia advertising using a touch screen and a projector
CN103106665A (zh) * 2011-11-11 2013-05-15 周建龙 Method for automatically tracking a moving object in a spatial augmented reality system
WO2014101955A1 (fr) * 2012-12-28 2014-07-03 Metaio Gmbh Method and system for projecting digital information on a real object in a real environment
US10057730B2 (en) 2015-05-28 2018-08-21 Motorola Solutions, Inc. Virtual push-to-talk button

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5528263A (en) * 1994-06-15 1996-06-18 Daniel M. Platzker Interactive projected video image display system
US6005547A (en) * 1995-10-14 1999-12-21 Xerox Corporation Calibration of an interactive desktop system
US6147678A (en) * 1998-12-09 2000-11-14 Lucent Technologies Inc. Video hand image-three-dimensional computer interface with multiple degrees of freedom
US6198485B1 (en) * 1998-07-29 2001-03-06 Intel Corporation Method and apparatus for three-dimensional input entry

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5528263A (en) * 1994-06-15 1996-06-18 Daniel M. Platzker Interactive projected video image display system
US6005547A (en) * 1995-10-14 1999-12-21 Xerox Corporation Calibration of an interactive desktop system
US6198485B1 (en) * 1998-07-29 2001-03-06 Intel Corporation Method and apparatus for three-dimensional input entry
US6147678A (en) * 1998-12-09 2000-11-14 Lucent Technologies Inc. Video hand image-three-dimensional computer interface with multiple degrees of freedom

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7893924B2 (en) 2001-01-08 2011-02-22 Vkb Inc. Data input device
US7242388B2 (en) 2001-01-08 2007-07-10 Vkb Inc. Data input device
WO2003051045A1 (fr) * 2001-12-10 2003-06-19 Mitsubishi Denki Kabushiki Kaisha Method for calibrating a projector and a camera
EP1369769A3 (fr) * 2002-06-06 2006-05-17 Siemens Corporate Research, Inc. System and method for measuring the registration accuracy of an augmented reality system
US7377650B2 (en) 2003-02-03 2008-05-27 Siemens Aktiengesellschaft Projection of synthetic information
WO2004070485A1 (fr) * 2003-02-03 2004-08-19 Siemens Aktiengesellschaft Projection of synthetic information
US7755608B2 (en) 2004-01-23 2010-07-13 Hewlett-Packard Development Company, L.P. Systems and methods of interfacing with a machine
DE102005001417B4 (de) * 2004-01-29 2009-06-25 Heidelberger Druckmaschinen Ag Projektionsflächenabhängige Anzeige-/Bedienvorrichtung
US7860587B2 (en) 2004-01-29 2010-12-28 Heidelberger Druckmaschinen Ag Projection-area dependent display/operating device
US8037414B2 (en) 2006-09-14 2011-10-11 Avaya Inc. Audible computer user interface method and apparatus
WO2012005438A3 (fr) * 2010-07-09 2012-03-01 (주)디스트릭트홀딩스 Method and system for displaying multimedia advertising using a touch screen and a projector
CN103106665A (zh) * 2011-11-11 2013-05-15 周建龙 Method for automatically tracking a moving object in a spatial augmented reality system
WO2014101955A1 (fr) * 2012-12-28 2014-07-03 Metaio Gmbh Method and system for projecting digital information on a real object in a real environment
US10819962B2 (en) 2012-12-28 2020-10-27 Apple Inc. Method of and system for projecting digital information on a real object in a real environment
US11652965B2 (en) 2012-12-28 2023-05-16 Apple Inc. Method of and system for projecting digital information on a real object in a real environment
US10057730B2 (en) 2015-05-28 2018-08-21 Motorola Solutions, Inc. Virtual push-to-talk button

Also Published As

Publication number Publication date
AU2001229572A1 (en) 2001-07-31

Similar Documents

Publication Publication Date Title
Zhang et al. Visual panel: virtual mouse, keyboard and 3D controller with an ordinary piece of paper
CN110753898B (zh) Hover-based user interaction with virtual objects within an immersive environment
US6803928B2 (en) Extended virtual table: an optical extension for table-like projection systems
Hachet et al. A camera-based interface for interaction with mobile handheld computers
US7755608B2 (en) Systems and methods of interfacing with a machine
CN116724285A (zh) Micro-gestures for controlling virtual and graphical elements
US20100128112A1 (en) Immersive display system for interacting with three-dimensional content
CN114080585A (zh) Virtual user interface using a peripheral device in an artificial reality environment
US20150062004A1 (en) Method and System Enabling Natural User Interface Gestures with an Electronic System
CN120469584A (zh) Methods for manipulating virtual objects
WO2024226681A1 (fr) Methods for displaying and repositioning objects in an environment
JP2007527573A (ja) Apparatus for an optical input device and method therefor
CN114138106B (zh) Transitions between states in a hybrid virtual reality desktop computing environment
WO1999040562A1 (fr) Touch screen system for a computer connected to a video camera
Kim et al. Interaction with hand gesture for a back-projection wall
US11049306B2 (en) Display apparatus and method for generating and rendering composite images
CN112657176A (zh) Binocular projection human-computer interaction method incorporating portrait behavior information
WO2001054110A1 (fr) Vision-based human computer interface system
JP2025104251A (ja) Method for realizing typing or touch with a realistic feel
JP2025131492A (ja) Virtual touch method using a three-dimensional cursor, storage medium and chip
TW201913298A (zh) Virtual reality system capable of displaying a real-time image of a physical input device and control method thereof
Zhang Vision-based interaction with fingers and papers
Kemmoku et al. AR tabletop interface using a head-mounted projector
Taylor Virtual keyboards
Sato et al. Video-based tracking of user's motion for augmented desk interface

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU CA JP US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR

121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP