Kim, 2019 - Google Patents
Exploration of foot-based interaction for menu control and virtual reality applications
- Document ID
- 9613082525076127959
- Author
- Kim T
- Publication year
- 2019
Snippet
Although hand-based interaction is dominant in the field of human-computer interaction, the hands can be occupied during various daily activities, such as driving, and may be unavailable for those with motor impairments. To support or replace some functions of a …
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
        - G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
          - G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
          - G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
          - G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
            - G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
      - G06F19/00—Digital computing or data processing equipment or methods, specially adapted for specific applications
        - G06F19/30—Medical informatics, i.e. computer-based analysis or dissemination of patient or disease data
          - G06F19/34—Computer-assisted medical diagnosis or treatment, e.g. computerised prescription or delivery of medication or diets, computerised local control of medical devices, medical expert systems or telemedicine
      - G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
  - G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    - G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
      - G09B23/00—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
        - G09B23/28—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
          - G09B23/285—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine for injections, endoscopy, bronchoscopy, sigmoidscopy, insertion of contraceptive devices or enemas
- A—HUMAN NECESSITIES
  - A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
    - A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
      - A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
        - A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
          - A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
            - A61B5/1116—Determining posture transitions
Similar Documents
| Publication | Title |
|---|---|
| Velloso et al. | The feet in human-computer interaction: A survey of foot-based interaction |
| Williams et al. | Evaluation of walking in place on a Wii Balance Board to explore a virtual environment |
| Lin et al. | Haptic rendering: foundations, algorithms, and applications |
| Pai et al. | Armswing: Using arm swings for accessible and immersive navigation in AR/VR spaces |
| Bérard et al. | Did “Minority Report” get it wrong? Superiority of the mouse over 3D input devices in a 3D placement task |
| US8169401B2 (en) | Haptic interface |
| Otaran et al. | Haptic ankle platform for interactive walking in virtual reality |
| Ushiyama et al. | Feetthrough: electrotactile foot interface that preserves real-world sensations |
| JP2019512299A (en) | Apparatus and method for movement tracking and simulation |
| Feintuch et al. | Integrating haptic-tactile feedback into a video-capture-based virtual environment for rehabilitation |
| Kölsch et al. | The postural comfort zone for reaching gestures |
| Saunders et al. | The performance of indirect foot pointing using discrete taps and kicks while standing |
| Xavier et al. | Pseudo-haptics survey: Human-computer interaction in extended reality and teleoperation |
| Durlach et al. | Effect of variations in sensory feedback on performance in a virtual reaching task |
| Levin et al. | Validity of virtual reality environments for sensorimotor rehabilitation |
| Kato et al. | Force rendering and its evaluation of a friction-based walking sensation display for a seated user |
| Kim et al. | Pressure or movement? Usability of multi-functional foot-based interfaces |
| Kim | Exploration of foot-based interaction for menu control and virtual reality applications |
| Balderas et al. | A makerspace foot pedal and shoe add-on for seated virtual reality locomotion |
| Kim et al. | Usability of foot-based interaction techniques for mobile solutions |
| Visell et al. | Contact sensing and interaction techniques for a distributed, multimodal floor display |
| Coe et al. | Generating localized haptic feedback over a spherical surface |
| Chen et al. | Dynamic touch-enabled virtual palpation |
| Wang et al. | Virtueledent: A compact XR tooth-cutting training system using a physical EMR-based dental handpiece and teeth model |
| Iwata et al. | Array force display for hardness distribution |