Khazaei, 2024 - Google Patents
Real-Time Gesture-Based Sound Control System
- Document ID: 1409865849213341413
- Author: Khazaei M
- Publication year: 2024
Snippet
This thesis presents a real-time, human-in-the-loop music control and manipulation system that dynamically adapts audio outputs based on the analysis of human movement captured via live-stream video. This project creates a responsive link between visual and auditory …
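The snippet describes a pipeline that maps tracked body movement to audio parameters frame by frame. A minimal, purely illustrative sketch of that mapping stage is below; it assumes a normalized hand coordinate arrives per video frame (e.g. from a pose estimator), smooths it to damp jitter, and converts it to a MIDI-style control value. The function names, smoothing factor, and MIDI target are assumptions for illustration, not the thesis's actual implementation.

```python
# Hypothetical sketch of a gesture-to-sound mapping stage.
# Input: a normalized coordinate in [0, 1] per video frame
# (as a pose estimator would provide); output: a control value.

def smooth(prev: float, new: float, alpha: float = 0.3) -> float:
    """Exponential moving average to damp frame-to-frame jitter."""
    return alpha * new + (1.0 - alpha) * prev

def to_midi_cc(x: float) -> int:
    """Map a normalized coordinate in [0, 1] to a MIDI CC value in [0, 127]."""
    x = min(max(x, 0.0), 1.0)  # clamp out-of-range tracking noise
    return round(x * 127)

# Simulated hand x-positions from successive frames (illustrative data).
frames = [0.10, 0.12, 0.50, 0.52, 0.90]
state = frames[0]
controls = []
for x in frames:
    state = smooth(state, x)
    controls.append(to_midi_cc(state))
```

The smoothing step is the human-in-the-loop trade-off: a larger `alpha` makes the sound track the gesture more tightly at the cost of jitter, a smaller one lags but sounds steadier.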
Classifications (CPC, all under G—PHYSICS; G06—COMPUTING; CALCULATING; COUNTING)
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for entering handwritten data, e.g. gestures, text
- G06F3/03547—Touch pads, in which fingers can move on a surface
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06K9/00335—Recognising movements or behaviour, e.g. recognition of gestures, dynamic facial expressions; Lip-reading
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06N99/00—Subject matter not provided for in other groups of this subclass
- G06T13/40—3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
Similar Documents
| Publication | Title |
|---|---|
| Fiebrink | Real-time human interaction with supervised learning algorithms for music composition and performance |
| Wheatland et al. | State of the art in hand and finger modeling and animation |
| Françoise et al. | Motion-sound mapping through interaction: An approach to user-centered design of auditory feedback using machine learning |
| Gillian | Gesture recognition for musician computer interaction |
| Visi et al. | Interactive machine learning of musical gesture |
| Bernardo et al. | O soli mio: exploring millimeter wave radar for musical interaction |
| Di Donato et al. | Myo Mapper: a Myo armband to OSC mapper |
| Shen et al. | Application of human posture recognition and classification in performing arts education |
| Gillian et al. | A machine learning toolbox for musician computer interaction |
| Dalmazzo et al. | A machine learning approach to violin bow technique classification: a comparison between IMU and mocap systems |
| Fontana et al. | Walking with the Senses |
| Vyas et al. | Gesture recognition and control |
| Ma | Application of social entertainment robot based on sensor and edge computing in collaborative music performance |
| Overholt et al. | A multimodal system for gesture recognition in interactive music performance |
| Jiang et al. | Sensor based dance coherent action generation model using deep learning framework |
| Khazaei | Real-Time Gesture-Based Sound Control System |
| Côté-Allard et al. | Towards the use of consumer-grade electromyographic armbands for interactive, artistic robotics performances |
| Rhodes et al. | Classifying biometric data for musical interaction within virtual reality |
| Camurri et al. | Expressive gesture and multimodal interactive systems |
| Shetty et al. | Automatic Music Control Using Image Processing and MediaPipe |
| Mamodiya et al. | An Adaptive Human-Robot Interaction Framework Using Real-Time Emotion Recognition and Context-Aware Task Planning |
| Rose | Salto: a system for musical expression in the aerial arts |
| Khazaei et al. | A Real-Time Gesture-Based Control Framework |
| Baltazar et al. | Zatlab: A gesture analysis system to music interaction |
| Jessop | A gestural media framework: Tools for expressive gesture recognition and mapping in rehearsal and performance |