Funk et al., 2020 - Google Patents
Sonification of facial actions for musical expression
- Document ID
- 7195529590219701191
- Author
- Funk M
- Kuwabara K
- Lyons M
- Publication year
- 2020
- Publication venue
- arXiv preprint arXiv:2010.03223
Snippet
The central role of the face in social interaction and non-verbal communication suggests we explore facial action as a means of musical expression. This paper presents the design, implementation, and preliminary studies of a novel system utilizing face detection and optic …
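As a rough illustration of the pipeline the snippet describes (not the authors' implementation), the sketch below uses OpenCV to detect a face, estimates dense optical flow inside the face region, and maps the mean flow magnitude to a MIDI-style 0-127 control value. The webcam source, the Haar cascade detector, the Farnebäck flow method, and the scaling factor are all assumptions made for the sketch.

```python
import cv2
import numpy as np

# Haar cascade shipped with opencv-python; path helper is cv2.data.haarcascades.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # assumed: default webcam
prev_gray = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)

    if len(faces) > 0 and prev_gray is not None:
        # Crop both frames to the current face box (simplification: this
        # frame's box is applied to the previous frame as well).
        x, y, w, h = faces[0]
        flow = cv2.calcOpticalFlowFarneback(
            prev_gray[y:y + h, x:x + w], gray[y:y + h, x:x + w],
            None, 0.5, 3, 15, 3, 5, 1.2, 0)
        # Mean flow magnitude ~ "how much the face is moving".
        motion = np.linalg.norm(flow, axis=2).mean()
        # Assumed scaling to a MIDI-style 0-127 controller value; a real
        # system would send this to a synthesizer instead of printing it.
        control = int(np.clip(motion * 32.0, 0, 127))
        print(f"facial-motion control value: {control}")

    prev_gray = gray
    cv2.imshow("face sonification sketch", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
        break

cap.release()
cv2.destroyAllWindows()
```

A per-region variant (mouth, eyebrows) could map different facial actions to different sound parameters; the whole-face magnitude used here is simply the minimal case.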
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
        - G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
          - G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G—PHYSICS
  - G10—MUSICAL INSTRUMENTS; ACOUSTICS
    - G10H—ELECTROPHONIC MUSICAL INSTRUMENTS
      - G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
        - G10H2210/031—Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
        - G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
          - G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
      - G06T13/00—Animation
        - G06T13/20—3D [Three Dimensional] animation
          - G06T13/40—3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
- G—PHYSICS
  - G10—MUSICAL INSTRUMENTS; ACOUSTICS
    - G10H—ELECTROPHONIC MUSICAL INSTRUMENTS
      - G10H1/00—Details of electrophonic musical instruments
        - G10H1/0008—Associated control or indicating means
Similar Documents
Publication | Publication Date | Title
---|---|---
Funk et al. | 2020 | Sonification of facial actions for musical expression
Lyons et al. | 2003 | Designing, Playing, and Performing with a Vision-Based Mouth Interface
Zhou et al. | | Dance and choreography in HCI: a two-decade retrospective
Camurri et al. | | Communicating expressiveness and affect in multimodal interactive systems
US7053915B1 (en) | | Method and system for enhancing virtual stage experience
US8654250B2 (en) | | Deriving visual rhythm from video signals
Livingstone et al. | | Facial expressions and emotional singing: A study of perception and production with motion capture and electromyography
Wöllner et al. | | The perception of prototypical motion: synchronization is enhanced with quantitatively morphed gestures of musical conductors
CN107464572B (en) | | Multi-mode interactive music perception system and control method thereof
Volpe et al. | | A system for embodied social active listening to sound and music content
Sparacino et al. | | Augmented performance in dance and theater
Camurri et al. | | The MEGA project: Analysis and synthesis of multisensory expressive gesture in performing art applications
Niewiadomski et al. | | Rhythmic body movements of laughter
Colling et al. | | Music, action, and affect
Glowinski et al. | | Evaluating music performance and context-sensitivity with Immersive Virtual Environments
Mancini et al. | | A virtual head driven by music expressivity
Fabiani et al. | | Systems for interactive control of computer generated music performance
Kim et al. | | Perceptually motivated automatic dance motion generation for music
Takehana et al. | | Audiovisual synchrony perception in observing human motion to music
Valenti et al. | | Facial expression recognition as a creative interface
Bevacqua et al. | | Real and virtual body percussionists interaction
Baltazar et al. | | Zatlab: A gesture analysis system to music interaction
Woolford et al. | | Particulate matters: generating particle flows from human movement
Lee et al. | | Express it! An Interactive System for Visualizing Expressiveness of Conductor's Gestures
Landry et al. | | A broad spectrum of sonic interactions at immersive interactive sonification platform (iISoP)