Real-time perception-level translation from audio signals to vibrotactile effects

Lee et al., 2013

Document ID
11149913301066308340
Author
Lee J
Choi S
Publication year
2013
Publication venue
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems

Snippet

In this paper, we propose a real-time perception-level audio-to-vibrotactile translation algorithm. Unlike previous signal-level conversion methods, our algorithm considers only perceptual characteristics, such as loudness and roughness, of audio and tactile stimuli …
Continue reading at dl.acm.org
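The snippet only names the perceptual features the algorithm relies on (loudness and roughness). As an illustration, the sketch below maps crude loudness and roughness proxies of a single audio frame to a vibration amplitude and frequency. The RMS-based loudness, the envelope-spread roughness proxy, and all mapping constants are assumptions made here for illustration; they stand in for the psychoacoustic models and perceptual mappings the paper actually uses.

```python
# Minimal sketch of a perception-level audio-to-vibrotactile mapping.
# Assumptions (not from the paper): loudness is approximated by frame RMS
# energy on a dB scale, and roughness by the spread of the amplitude
# envelope; both are crude stand-ins for proper psychoacoustic models.
import numpy as np

FRAME = 1024   # samples per analysis frame
FS = 44100     # audio sample rate (Hz)


def frame_features(frame: np.ndarray) -> tuple[float, float]:
    """Return crude (loudness, roughness) proxies for one audio frame."""
    rms = np.sqrt(np.mean(frame ** 2) + 1e-12)
    loudness = 20 * np.log10(rms) + 96          # rough dB-like loudness scale
    envelope = np.abs(frame)                    # instantaneous amplitude
    roughness = float(np.std(envelope))         # proxy for modulation depth
    return float(loudness), roughness


def to_vibrotactile(loudness: float, roughness: float) -> tuple[float, float]:
    """Map the two perceptual proxies to actuator amplitude (0-1) and frequency (Hz)."""
    amplitude = float(np.clip((loudness - 40) / 60, 0.0, 1.0))        # louder -> stronger
    frequency = 150 + 100 * float(np.clip(roughness * 10, 0.0, 1.0))  # rougher -> higher
    return amplitude, frequency


if __name__ == "__main__":
    # Example: a 1 kHz tone amplitude-modulated at 70 Hz, which sounds "rough".
    t = np.arange(FRAME) / FS
    audio = 0.5 * np.sin(2 * np.pi * 1000 * t) * (1 + 0.8 * np.sin(2 * np.pi * 70 * t))
    print(to_vibrotactile(*frame_features(audio)))
```

In a real-time setting this per-frame mapping would run on consecutive audio frames and stream the resulting amplitude/frequency pairs to a vibrotactile actuator; the specific mapping here is only a placeholder for the perception-level translation described in the paper.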

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016: Input arrangements with force or tactile feedback as computer generated output to the user

Similar Documents

Lee et al. Real-time perception-level translation from audio signals to vibrotactile effects
US10477202B2 (en) Method and apparatus for encoding and decoding haptic information in multi-media files
Turchet et al. Haptic feedback for enhancing realism of walking simulations
KR101641418B1 (en) Method for haptic signal generation based on auditory saliency and apparatus therefor
Larsson et al. The actor-observer effect in virtual reality presentations
Frid Accessible digital musical instruments - a survey of inclusive instruments
CN113318432B (en) Music control method in game, nonvolatile storage medium and electronic device
Mazzoni et al. How does it feel like? An exploratory study of a prototype system to convey emotion through haptic wearable devices
Yun et al. Generating real-time, selective, and multimodal haptic effects from sound for gaming experience enhancement
Jung et al. HapMotion: motion-to-tactile framework with wearable haptic devices for immersive VR performance experience
Li et al. Towards context-aware automatic haptic effect generation for home theatre environments
Aker et al. Effect of audio-tactile congruence on vibrotactile music enhancement
Okazaki et al. The effect of frequency shifting on audio–tactile conversion for enriching musical experience
Fanger et al. PIANX–A platform for piano players to alleviate music performance anxiety using mixed reality
Chelladurai et al. SoundHapticVR: Head-Based Spatial Haptic Feedback for Accessible Sounds in Virtual Reality for Deaf and Hard of Hearing Users
Kim et al. Sound-to-touch crossmodal pitch matching for short sounds
CN106507144A (en) A kind of choosing method of the background music based on spectators and system
Balandra et al. Haptic Music Player---Synthetic audio-tactile stimuli generation based on the notes' pitch and instruments' envelope mapping
Alma et al. Preliminary study of upper-body haptic feedback perception on cinematic experience
Bartles Mastering Electronic Dance Music: An Essential Guide for EDM Producers
Karam et al. Designing a model human cochlea: Issues and challenges in crossmodal audio-haptic displays
Ahn et al. Enhancing Video Experiences for DHH Individuals through Sound-Inspired Motion Caption-based Spatiotemporal Tacton
Sikström et al. Participatory amplitude level adjustment of gesture controlled upper body garment sound in immersive virtual reality
Saroka The Role of Sound in the Immersive Experience
Corporal Big Picture, Big Sound: Investigating Headphone and Loudspeaker Frameworks to Enhance Users' Sense of Presence and Emotional State in Virtual Reality