Authors: Xiaoming Chen, Zeke Zexi Hu, Guangxin Zhao, Haisheng Li, Vera Chung, Aaron Quigley
Publication date: 2024/1/31
Journal: IEEE Transactions on Visualization and Computer Graphics
Publisher: IEEE
Description
In cinematic VR applications, haptic feedback can significantly enhance the sense of reality and immersion for users. The increasing availability of emerging haptic devices opens up possibilities for future cinematic VR applications that allow users to receive haptic feedback while they are watching videos. However, automatically rendering haptic cues from real-time video content, particularly from video motion, is a technically challenging task. In this paper, we propose a novel framework called “Video2Haptics” that leverages the emerging bio-inspired event camera to capture event signals as a lightweight representation of video motion. We then propose efficient event-based visual processing methods to estimate force or intensity from video motion in the event domain, rather than the pixel domain. To demonstrate the application of Video2Haptics, we convert the estimated force or intensity to dynamic vibrotactile …
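To make the core idea concrete, here is a minimal sketch of how motion intensity might be estimated from an event stream rather than from pixel frames. This is not the paper's Video2Haptics algorithm; the `Event` record, the fixed-window binning, and the `max_rate` normalization are all illustrative assumptions, chosen only to show why event counts can serve as a lightweight proxy for video motion.

```python
from dataclasses import dataclass

@dataclass
class Event:
    """One event-camera event: a per-pixel brightness change (illustrative)."""
    x: int
    y: int
    polarity: int   # +1 brightness increase, -1 decrease
    t_us: int       # timestamp in microseconds

def intensity_from_events(events, window_us=10_000, max_rate=500):
    """Bin events into fixed time windows and map each window's event
    count to a vibration amplitude in [0.0, 1.0].

    More events per window means more scene motion, hence a stronger
    vibrotactile cue. Assumes `events` is sorted by timestamp.
    """
    if not events:
        return []
    start = events[0].t_us
    n_windows = (events[-1].t_us - start) // window_us + 1
    counts = [0] * n_windows
    for ev in events:
        counts[(ev.t_us - start) // window_us] += 1
    # Clamp to max_rate, then normalise so the output is a usable amplitude.
    return [min(c, max_rate) / max_rate for c in counts]
```

A stream of 300 evenly spaced events over 30 ms, for example, yields three windows of 100 events each, i.e. a constant amplitude of 0.2 under the defaults above.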