[SCA 2022] Motion Matching to Animate Avatars for Consumer-Grade VR Devices

Jose Luis Ponton
Project Webpage: https://upc-virvig.github.io/MMVR

The animation of user avatars plays a crucial role in conveying their pose, gestures, and relative distances to virtual objects or other users. Consumer-grade VR devices typically include three trackers: the Head Mounted Display (HMD) and two handheld VR controllers. Since the problem of reconstructing the user pose from such sparse data is ill-defined, especially for the lower body, the approach adopted by most VR games consists of assuming the body orientation matches that of the HMD, and applying animation blending and time-warping from a reduced set of animations. Unfortunately, this approach produces noticeable mismatches between user and avatar movements. In this work we present a new approach to animate user avatars for current mainstream VR devices. First, we use a neural network to estimate the user’s body orientation based on the tracking information from the HMD and the hand controllers. Then we use this orientation together with the velocity and rotation of the HMD to build a feature vector that feeds a Motion Matching algorithm. We built a MoCap database with animations of VR users wearing an HMD and used it to test our approach on both self-avatars and other users’ avatars. Our results show that our system can provide a large variety of lower body animations while correctly matching the user orientation, which in turn allows us to represent not only forward movements but also stepping in any direction.
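As a rough illustration of the pipeline described in the abstract, the sketch below shows how a query feature vector (predicted body orientation plus HMD velocity and rotation speed) could drive a brute-force Motion Matching search. This is a minimal, hypothetical example: the feature layout, dimensions, weights, and database format are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical Motion Matching query sketch; feature layout and weights are assumed.
import numpy as np

def build_query_feature(body_orientation, hmd_velocity, hmd_angular_velocity):
    """Concatenate the network-predicted body orientation with the HMD linear
    velocity and angular velocity into a single query vector."""
    return np.concatenate([body_orientation, hmd_velocity, hmd_angular_velocity])

def motion_matching_search(query, feature_db, weights):
    """Return the index of the database frame whose feature vector is closest
    to the query under a weighted squared Euclidean distance (brute force)."""
    diff = (feature_db - query) * weights      # (N, D) weighted residuals
    cost = np.einsum("nd,nd->n", diff, diff)   # squared distance per frame
    return int(np.argmin(cost))

# Example usage with made-up values and a random feature database of N frames:
rng = np.random.default_rng(0)
feature_db = rng.standard_normal((1000, 8))    # N frames x D feature dimensions
weights = np.ones(8)                           # per-dimension importance weights
query = build_query_feature(
    body_orientation=np.array([0.0, 1.0]),     # 2D facing direction from the network
    hmd_velocity=np.array([0.1, 0.0, 1.2]),    # HMD linear velocity (m/s)
    hmd_angular_velocity=np.array([0.0, 0.3, 0.0]),
)
best_frame = motion_matching_search(query, feature_db, weights)
```

In a real system the matched frame would then be blended into the currently playing clip; the point of the sketch is only how the estimated body orientation and HMD motion could form the query.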