A Full Body Avatar-Based Telepresence System for Dissimilar Spaces
We present a novel mixed reality (MR) telepresence system that enables a local user and a remote user to interact with each other through full-body avatars in their own rooms. When the two rooms differ in size and furniture arrangement, directly applying a user's motion to an avatar leads to mismatches in placement and deictic gestures. To overcome this problem, we retarget a user's placement, arm gestures, and head movements to the avatar in the remote room so as to preserve the user's relation to the environment and the interaction context. This allows avatars to make use of real furniture and to interact with the local user and shared objects as if both users were in the same room. This paper describes our system's design and implementation in detail, along with a set of example scenarios in a living room and an office. A qualitative user study examines the user experience, the remaining challenges, and possible extensions of the proposed system.
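The abstract does not specify how placement retargeting is computed; as a purely illustrative, hedged sketch (not the paper's algorithm), one could express a user's floor position relative to semantically matched furniture anchors in the local room and re-synthesize it from the corresponding anchors in the remote room. The anchor labels, inverse-distance weighting, and room layouts below are assumptions for illustration only.

```python
# Illustrative sketch (assumption, not the paper's method): placement
# retargeting between dissimilar rooms via semantically matched anchors
# such as "sofa" or "tv".
from dataclasses import dataclass

@dataclass
class Anchor:
    name: str   # semantic label shared across rooms, e.g. "sofa"
    x: float    # anchor position on the floor plane (meters)
    y: float

def retarget_position(user_xy, local_anchors, remote_anchors):
    """Map a user's 2D floor position from the local room to the remote room.

    The position is encoded as inverse-distance weights over matched local
    anchors, then re-synthesized from the corresponding remote anchors.
    """
    matched = [(la, next(ra for ra in remote_anchors if ra.name == la.name))
               for la in local_anchors
               if any(ra.name == la.name for ra in remote_anchors)]
    if not matched:
        return user_xy  # no shared anchors: fall back to the raw position

    # Inverse-distance weights with respect to the local anchors.
    eps = 1e-6
    weights = []
    for la, _ in matched:
        d = ((user_xy[0] - la.x) ** 2 + (user_xy[1] - la.y) ** 2) ** 0.5
        weights.append(1.0 / (d + eps))
    total = sum(weights)
    weights = [w / total for w in weights]

    # Re-synthesize the position from the matched remote anchors.
    x = sum(w * ra.x for w, (_, ra) in zip(weights, matched))
    y = sum(w * ra.y for w, (_, ra) in zip(weights, matched))
    return (x, y)

# Example: a user standing near the local sofa ends up near the remote sofa.
local = [Anchor("sofa", 0.0, 0.0), Anchor("tv", 3.0, 0.0)]
remote = [Anchor("sofa", 1.0, 2.0), Anchor("tv", 1.0, 6.0)]
print(retarget_position((0.5, 0.0), local, remote))
```

A weighting of this kind keeps the avatar's position consistent with nearby furniture even when the rooms have different shapes; the full system would additionally retarget arm gestures and head orientation toward the matched targets.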