Research Projects

Redirection of gesture trajectories between local and remote environments in mixed/augmented reality

Fig. 1: The above figure depicts the problem of redirecting positional interactions in avatar-based AR telepresence systems. (a) User2 in their environment holding a cup of tea. (b) User2’s photorealistic avatar as seen by User1 wearing the AR HMD, with a direct one-to-one mapping of User2’s gestures. Note the positional and gaze errors. (c) Correction of positional errors as done by current approaches; the gaze and nonverbal behavior of the original interaction are not preserved. (d) Desired correction of the error (our framework), which preserves the nonverbal aspects of the interaction that are important in establishing social presence.
Fig. 2: The above figure depicts the problem of redirecting pointing interactions in avatar-based AR telepresence systems. (a) User2 pointing in their environment. (b) User2’s photorealistic avatar as seen by User1 wearing the AR HMD, with a direct one-to-one mapping of User2’s gestures. Note the error in the pointing gesture. (c) Correction of pointing errors as done by current approaches; the body-pose nonverbal communication of the original interaction is not preserved. (d) Correction of the pointing error by our framework, which preserves the nonverbal aspects of the interaction that are important in establishing social presence.

The above figures show the redirection of a user's gestures from remote to local environments (and vice versa) for avatar-based telepresence interactions. Gestural interactions can be divided into two types: i) positional interactions (Fig. 1) and ii) pointing interactions (Fig. 2). Positional interactions are those where the user makes physical contact with an object in their environment, for example, picking up a chess piece. Pointing interactions are those where the user points at a particular object in their environment. Both types of interactions need to be mapped onto the user’s avatar, in its own environment, such that the interaction retains its original intended meaning while also preserving the original nonverbal body-pose behavior. During my PhD, I proposed a multi-objective optimization approach using genetic algorithms to solve this problem for both positional and pointing gestures.
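As a rough illustration of the idea, the sketch below shows how a genetic algorithm can trade off interaction correctness against preservation of the original body pose. It is a minimal sketch only: the joint-offset representation, the placeholder objective functions `contact_error` and `pose_deviation`, and the weighted-sum fitness are illustrative assumptions, not the formulation used in the actual framework.

```python
# Sketch of a weighted-sum genetic algorithm for gesture redirection.
# Assumptions (illustrative only): the redirected gesture is a vector of
# joint-angle offsets, and the two objectives are placeholder functions for
# contact/pointing error and deviation from the source pose.
import numpy as np

rng = np.random.default_rng(0)

def contact_error(offsets):          # hypothetical: error at the target object
    return np.linalg.norm(offsets - 0.3)

def pose_deviation(offsets):         # hypothetical: deviation from original pose
    return np.linalg.norm(offsets)

def fitness(offsets, w_contact=0.7, w_pose=0.3):
    # Lower is better: reach (or point at) the correct local object while
    # staying close to the original nonverbal body pose.
    return w_contact * contact_error(offsets) + w_pose * pose_deviation(offsets)

def evolve(n_joints=10, pop_size=60, generations=200, mutation_scale=0.05):
    pop = rng.normal(0.0, 0.2, size=(pop_size, n_joints))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[: pop_size // 2]]      # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            mask = rng.random(n_joints) < 0.5                    # uniform crossover
            child = np.where(mask, a, b) + rng.normal(0, mutation_scale, n_joints)
            children.append(child)
        pop = np.vstack([parents, children])
    best = pop[np.argmin([fitness(ind) for ind in pop])]
    return best

print(evolve())
```

A weighted-sum fitness is only one way to combine the objectives; a Pareto-based selection scheme could be substituted without changing the overall structure of the search.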

Generation of naturalistic locomotion paths between local and remote environments in mixed/augmented reality

Fig. 3: The above figure shows the case of equivalent walking paths that need to be generated due to dissimilarities between the granddaughter's (User1) and the older adult's (User2) environments. (a) One of the likely paths User1 may take along the vertex points ABCDE to sit on the sofa and chat with User2’s avatar. (b) The equivalent path A'B'C'D'E' in User2’s environment, obtained by deforming User1's path ABCDE so that it reaches the sofa. Note: the equivalent path A'B'C'D'E' is longer, so the avatar's speed in (b) needs to be higher than in (a) in order to reach vertex E’ as the user in (a) reaches E. My solution uses the concept of mesh deformation.

Remote mixed reality (RMR) allows users to be present and interact in other users’ environments through their photorealistic avatars. Common interaction objects are placed on surfaces in each user's environment, and interacting with these objects requires users to walk toward them. However, since the spatial configurations of the user's and the avatar's rooms are not identical, for a given walking path of the user, an equivalent path must be found in the avatar's environment according to that environment's spatial configuration (Fig. 3). During my PhD, I proposed using mesh deformation to obtain this path: we deform the mesh associated with the user's environment to fit the spatial configuration of the avatar's environment. This gives us a correspondence for every point between the two environments, from which the equivalent path can be generated. The generated equivalent path serves as a reference for the avatar to follow; it still needs to be modulated to satisfy naturalistic constraints, which is an active area of my ongoing research.
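The sketch below illustrates the point-correspondence idea behind this mapping under simplifying assumptions: both floor plans are triangulated with the same connectivity, a path point is expressed in barycentric coordinates of its containing source triangle, and the same weights are re-evaluated in the corresponding deformed triangle. The actual deformation pipeline is more involved; this is only a minimal illustration, including the speed-ratio adjustment mentioned in Fig. 3.

```python
# Sketch of point correspondence via a deformed mesh (illustrative assumptions:
# identical triangulation connectivity in both environments, 2D floor plans).
import numpy as np

def barycentric(p, tri):
    a, b, c = tri
    T = np.column_stack([b - a, c - a])
    u, v = np.linalg.solve(T, p - a)
    return np.array([1 - u - v, u, v])

def map_point(p, src_tris, dst_tris):
    for src, dst in zip(src_tris, dst_tris):
        w = barycentric(p, src)
        if np.all(w >= -1e-9):                     # point lies inside this triangle
            return w @ dst                         # same weights in the deformed mesh
    raise ValueError("point outside the triangulated walkable area")

def map_path(path, src_tris, dst_tris):
    mapped = np.array([map_point(p, src_tris, dst_tris) for p in path])
    # Scale the avatar's playback speed by the length ratio so it arrives at
    # the final vertex when the user does (cf. E vs. E' in Fig. 3).
    def length(pts):
        return np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))
    speed_ratio = length(mapped) / max(length(path), 1e-9)
    return mapped, speed_ratio

# Toy example: one triangle stretched along x in the avatar's environment.
src = [np.array([[0, 0], [2, 0], [0, 2]], float)]
dst = [np.array([[0, 0], [4, 0], [0, 2]], float)]
path = np.array([[0.2, 0.2], [0.5, 0.5], [1.0, 0.5]])
mapped, ratio = map_path(path, src, dst)
print(mapped, ratio)
```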

Redirection of locomotion paths for obstacle avoidance in remote augmented/mixed reality

Fig. 4: The above figure shows a special case of Fig. 3, i.e., the problem of obstacles in a remote-local AR scenario. (a) One of the paths ABCDE that User1 (the granddaughter) can take in her environment to reach the sofa. (b) The redirected path A'B'C'D'E' (from Fig. 3) that User1’s avatar must follow to reach the sofa. However, the path A'B'C'D'E' needs to be modified to account for the obstacle posed by the dining chairs in User2’s environment and prevent unnatural interaction.

Obstacles in the context of remote-local AR scenarios are non-walkable spaces that contain no interactable objects. Therefore, they serve no function from the perspective of mapping the user’s interaction from their environment to the avatar’s. However, the presence of obstacles can cause problems when mapping locomotion between environments. First, obstacles can have irregular shapes and occur in arbitrary configurations. Second, obstacles can lie in either the avatar’s or the user’s environment and may have no corresponding equivalent objects in the other environment. This non-correspondence between the user’s and the avatar’s environments complicates the computation of equivalent paths. Moreover, we have presented the case for only two environments, i.e., the user’s and the avatar’s; it is likely that multiple users would be present in the interaction, in which case the equivalent locomotion mapping becomes even more complicated. Thus, it is important to automatically redirect walkable paths around obstacles. During my PhD, I proposed an ellipse-based solution that can modify the path smoothly around any irregularly shaped obstacle.
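As a rough sketch of how an ellipse can be used for this, the example below fits an ellipse around an obstacle's footprint (here from the mean and covariance of its boundary points, which is an assumption) and pushes any path sample falling inside out to the ellipse boundary. This simplified radial projection is not necessarily the exact formulation used in my work; it only illustrates how a smooth ellipse can wrap an irregularly shaped obstacle and deform the path around it.

```python
# Sketch of an ellipse-based detour around an irregular obstacle (illustrative).
import numpy as np

def fit_ellipse(obstacle_pts, margin=1.3):
    center = obstacle_pts.mean(axis=0)
    cov = np.cov((obstacle_pts - center).T)
    eigvals, eigvecs = np.linalg.eigh(cov)
    axes = margin * 2.0 * np.sqrt(eigvals)          # semi-axes with a safety margin
    return center, axes, eigvecs

def push_outside(path, center, axes, eigvecs):
    out = []
    for p in path:
        local = eigvecs.T @ (p - center)             # into the ellipse's frame
        r = np.hypot(local[0] / axes[0], local[1] / axes[1])
        if 1e-9 < r < 1.0:                           # inside: project to the boundary
            local = local / r
        out.append(eigvecs @ local + center)
    return np.array(out)

# Toy example: a nearly straight path crossing a roughly rectangular obstacle.
obstacle = np.array([[2, -0.5], [3, -0.5], [3, 0.5], [2, 0.5]], float)
path = np.column_stack([np.linspace(0, 5, 50), np.full(50, 0.1)])
center, axes, eigvecs = fit_ellipse(obstacle)
detour = push_outside(path, center, axes, eigvecs)
print(detour[20:30])
```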

Older and younger adults' perceptions of HMD-based telepresence interaction in collaborative AR

Fig. 5: An example setup of a sample activity. An older adult, User1, is able to experience her granddaughter, User2, through her photorealistic avatar in her environment (left). Similarly, User1 is represented in User2’s environment through her photorealistic avatar (right). The users’ interactions have been mapped to their respective avatars.

The aim of this project is to obtain the perceptions of older adults, with younger adults as the control group, when interacting with a photorealistic avatar of a remote user. In addition to their perceptions, we are also interested in whether full-body emotions and gestures expressed through the avatar are discernible to the older adults. The system provides real-time bidirectional communication between the participants and a remote user and was developed using the Unity game engine. The facial and body expressions are captured and transferred across the network using a ZeroMQ distributed computing architecture.
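A minimal sketch of the kind of ZeroMQ publish/subscribe link that could stream captured face and body parameters between the two clients is shown below. The topic name, message schema, and port are illustrative assumptions, not the project's actual protocol, and the Unity clients themselves use a C# binding rather than Python.

```python
# Sketch of a ZeroMQ PUB/SUB link for streaming avatar pose and expression data.
import json
import zmq

def run_sender(port=5556):
    ctx = zmq.Context.instance()
    pub = ctx.socket(zmq.PUB)
    pub.bind(f"tcp://*:{port}")
    frame = {"joints": [[0.0, 1.2, 0.3]] * 25,       # placeholder body-pose data
             "blendshapes": {"smile": 0.8}}           # placeholder facial expression
    pub.send_multipart([b"avatar", json.dumps(frame).encode()])

def run_receiver(host="127.0.0.1", port=5556):
    ctx = zmq.Context.instance()
    sub = ctx.socket(zmq.SUB)
    sub.connect(f"tcp://{host}:{port}")
    sub.setsockopt(zmq.SUBSCRIBE, b"avatar")
    topic, payload = sub.recv_multipart()             # blocks until a frame arrives
    frame = json.loads(payload)
    return frame["joints"], frame["blendshapes"]
```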

Non-invasive sensing of vital signs from recliner chairs

Fig. 6: A sample of the signals obtained from the chair accelerometers and the detection of those signals. (a) Ballistocardiogram (BCG) signals obtained from the chair accelerometers, with the output of the peak detection algorithm overlaid; the ground-truth signal from the piezoelectric sensor is shown at the top. (b) Ground-truth respiration signal obtained from the chest band (top) and signals obtained from the chair accelerometers (middle, bottom). The accelerometer placed under the recliner captures heart signals more accurately, whereas the accelerometer placed on the side cushion captures respiration signals more accurately.

In this project, I developed a sensor system for recliner chairs that extracts the occupant's heart and respiration rates in real time. The system uses two strategically placed accelerometers to capture these vital signs noninvasively and without direct contact with the body, while remaining hidden from view. The system was tested with 45 subjects, with an average age of 78.8 (S.D. = 12.5) years, in both the upright and reclined configurations of the chair. We also tested the system on 6 different recliner models. The ground-truth signals for the heart and respiratory rates were obtained using a piezoelectric finger transducer and a thorax chest belt, respectively. The mean heartbeat error across the 45 subjects was 0.6 ms, with an average error rate of 3.6% (p-value = 0.00081, significance level = 0.05). Similarly, the mean respiratory breath error was 4.2 ms, with an average detection error rate of 6.25% (p-value = 0.032, significance level = 0.05).
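The sketch below outlines the kind of processing pipeline implied above: band-pass the raw accelerometer signal into cardiac and respiratory bands, detect peaks, and convert inter-peak intervals into rates. The cut-off frequencies, sampling rate, and peak-detection parameters are illustrative assumptions, not the values deployed in the system.

```python
# Sketch of rate estimation from a chair accelerometer signal (illustrative).
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def bandpass(signal, low_hz, high_hz, fs, order=3):
    b, a = butter(order, [low_hz, high_hz], btype="bandpass", fs=fs)
    return filtfilt(b, a, signal)

def rate_from_peaks(signal, fs, min_separation_s):
    peaks, _ = find_peaks(signal, distance=int(min_separation_s * fs))
    if len(peaks) < 2:
        return float("nan")
    mean_interval_s = np.mean(np.diff(peaks)) / fs
    return 60.0 / mean_interval_s                     # events per minute

def vitals_from_accelerometer(raw, fs=100.0):
    bcg = bandpass(raw, 4.0, 11.0, fs)                # assumed BCG band
    resp = bandpass(raw, 0.1, 0.5, fs)                # assumed respiration band
    heart_rate_bpm = rate_from_peaks(bcg, fs, min_separation_s=0.4)
    respiration_rate_bpm = rate_from_peaks(resp, fs, min_separation_s=2.0)
    return heart_rate_bpm, respiration_rate_bpm
```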

Iterative participatory design of collaborative AR activities for older adults in long-term care (LTC) settings

Fig. 7: An older adult picking up a Checkers piece in AR. An iterative participatory design process was used to understand the needs of older adults when developing collaborative AR activities for them.

This project involves developing collaborative activities in remote augmented reality that are adapted to the needs of older adults in long-term care (LTC) settings. The long-term goal is for their friends and family to remotely interact with them via 3D virtual visits in AR through their photorealistic avatars. The hypothesis is that the immersive and collaborative nature of the interaction with loved ones will promote social connectedness and mitigate loneliness. The project follows an iterative approach of prototyping and refining activities in AR, with input from all involved stakeholders, including staff and family members of the older adults in LTC settings.