Conversation Balance: A Social VR Project

Date:

August 2021 - May 2022

Role:

Lead Researcher, Data Collection: Survey Design, User Interviews, Qualitative & Quantitative Data Analysis, Publication Writing

Teams:

Collaborated with the Engineering Team and Project Management. Presented the project to the Executive Director and to user and industry stakeholders.

This prototype is part of a larger NSF-funded research project focused on developing VR tools to improve online meetings. I designed and researched affordances that use VR’s unique properties to help people collaborate, balance participation, manage time, agree on decisions, follow an agenda, achieve social connection, and support ideas.

Problem

In the post-pandemic world, Zoom fatigue and remote-working burnout are widespread across professional workplaces. To overcome Zoom fatigue and mitigate related challenges, researchers have suggested VR meetings as an alternative platform. VR offers rich, embodied ways of connecting people while providing a sense of shared presence that is not possible in Zoom or face-to-face meetings.

Hypothesis

We aimed to understand whether participants appreciated the presence of the conversation visualization, and whether it had an impact on their performance in problem solving. We were also interested in measuring other aspects of social interaction that have been shown to affect group performance on a task, such as social presence.

Solution

We designed a VR tool that visualizes conversation during VR meetings. It helps people facilitate participation parity, improve group performance, and build a sense of virtual co-presence.

Outcome

Our research was published as a demo at the 2022 CHI Conference Interactivity track and accepted by the 2022 SIOP Conference. The research showed that our visualization could improve people's experience hosting conversations in VR systems. Users felt the design made their conversations in VR more inclusive and balanced because the visualization made them more aware of participation.

Process

01. Research Problem

Background & Inspirations

Since the spread of COVID-19, the world has pivoted to a remote-working lifestyle, where people’s work lives are connected every day through distributed technologies such as Slack, Google Meet, or Zoom.

However, having access to distributed technologies does not necessarily help people develop more successful collaborative professional experiences. In particular, research has shown that it can be difficult to host effective virtual workplace meetings, and that a team’s collaborative success depends more on the group’s ability to work together than on individual skill. How can we design a more supportive, effective virtual meeting workplace?

Literature Review

Conversation Clock Design

Since collaboration and social interaction are not new research topics, another researcher and I conducted a literature review of prior social-augmentation and VR research before exploring how to solve this problem. Our design choices for the visualization were informed by prior work, in particular The Conversation Clock by Bergstrom et al.

Findings

  • As time passes, each speaker is represented by colored time blocks on the clock, visualizing how long they have been speaking.
  • The researchers noted during the study that the clock was primarily observed by listeners rather than by the speaker.
  • This visualization can provide a sense of “encouragement” to engage in the conversation for participants who tend to speak less in meetings.

Research has shown that greater parity in conversational turn-taking is predictive of group performance.

For example, collaboration relies on turn-taking behavior and an overall sense of social presence. While social presence is difficult to quantify, we can get a sense of turn-taking behavior by keeping track of the speech length and the number of turns that people in a group take during a meeting.

We focused on using visualization to quantify speech length and turn-taking during a conversation to enhance our Conversation Balance VR Prototype. 
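As an illustrative sketch only (not the project’s actual implementation), the snippet below shows one way speech length and turn counts could be accumulated from per-speaker voice-activity events; the event shape, speaker IDs, and parity score are hypothetical placeholders.

```typescript
// Illustrative sketch: accumulating speech length and turn counts from
// per-speaker voice-activity events. Event shape and field names are
// hypothetical, not taken from the prototype's code.

interface SpeechEvent {
  speakerId: string; // e.g., a participant or avatar identifier
  startMs: number;   // when the speaker started talking
  endMs: number;     // when the speaker stopped talking
}

interface SpeakerStats {
  totalSpeechMs: number; // cumulative speech length
  turns: number;         // number of speaking turns taken
}

function computeTurnTaking(events: SpeechEvent[]): Map<string, SpeakerStats> {
  const stats = new Map<string, SpeakerStats>();
  // Sort chronologically so consecutive segments by the same speaker
  // count as a single turn.
  const ordered = [...events].sort((a, b) => a.startMs - b.startMs);
  let lastSpeaker: string | null = null;

  for (const e of ordered) {
    const s = stats.get(e.speakerId) ?? { totalSpeechMs: 0, turns: 0 };
    s.totalSpeechMs += e.endMs - e.startMs;
    if (e.speakerId !== lastSpeaker) {
      s.turns += 1; // a new turn begins whenever the speaker changes
      lastSpeaker = e.speakerId;
    }
    stats.set(e.speakerId, s);
  }
  return stats;
}

// A simple parity score: the quietest speaker's talk time divided by the
// loudest speaker's talk time (1.0 = perfectly balanced participation).
function parityScore(stats: Map<string, SpeakerStats>): number {
  const times = [...stats.values()].map((s) => s.totalSpeechMs);
  return times.length === 0 ? 1 : Math.min(...times) / Math.max(...times);
}
```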

Micbot

Our design was then situated in an environment that supports conversation about a particular task. We created a VR version of the Desert Survival Task (DST). This task is commonly used to evaluate group problem solving in mediated-communication studies, including in Tennent et al.’s work on Micbot.

02. Ideation & Design

Initial Prototype: Professional Meeting Room in VR (Unity)

Facebook Spaces

Our early prototype design in Unity was inspired by Facebook Spaces' embodied avatar environment.

This is the body-storming room, with various visible props (a team contribution). We developed a series of prototypes in Unity to visualize the conversation.

These prototypes were created by the Engineering Team based on what we learned from usability and user testing, and they inspired the current design. I took on a quality-assurance role to ensure the software features ran reliably.

Early design of the avatars in VR: each has a cube head, rectangular hands, and a yellow face indicating viewing direction.

How people interacted with one another inside the VR Meeting room

  • This prototype was inspired by the social VR application Facebook Spaces.
  • We created a shared tray table in the middle to direct users' attention toward the task.

Visualization of People's Conversation

  • This design mimicked the Conversation Clock in a form that is more visible to players inside VR.
  • We designed our conversation visualization as a column of colored balls that grows into a cylinder shape as participants speak. Each ball was color-matched to a participant’s avatar color.

Final Design

After learning from our iterative research process, we switched to Mozilla Hubs for better connectivity stability and a simpler UI design. The Engineering Team further developed this prototype, and I assisted as a UX designer on the design team. We developed the final prototype design in Mozilla Hubs to visualize the conversation.

The final VR prototype design after rounds of user testing: a shared tray table with numbered items for the "Desert Survival Task".

The tube visualizations match each speaker’s avatar color.
The bars increase and float upward as the users talk more.
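To make the mapping concrete, here is a minimal sketch, assuming the computeTurnTaking() helper above and a hypothetical rendering hook (this is not the prototype’s actual Mozilla Hubs code), of how accumulated speech time could drive each speaker’s bar height and color.

```typescript
// Illustrative sketch: mapping accumulated speech time to bar height.
// The scale factor, height cap, and rendering hook are hypothetical
// placeholders, not the prototype's actual implementation.

interface BarUpdate {
  speakerId: string;
  color: string;  // matched to the speaker's avatar color
  height: number; // in scene units
}

const MS_PER_UNIT = 10_000; // hypothetical: 10 s of speech = 1 unit of height
const MAX_HEIGHT = 3;       // hypothetical cap so bars stay readable

function barUpdates(
  stats: Map<string, { totalSpeechMs: number }>,
  avatarColors: Map<string, string>
): BarUpdate[] {
  return [...stats.entries()].map(([speakerId, s]) => ({
    speakerId,
    color: avatarColors.get(speakerId) ?? "#ffffff",
    height: Math.min(s.totalSpeechMs / MS_PER_UNIT, MAX_HEIGHT),
  }));
}

// Example usage, where scene.setBarHeight is a hypothetical hook into the
// rendering layer:
// const updates = barUpdates(computeTurnTaking(events), avatarColors);
// updates.forEach((u) => scene.setBarHeight(u.speakerId, u.height, u.color));
```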

03. Project Publication Outcomes

We were accepted to the 2022 CHI Conference, and I presented our demo alongside my team at the CHI Interactivity exhibition in New Orleans, LA.

2022 CHI Interactivity Demo

In 2021, we were also accepted to the CHI Conference's First XR Remote Research Workshop, where we discussed the difficulties we encountered researching remotely, and I presented our project to industry UX researchers and academics.


04. Conclusion & Limitations

Research Limitations

Technical issues (Hardware & Software)

  • The Oculus headsets would lose connection or reset the play area, making experiments take longer.
  • The custom-coded Mozilla Hubs environment needed many rounds of iteration and testing to ensure the stability of its software features.

Participant Background & Remote Limitation

  • Some participants recruited through the university had prior relationships (friends, colleagues, classmates, etc.) with one another.
  • Because most participants were recruited online, attendance was unreliable; for example, only 2 of the 3 participants who signed up might show up on the day of the experiment. To combat this, we always recruited more than 3 people to ensure each group had the same number of participants.

Conclusion and Outcome

This was my first time working on an NSF-funded, full-scale design project, and I found it to be a valuable learning experience. The impact of our project is to increase conversation inclusivity and awareness to promote a healthier work environment. We hope that prototypes like this one can inspire others working to develop future collaborative work environments in VR as well as in other XR contexts. My favorite moments of the project were learning how to use Unity, Figma, Miro, and Oculus Quest hardware to communicate our findings, construct our designs, and conduct user testing. Virtual reality was an unfamiliar subject at first, but after working on this project since 2020, I am more confident in my ability to use these tools in future projects.

We are currently working with Mozilla to field test our design in their daily corporate meetings!

This project is in collaboration with Dr. Katherine Isbister, Dr. Josh McVeigh-Schultz, Sean Fernandes, Max Kreminski and Anya Osborne.

Research References:
- Kocsis, David J., et al. “Designing and Executing Effective Meetings with Codified Best Facilitation Practices.” The Cambridge Handbook of Meeting Science, 2015, pp. 483–503. https://doi.org/10.1017/cbo9781107589735.021
- Geimer, Jennifer L., et al. “Meetings at Work: Perceived Effectiveness and Recommended Improvements.” Journal of Business Research, vol. 68, no. 9, Sept. 2015, pp. 2015–2026. https://doi.org/10.1016/j.jbusres.2015.02.015