
Multi-App Spatial Computing

Enhancing Multi-Application Management in Spatial Computing: A UX Design Exploration

 

Project

In our capstone project for the Master's program in Human Centered Design & Engineering, Camila Proffitt, Siddhant Patil, and I embarked on a research and design journey centered on multi-application management for XR and spatial computing. Our goal was to investigate the essential functionalities required for tasks such as finding, hiding, opening, quitting, and switching between applications running or latent on computing devices, akin to the functionality provided by macOS Finder or Windows Explorer. Our approach was platform-agnostic, encompassing various forms of spatial computing, including augmented reality and virtual reality, to provide comprehensive insights into the necessary requirements.

Challenges

  • Efficient management of applications (finding, hiding, opening, quitting, switching) in spatial computing environments.

  • Designing an intuitive interface for users at all experience levels using the diverse interactions available within spatial computing platforms.

  • Conducting user research and testing amidst the global pandemic's restrictions.

  • Ensuring our solution's applicability across various spatial computing platforms.

  • Communicating and sharing our project outcomes with the broader community effectively.

Role

  • Collaborative Design & Research: I was deeply involved in every project phase, from initial research to final documentation, ensuring our approach was comprehensive and integrated.

  • Industry Engagement: Participated in interviews with industry experts and professionals in the spatial computing field, gaining invaluable insights that informed our design decisions and approach.

  • Creative Lead: Spearheaded the design of the VR prototype, including the creation of images, videos, and voiceovers for our public-facing outputs, and handled video editing to ensure polished, engaging content.

  • Knowledge Sharing: Co-created the Medium post and YouTube video to share our learnings and insights with the wider spatial computing community and beyond, aiming to inspire further innovation and discussion in the field of XR design.

Process

  • Assumptions:

    • Integrated AR and VR Applications: We assume that AR applications can be integrated into VR environments, similar to their overlay in the real world. This is based on the potential for users to multitask within a 3D space, as seen in developments like Aardvark and OVRToolkit, which suggest a unified management system for both AR and VR applications could enhance usability.

    • Convergence of VR and AR Hardware: We anticipate that VR and AR technologies will merge, leading to devices that support both experiences seamlessly. This assumption guided our project's direction to ensure its outcomes are applicable to current AR and VR headsets, as well as future XR devices.

    • Single Environment Interaction: Our premise is that users can only engage with one environment, real or virtual, at a time. Despite recognizing potential interest in simultaneous dual-environment interactions, our research did not identify a compelling use case to explore this possibility further at this stage.

  • Initial Research: Following an extensive exploration of the XR (extended reality) domain, analyzing both the capabilities and limitations of the technology, our team focused on envisioning the future of XR and formulated a research question: how might we manage multiple 3D applications in XR? To lay a solid foundation for the project, we established well-informed assumptions about the technological trajectory of XR, which were crucial to outline before delving into the key insights for designing within this space.

    Our initial investigation encompassed a thorough review of the current state of AR (augmented reality) and VR (virtual reality) interfaces and user interactions. Leveraging the Valve Index VR headset and the HTC Vive available in our department's VR lab, alongside various AR and VR devices provided by Pluto VR, we immersed ourselves in a broad array of software applications. This hands-on experience allowed us to compile a comprehensive database of screenshots and detailed analyses of user interactions encountered across these platforms. Additionally, we undertook extensive literature reviews and online research to enrich our understanding and gather pertinent references that informed our study.

  • Client: We initiated our research with onsite interviews at Pluto VR, a Seattle-based AR/VR company. At the time, Pluto VR had developed a product that facilitated communication among users wearing AR and VR headsets from various manufacturers. This platform allowed participants to visually interact with each other's avatars, exchange gestures, and share digital resources such as whiteboards and videos. However, as the platform evolved to include more interactive elements, the user experience (UX) began facing complexities. Introducing additional elements in a session led to issues like occlusion, where objects could block or hide others, complicating user interaction and control. For instance, if a user enlarged a shared chessboard to life-size proportions, it raised questions about how this change would appear to other participants. Would the chessboard also enlarge on their screens? And how would it affect users who preferred to focus on other elements, like a video feed or a whiteboard? These challenges highlighted the intricate UX considerations required as the software's functionality expanded.
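    One way to reason about the chessboard question is to separate an object's shared state from per-user overrides. The sketch below is purely illustrative Python (not Pluto VR's implementation): it shows how a session could let one participant resize a shared object locally without changing what everyone else sees, while still supporting a deliberate session-wide resize.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class SharedObject:
        """A shared session object (e.g. a chessboard) with one canonical
        scale plus optional per-user overrides."""
        name: str
        shared_scale: float = 1.0  # scale every participant sees by default
        overrides: dict[str, float] = field(default_factory=dict)  # user_id -> local scale

        def scale_for(self, user_id: str) -> float:
            # A user's local override wins; otherwise fall back to the shared value.
            return self.overrides.get(user_id, self.shared_scale)

        def resize_locally(self, user_id: str, scale: float) -> None:
            # Only this user's view of the object changes.
            self.overrides[user_id] = scale

        def resize_for_everyone(self, scale: float) -> None:
            # The shared value changes; users with overrides keep their own view.
            self.shared_scale = scale

    board = SharedObject("chessboard")
    board.resize_locally("alice", 10.0)  # Alice enlarges the board to life size
    print(board.scale_for("alice"))      # 10.0 — Alice's view
    print(board.scale_for("bob"))        # 1.0  — Bob's view is unchanged
    ```

    Splitting state this way sidesteps the occlusion problem described above for one user's resize, at the cost of participants no longer seeing an identical scene; which trade-off is right is exactly the kind of UX question the project set out to explore.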

  • Adaptation: We adjusted our approach during the COVID-19 quarantine. We began with the plan of building simple prototypes and running usability tests with a set of users, but everything came to a halt as COVID-19 spread. Classes moved online, and our initial idea of putting random people into VR headsets seemed inappropriate. From our conversations with Pluto VR's subject matter experts, we realized that since our research was aimed at developers and those creating interactions, subject matter experts themselves would be a good new focus. Targeting multiple types of headset within the timeframe would have been difficult, so to reach more people we instead created video prototypes we could share remotely over video calls, explaining the interactions and gathering feedback.

  • Ideation: Independently, we each drafted sketches of potential interactions, targeting the diverse interaction challenges identified during our initial conversations with Pluto VR. Following this individual ideation phase, we reconvened as a team to refine our sketches collaboratively. This review process allowed us to assess the effectiveness of each concept in addressing the identified problems and facilitating specific tasks.

  • Interviews: Throughout our project, we effectively combined feedback gathering with our research initiatives to thoroughly explore the intricacies of spatial computing. Our initial step involved discussions with employees from Pluto VR, who provided valuable insights into the latest trends and areas for growth within the industry. This early dialogue laid the groundwork for our in-depth research, during which we interviewed Subject Matter Experts (SMEs) in AR/VR software development to sharpen our focus and refine our project objectives.

    Following these discussions, we individually drafted sketches aimed at resolving the specific interaction challenges highlighted by Pluto VR. We then selected the most promising of these sketches and presented them to additional SMEs for evaluation, leveraging their feedback to gauge the practicality and impact of our ideas. This cycle of feedback and evaluation allowed us to quickly identify and correct any discrepancies in our initial assumptions.

    After developing our video prototypes, we embarked on another round of research, conducting further interviews with industry experts. This stage was crucial for obtaining detailed feedback, which we used to make iterative improvements to our designs. This holistic strategy of integrating feedback with ongoing research ensured that our project evolved, guided by expert insights.

  • Collaboration: We refined interactions collaboratively. We sketched what the interfaces might look like and often mimed performing the interactions. This helped the others fully understand the interaction under discussion, and physically moving the way an AR or VR device would require gave us extremely quick feedback on how awkward or comfortable each interaction would be. Because of the physicality of some movements required in AR and VR, some interactions should probably not be used at all, some should happen only rarely, and others are comfortable to repeat for a sustained period of time.

  • Design: Designed VR interactions and prototypes using Microsoft Maquette and Unity. Though the prototype was not interactive, these tools made it possible to portray potential interfaces and methods of interaction, and enabled me to record first person video of what I could see within my VR headset, which along with voiceover audio gave a clear image of the interactions and interface elements we were proposing as examples.

  • Documentation: Authored the Medium post and produced the YouTube video, along with additional deliverables for our Capstone course.

Learnings & Future

Insightful Engagement: Engaging with industry experts not only provided invaluable insights into the current state and challenges of spatial computing but also highlighted the importance of adapting research methodologies to accommodate external factors, such as a global pandemic. This experience reinforced the need for flexibility in research approaches and the value of expert feedback in shaping our project's direction.

User-Centered Design: Our project emphasized the critical importance of designing with an acute awareness of the user's physical and cognitive comfort. Through iterative prototyping and testing, we learned that the success of spatial computing interfaces hinges on their ability to provide intuitive, natural interactions that seamlessly integrate into the user's environment. This insight will guide future projects to prioritize user comfort and ease of use in the design process.

Holistic Approach: Adopting a platform-agnostic perspective allowed us to address a broader range of spatial computing environments, from AR to VR and beyond. This approach taught us the importance of designing solutions that are adaptable and inclusive, capable of catering to diverse user needs and technological contexts. It underscores the necessity for future designs to be flexible and versatile, ensuring broad applicability and accessibility.

Continuous Iteration: The iterative nature of our design and feedback process was instrumental in refining our solutions to better meet user needs. This approach highlighted the value of ongoing user engagement and the iterative refinement of designs based on direct feedback. It's a reminder that in the rapidly evolving field of spatial computing, continuous innovation and responsiveness to user feedback are key to developing effective and user-friendly solutions.

Public Outcome

Our project's culmination led to the creation of essential resources aimed at professionals in the spatial computing sphere. Through a Medium post and a YouTube video, we shared our comprehensive findings and proposed solutions, addressing the nuanced challenges of multi-application management in AR, VR, XR, and MR environments. These deliverables are designed to serve as foundational guides, fostering innovation and further exploration in spatial computing UX design. We also have additional in-depth findings ready for discussion, aimed at enriching the spatial computing field further.


YouTube Video: A visual guide through our design process, findings, and proposed solutions, enhancing understanding and engagement with our research outcomes.