
VR Architecture Collaboration Tool

 

6/7 of the BIMVAR Team

Project

During the two-day VR hackathon, our team set out to create a powerful tool for detecting clashes in commercial architectural design that enhances the process and accelerates decision-making. Our goal was to create a tool that would enable users to view their designs in a more immersive and interactive way, allowing them to identify potential clashes more easily and efficiently.

Challenges

BIMVAR: Building Information Modeling for VR and AR

Ian Working in Unity

The HTC Vive Virtual Reality Headset and Controllers

Task Divisions

  • Determine how to assess the efficacy of a new architectural modeling tool for a new medium

  • As a team of seven without prior experience in Unity or virtual reality development, we had to work hard to deliver meaningful output and results

  • Create something compelling for end-users and attendees of the event

  • Create an app that enables users to communicate and collaborate over the network using multiple VR devices, as well as with AR devices like the Microsoft HoloLens

Role

As a UX designer, I played a crucial role in this project. As the de facto Unity expert, I was also responsible for the Unity master scene setup, 3D modeling, integration, and technical wrangling: troubleshooting issues with Windows, Unity, plugins, and the VR hardware and its related firmware and software.

Process

Teammates

Background: Designing and engineering large architectural structures with multiple floors involves various vendors supplying 3D models that can overlap the intended physical volume of the building. For instance, one engineer could be working on plumbing systems and designing pipe routing horizontally across floors and vertically throughout the building. At the same time, a heating and air conditioning engineer could be working on corresponding vent and duct-work. These scenarios present numerous opportunities for unintentional design overlaps that can hinder installation processes.

Physical clashes can also occur when building something in a different location than originally planned. For example, installing an electrical conduit one foot to the left of its intended location can prevent the installation of water pipes where they are supposed to go.

Resolving these conflicts or clashes before construction starts is essential. Failure to do so could lead to budget overruns and delays. In a large building, designers may have to handle thousands of clashes during a project. A collaborative tool that enables designers and contractors to resolve these points while viewing the space remotely could save significant amounts of money. The design, engineering, and management time for each clash can quickly run into thousands of dollars.
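The core geometric check behind clash detection can be illustrated with axis-aligned bounding boxes. Real BIM tools use exact geometry and spatial indexing rather than this naive pairwise pass, and our project itself was built in Unity/C#, but a minimal Python sketch (all names hypothetical) conveys the idea:

```python
from dataclasses import dataclass

@dataclass
class AABB:
    """Axis-aligned bounding box: min/max corners, in meters."""
    min_pt: tuple  # (x, y, z)
    max_pt: tuple  # (x, y, z)

def overlaps(a: AABB, b: AABB) -> bool:
    """Two boxes clash only if their extents overlap on every axis."""
    return all(a.min_pt[i] < b.max_pt[i] and b.min_pt[i] < a.max_pt[i]
               for i in range(3))

def find_clashes(elements):
    """Naive O(n^2) pairwise check over (name, AABB) pairs."""
    clashes = []
    for i, (name_a, box_a) in enumerate(elements):
        for name_b, box_b in elements[i + 1:]:
            if overlaps(box_a, box_b):
                clashes.append((name_a, name_b))
    return clashes

# A pipe run passing through the volume reserved for an HVAC duct,
# plus an electrical conduit elsewhere that clashes with nothing:
pipe = AABB((0.0, 2.8, 0.0), (10.0, 2.9, 0.1))
duct = AABB((4.0, 2.5, -0.2), (4.5, 3.0, 0.3))
conduit = AABB((20.0, 0.0, 0.0), (20.1, 3.0, 0.1))
print(find_clashes([("pipe", pipe), ("duct", duct), ("conduit", conduit)]))
# → [('pipe', 'duct')]
```

Each reported pair then becomes a clash point for designers to review, which is exactly the list a collaborative tool needs to surface.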

Process Start: We began by searching for existing architectural content to use for the project. Two members of our team were already working on architecture with BIM systems, but their data was proprietary. After finding suitable content, I focused on importing it and matching its scale against other objects, some created in 3ds Max and some downloaded from the internet. Since Unity's units and measurement conventions differ from those of architecture software, we had to develop a process for normalizing the scale of imported elements, for consistency (and sanity). Once we had content to view, we could determine what features to add, starting with the ability to move around within the model.
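Unity treats one unit as one meter, while architectural content is often authored in feet, inches, or millimeters, so matching scale boils down to applying a uniform conversion factor on import. The actual work happens in Unity's import settings; a small Python sketch of the underlying arithmetic (the table values are standard unit conversions):

```python
# Unity convention: 1 unit = 1 meter. Imported models authored in other
# units need a uniform scale factor to land at real-world size.
UNITS_TO_METERS = {
    "meters": 1.0,
    "centimeters": 0.01,
    "millimeters": 0.001,
    "inches": 0.0254,
    "feet": 0.3048,
}

def import_scale(source_unit: str) -> float:
    """Uniform scale factor for a model authored in source_unit."""
    try:
        return UNITS_TO_METERS[source_unit]
    except KeyError:
        raise ValueError(f"unknown unit: {source_unit}")

# A 10-foot duct segment from 3ds Max becomes 3.048 Unity units (meters):
print(10 * import_scale("feet"))  # → 3.048
```

Writing the factors down once, instead of eyeballing each import, is what kept differently sourced models consistent with each other.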

Throughout the project, we divided into two to four sub-teams within our eight-person group. One person worked on the logo, while another focused on project management and sought out information or equipment for us. One team worked on getting the main project running on the Microsoft HoloLens, but they were unable to get a functional app working by demo time.

Interface Ideas

For the interface, I mostly worked on 3D modeling potential interface elements to float in space near clash points, as well as menus to bring up at the click of a button. Since these were the early days of VR, I had to try many variations: make a change, put on the headset, and evaluate. Menus in VR typically float in place, but for our purposes it made more sense to attach part of the menu system to the controller so it could act as a virtual tablet. The tablet held our lists of clash points; tapping an item took the user to that hot spot or played back the audio recording attached to it.

At a certain point, someone from Amazon showed up with a stack of Amazon Alexa hardware units, and another team member and I figured out how to set up voice commands to hide and show different layers of the 3D model. We wanted duplicate input paths for some operations, and model manipulation was a natural fit. Saying "Alexa, hide layer mechanical" was easier than opening a menu, selecting an option, and then closing the menu again. The premise behind this capability is that a large architectural model is extremely complicated, making specific elements difficult to find and see; being able to simplify the model on the fly to inspect the part you are interested in is very helpful. It was impressive that this worked as well as it did in the hackathon space, with all of the noise and echoes. Handily, because of the way Unity works, the voice commands could drive the same show/hide logic as the menu system, so we didn't have to build two separate systems for the two input methods.
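The key design choice was a single source of truth for layer visibility that both input paths call into. A Python stand-in for our Unity/C# setup (class and command grammar are hypothetical, modeled on the "Alexa, hide layer mechanical" phrasing):

```python
class LayerRegistry:
    """One source of truth for model-layer visibility. The menu UI and
    the voice interface both call set_visible, so there is only one
    show/hide system to build and test."""

    def __init__(self, layers):
        self.visible = {name: True for name in layers}

    def set_visible(self, layer: str, visible: bool):
        if layer not in self.visible:
            raise KeyError(layer)
        self.visible[layer] = visible

    def handle_voice(self, utterance: str):
        """Parse commands of the form 'hide layer <name>' / 'show layer <name>'."""
        words = utterance.lower().split()
        if len(words) == 3 and words[1] == "layer" and words[0] in ("hide", "show"):
            self.set_visible(words[2], words[0] == "show")

model = LayerRegistry(["structural", "mechanical", "plumbing"])
model.handle_voice("hide layer mechanical")   # voice path
model.set_visible("plumbing", False)          # menu path, same underlying call
print(model.visible)
```

Because both paths converge on `set_visible`, adding the Alexa front end cost only the parsing step, not a second visibility system.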

3D-Located Audio & Text Notes

Project Overview Navigation

During the development of the project, I worked on several features to enhance the collaboration experience. One of these was a 3D pointer that could be placed in space to indicate what collaborators were talking about in the app. Additionally, we implemented sticky notes that could hold audio or text information based on a specific point in the model. For this, we designed a note component to hold the information and a pointer component to remain locked to the item being pointed at.
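A rough sketch of that data model, as a Python stand-in for the Unity components (all field names are hypothetical):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Note:
    """A sticky note anchored to a specific point in the model."""
    position: tuple                   # (x, y, z) anchor in model space
    text: str = ""
    audio_clip: Optional[str] = None  # id of an attached audio recording

@dataclass
class Pointer:
    """A pointer that stays locked to the item being discussed."""
    target_id: str                    # id of the model element pointed at
    note: Optional[Note] = None       # the note this pointer belongs to

clash_note = Note(position=(4.2, 2.85, 0.05),
                  text="Pipe run collides with supply duct",
                  audio_clip="note_017.wav")
marker = Pointer(target_id="duct_04", note=clash_note)
print(marker.note.text)
```

Splitting the note (the information) from the pointer (the anchor to geometry) let either piece be updated or re-targeted without touching the other.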

There were multiple navigation interfaces that we designed. The first was a laser attached to the controller that could be aimed at a node; the user would then be transitioned to that location with a short fade-out/fade-in effect. On arrival, they would be facing the same direction as before, giving them a sense of continuity.
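The sequence is: fade out, move the user while the screen is dark, fade back in, leaving the heading untouched. The real version was a Unity coroutine in C#; a minimal Python illustration of the same logic (names hypothetical):

```python
def fade_teleport(user: dict, target: tuple, fade_time: float = 0.2):
    """Teleport sketch: fade to black, jump while the screen is dark,
    fade back in. 'user' holds 'position' (x, y, z) and 'yaw' (degrees)."""
    steps = [("fade_out", fade_time)]  # screen goes to black over fade_time
    user["position"] = target          # jump happens while screen is black
    # yaw is deliberately left unchanged: the user arrives facing the same
    # direction they were facing before, which preserves continuity
    steps.append(("fade_in", fade_time))
    return steps

user = {"position": (0.0, 0.0, 0.0), "yaw": 90.0}
fade_teleport(user, (12.0, 0.0, 4.0))
print(user)  # position updated to the target node, yaw preserved
```

Hiding the jump behind a brief fade, rather than sliding the camera, was also the comfort-friendly choice: smooth artificial motion is a common trigger for VR sickness.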

Using the headset.jpg

The second navigation interface was a list that could be brought up using the controller. Users could scroll through a list of nodes and use them as navigation links. This method was less jarring than the first one, as the user remained looking at the menu after reaching the destination.

We had planned to implement a third navigation method, which involved separating the 3D model into different floors. This would allow users to zoom in on a particular floor and click on a specific point to move to that location. However, due to time constraints, we were unable to develop this feature. It would have required significant development work to automatically split the floors apart, and this was not feasible within the hackathon's timeline.

Throughout the development process, there were instances where sub-teams finished their work and wanted to see it incorporated into the main project. In these cases, I would find the relevant code and copy it over to the main file. However, sometimes the new feature would break something else, or additional work was required to integrate it into the main file.

As I am not a developer, I would sometimes encounter a point where I knew exactly what I needed to happen with an interaction or element, and would describe it to one of the team members who had coding experience. They would then work with me to code up that piece in C#, with them driving for that portion. Over time, I became more proficient in Unity features and was able to find code that was close to what I needed without requiring much assistance.

Finally, having a project manager on the team proved beneficial in many cases. He helped us structure our time and effort, and tracked down information, hardware, and people to assist us with current tasks.

Learnings & Future

  • HTC Vive System Only Released Weeks Prior: This virtual reality hackathon took place just a few weeks after the release of HTC’s Vive virtual reality headset and system. Despite past experiences with virtual reality hardware that had made me ill, I had decided to build a PC specifically for working with this new system, based solely on my impression of an early demo. As a result, my familiarity with the system and ability to learn quickly made me the de facto lead developer on our project, even though I lacked formal development skills. This experience taught me that understanding how a system works and how to achieve results is often more important than having specific skills. The ability to learn and understand new technology is a valuable skill that has proven useful on many projects.

  • Know Your Resources: Since everyone on our team was new to Unity and virtual reality, we didn't know where to look for resources. After the event, I found a couple of libraries that could have made it much faster to develop our project. It would have been helpful to have a list of resources available at the event, though it's possible that such lists existed but we didn't know where to find them. This experience highlighted the importance of ensuring that everyone on a team is familiar with the project they're working on, as well as the internal and external resources available to them.

BIMVAR Screen

Outcome

In the Fall of 2016, I collaborated with a team of 7 to create an architectural collaboration tool for the HTC Vive at the Seattle VR Hackathon. Our focus was on Building Information Modeling (BIM) and ways to collaborate on clash detection and resolution for large-scale construction projects. We developed a hand-controlled system for navigation using HTC's Vive controllers and a vocal command system using Amazon's Alexa technology.

Our demo was very successful, and it indicated to many people that high-end VR could be used for making tools, not just for gaming. Those with domain knowledge in architecture, construction, or engineering found our project eye-opening and an excellent conversation starter. We even had potential customers interested in the end product at the end of the hackathon, with people asking if we had software available for purchase.

Inspired by this interest, a few members of the team, including myself, continued working on the project to make it a reality. I presented our project at the 2016 Immerse conference, where we showed our demo in the demonstration hall next to companies valued in the millions.

We also demonstrated our project at the following Seattle VR Meetup. One of the most fun and rewarding parts of this project has been seeing people visibly come to the realization that virtual reality is not just for gaming but also has practical uses. Many people who viewed our demo at the hackathon or later at the meetup were shocked that they had never thought of VR as something that would be useful, not just entertaining.

As a designer, I have many thoughts on possible uses for VR, particularly for designing and prototyping objects and spaces. At the Immerse conference, I shared my favorite line: everyone is looking for the “killer app” for VR, but I find VR to be the “killer tool” for designing and prototyping things that do not yet exist. Therefore, I believe that design-related apps in VR have the potential to be the “killer apps” for the medium.

BIMVAR Sightings

BIMVAR Hackathon and Product Footage

BIMVAR - Building Information Modeling for VR and AR. An architecture collaboration tool built during the 2016 Seattle VR Hackathon IV. Collaborate with stakeholders in multiple locations on clash resolution and other issues before or during construction.

BIMVAR Information Screens

BIMVAR Features

BIMVAR Specs