We’ve begun to explore potential use cases for virtual reality, and decided to start by creating a virtual office tour – which we believe can be an effective recruitment tool for new employees.
While we’re just beginning to realize the true value of augmented reality/virtual reality technology (AR/VR), we’re putting our brightest minds to work on real-world applications.
We made our initial foray into the world of AR with a simple mobile AR app for Capital Bikeshare in Washington, DC. The app presents users with an overlay showing the direction and distance of the nearest Bikeshare stations based on where they are standing; as the user moves, the overlay updates in real time. For more on that AR effort, read our notes here.
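The core of that overlay is just geometry: given the user’s location and each station’s coordinates, compute the distance and compass bearing to the nearest one. Here is a minimal Python sketch of that calculation; the station names and coordinates used in the example are made up for illustration, not real Bikeshare data, and this is not the app’s actual code.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from point 1 to point 2 (0 degrees = north)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return (math.degrees(math.atan2(y, x)) + 360) % 360

def nearest_station(user, stations):
    """Return (name, distance_m, bearing_deg) of the closest station.

    `user` is a (lat, lon) tuple; `stations` maps names to (lat, lon).
    """
    name, (lat, lon) = min(
        stations.items(),
        key=lambda kv: haversine_m(user[0], user[1], kv[1][0], kv[1][1]),
    )
    return (name,
            haversine_m(user[0], user[1], lat, lon),
            bearing_deg(user[0], user[1], lat, lon))
```

On a device, the bearing would be compared against the phone’s compass heading to decide where on screen to draw the arrow.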
We decided to work on a basic VR use case as a follow-on: a virtual tour of our Louisville office space.
We wanted to keep this project approachable from a cost perspective, so we used Google Cardboard. We began with the Android SDK but quickly moved to the Unity3D framework for a number of technical reasons explained below.
The Android SDK gave us no way to let a user leverage the buttons on the VR viewer. Because an office tour involves moving between rooms, we needed the button on top of the Google Cardboard to let users move between areas with a click. The primary issue was that the Android SDK’s VR view did not allow us to add additional 3D objects to the resulting stereoscopic rendering. We tried using a graphics library to draw the 3D elements, but that only complicated matters, because the shapes and images were placed on top of the scene rather than within it. It was not possible, for example, to place a button near the entrance of the Mission Data office or at the doorway into our fishbowl office. The navigation images simply floated over whatever the viewer was seeing, and there was no way to press the button to indicate a desire to move toward or into the space being viewed.
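The distinction between a screen-space overlay and a world-anchored element can be illustrated with a little trigonometry. In the sketch below (plain Python, not the Android SDK’s actual API), a marker anchored in the world shifts across the screen as the head turns and disappears when it leaves the field of view, while an element drawn by a 2D graphics library stays glued to the same pixels no matter where the user looks; that is why our overlay buttons could not be placed "near a doorway." All numbers here are illustrative assumptions.

```python
import math

def screen_x_world_anchored(marker_yaw_deg, head_yaw_deg, fov_deg=90, screen_w=1080):
    """Horizontal screen position of a marker anchored in the world.

    The position depends on head orientation; returns None when the
    marker falls outside the horizontal field of view.
    """
    # angle of the marker relative to the gaze direction, wrapped to [-180, 180)
    rel = (marker_yaw_deg - head_yaw_deg + 180) % 360 - 180
    half = fov_deg / 2
    if abs(rel) > half:
        return None  # marker is off-screen / behind the viewer
    # simple perspective mapping of the relative angle onto screen pixels
    return screen_w / 2 + (screen_w / 2) * math.tan(math.radians(rel)) / math.tan(math.radians(half))

def screen_x_hud_overlay(head_yaw_deg, screen_w=1080):
    """An element drawn as a 2D overlay: fixed on screen regardless of
    where the user is looking, so it cannot mark a place in the scene."""
    return screen_w / 2  # always dead center
```

A marker at yaw 30° sits right of center when the user faces forward, lands dead center when the user turns to face it, and vanishes when the user turns away; the HUD overlay never moves at all.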
Switching to Unity provided much more flexibility for placing additional 3D objects within the scene. With Unity, we constructed a sphere with the user’s viewpoint at its center, then rendered each room’s scene on the inside of that sphere.
Once the sphere was constructed, we proceeded with building out the rest of the tour. We stitched all of our images together and attached navigation behavior to buttons at various points of interest. Unity enabled us to place navigation elements into the scenes so that they overlaid the resulting stereoscopic projection correctly. We used arrows to show where a user could move, and a floating Mission Data logo to indicate that the user could reveal information about an important object within the space. When the user gazes at a navigation arrow, it turns green and grows larger. A white reticle (think small white dot) is presented as well, so the user clearly knows they are on top of a navigational or informational UI element. Tapping the Cardboard button then either moves the user to another space or reveals more information about an object in the room.
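In Unity, this kind of gaze detection is typically done by raycasting from the camera into the scene; the geometry underneath is simply the angle between the gaze direction and the direction to the target. The Python sketch below shows that logic in isolation; the function names, the 5° hit cone, and the 25% growth factor are illustrative assumptions, not values from our actual project.

```python
import math

def normalize(v):
    """Scale a 3D vector to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def is_gazed_at(gaze_forward, camera_pos, target_pos, threshold_deg=5.0):
    """True when the angle between the gaze direction and the direction
    to the target falls within the reticle's hit cone."""
    to_target = normalize(tuple(t - c for t, c in zip(target_pos, camera_pos)))
    fwd = normalize(gaze_forward)
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(fwd, to_target))))
    return math.degrees(math.acos(dot)) <= threshold_deg

def update_arrow(gazed, base_scale=1.0):
    """Highlight-and-grow feedback for a navigation arrow while gazed at.
    (The green color and 25% growth are illustrative choices.)"""
    return {"color": "green" if gazed else "white",
            "scale": base_scale * 1.25 if gazed else base_scale}
```

Each frame, the app would run this test for every arrow and logo in the current scene, and a button tap while `is_gazed_at` is true triggers the navigation or info action.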
Of course, the best way to experience our VR tour is through Google Cardboard in person; here, we will do our best to demonstrate it with images. Our VR tour starts just outside our Louisville office, near the fountain at the main entrance. You can pan in all directions, including up at the sky or down to the ground.
From the fountain, you enter the building by revealing and engaging with a navigation arrow, which takes you to the center of our office. As you pan about, you’ll see more directional arrows for navigating our space. As you explore, you’ll also see the Mission Data three-dot logo, which, when activated, reveals information about items such as our 3D printer or our coffee maker. There are other interesting tidbits provided throughout the tour.
Building this VR tour has taught us some useful fundamentals of developing for virtual reality. Finding the right mix of development standards, SDKs, user-interface techniques, and imagery wasn’t trivial. This initial experiment in VR has helped us better understand how we might construct more elaborate presentations in a 3D environment.