So you want to build a mixed-reality experience? You’ve seen the amazing demos from Microsoft, Google, Apple, and the many other players racing to join the mixed-reality club, but you don’t know where to start? Well, this guide is for you!
In this article we will work primarily with Unity, as its built-in hooks for the HoloLens tremendously simplify the development process. Because early-stage APIs like these are still in flux, this guide focuses more on patterns than on explicit code samples.
Setting Up a Project
For development you will need a Windows machine running the latest version of Windows 10, with both Visual Studio and Unity installed and configured. We have been using the Microsoft Surface line of machines and they have done a great job. You can try Boot Camp or VirtualBox, though we have not tested those environments. For HoloLens development you will also need to install the relevant SDKs; for a full walkthrough, follow this guide. Finally, you should make use of the Microsoft HoloLens Toolkit (HoloToolkit) library, which provides a set of APIs and reusable components. The latest release of the HoloToolkit can be found here (make sure you download the *.unitypackage file).
Now that you have downloaded everything, start a new Unity project. Give the project a name and leave all the settings at their defaults.
Once you have a new project, import the HoloToolkit assets by going to Assets -> Import Package -> Custom Package and navigating to the *.unitypackage file you downloaded. Make sure you import the library package and not the tests package.
Upon successful import (your instance of Unity may restart) you will see a new menu option become available in Unity: HoloToolkit.
Ensure you have the Windows Store SDK installed by going to File -> Build Settings and checking whether the Windows Store option is available. If not, click Install SDK and follow the installation steps.
Back at the main Unity screen, we can now create our Scene. A Scene is simply a 3D space, much as a page is to a website, where we add our components and objects.
To save the scene, click File -> Save Scene As and give it a name. I tend to save scenes under a new folder called “Scenes” and name the main scene “Main”.
Next, let’s apply the Unity settings a HoloLens project needs. In the top menu bar, find HoloToolkit and click Configure -> Apply HoloLens Project Settings. Unity will restart; once it does, return to the same menu and select Apply HoloLens Scene Settings and then Apply HoloLens Capability Settings. This sets up your project to use the HoloLens Camera, Cursor, and Input Manager prefabs (more on those later), and adjusts the project build settings to those recommended for the HoloLens.
Verify settings were applied
You can confirm the setting changes were applied by going to Edit -> Project Settings -> Player.
Building and Deploying to the Device / Emulator
The easiest and fastest way to get something running on your device is the Holographic tab, which is added to Unity once you apply the HoloToolkit project settings. It lets you stream the unoptimized application from Unity to your headset, watch your program run in real time, and apply changes live. You can also build a release Visual Studio solution from Unity to deploy to an emulator or headset.
To use the Holographic tab, first install the necessary companion app on the HoloLens: the Holographic Remoting Player. Download this app directly onto your HoloLens and open it.
When the app opens, it displays an IP address. Type that IP address into the Remote Machine field in Unity’s Holographic tab and press Connect.
Once connected, you will see your scene mirrored from Unity onto the HoloLens. You can now add and manipulate any object in the Scene and watch those changes occur in real time. Note, though, that changes are reverted to the state of the application prior to starting the preview. This is a great way to test and debug your application, but be aware it will not necessarily reflect the performance or functionality of a release build.
The setup above is great for debugging and scene authoring, but once you are ready to build a release version to deploy on a headset you will need to generate a Visual Studio solution. To do so, click HoloToolkit -> Build Windows, then select Build Visual Studio SLN.
This creates a folder in your Unity project directory (in our case called WindowsStoreApp) which you can open in Visual Studio. From there, go to the project settings by clicking Project -> <Project Name> Properties. Ensure that you are targeting the x86 architecture and deploying to Remote Machine (or the emulator). In the Debug section of the project settings you will see a field for the remote machine’s IP address; enter the same address you used in Unity and save.
Once ready, click Debug -> Start Without Debugging, which compiles a release build and loads it onto your HoloLens. This can take a while (on the order of a few minutes), but it will build and deploy the application directly to the device; just make sure your HoloLens does not fall asleep :)
The Basic Building Blocks
So what are the primitives of a Unity-based HoloLens project? At a bare minimum, there are four primitives everyone should be aware of.
- GameObjects represent an entity in the scene. These can be basic shapes such as squares, spheres, and pyramids, or they can be empty dimensionless items which exist purely as containers / wrappers for other objects.
- Manager Scripts control the behavioral logic of your GameObjects.
- Textures give color and detail to GameObjects; they can be images, videos, or simple colors.
- Prefabs are reusable components made up of GameObjects, scripts, textures, and more.
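To make these pieces concrete, here is a minimal sketch in Unity C# (the class name and values are illustrative, not taken from our project) of a manager-style script that spawns a primitive GameObject and tints its material at runtime:

```csharp
using UnityEngine;

// Illustrative sketch: a script that creates a sphere GameObject
// and colors it by assigning a material tint.
public class SphereSpawner : MonoBehaviour
{
    void Start()
    {
        // Create a built-in primitive (GameObject + mesh + collider).
        GameObject sphere = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        sphere.name = "DemoSphere";

        // Place it one meter in front of the scene origin and scale it down.
        sphere.transform.position = new Vector3(0f, 0f, 1f);
        sphere.transform.localScale = Vector3.one * 0.25f;

        // Tint the renderer's material (a simple stand-in for a texture).
        sphere.GetComponent<Renderer>().material.color = Color.cyan;
    }
}
```

Attach a script like this to any GameObject in the scene and the sphere appears when the scene starts playing.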
While in our case we worked with a single scene, the following should generalize to multiple scenes.
The project structure we settled on uses dimensionless GameObjects called “Managers” which hold the Manager Scripts that register, monitor, and dispatch project-level events.
An example is our “AppManager”. We ensure this script is a singleton and instantiate it on app launch. This is where we maintain the state of our application and instantiate all other GameObjects.
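As a sketch of what we mean (the class body is illustrative, not our actual code), the singleton guarantee for a MonoBehaviour can be enforced like this:

```csharp
using UnityEngine;

// Hedged sketch of the singleton pattern we use for "AppManager".
// Attach this to an empty "Managers" GameObject in the scene.
public class AppManager : MonoBehaviour
{
    public static AppManager Instance { get; private set; }

    void Awake()
    {
        // Enforce a single instance across scene loads.
        if (Instance != null && Instance != this)
        {
            Destroy(gameObject);
            return;
        }
        Instance = this;
        DontDestroyOnLoad(gameObject);
    }

    // Application-level state lives here; other scripts
    // read and update it via AppManager.Instance.
}
```

Any other script can then reach the shared state through `AppManager.Instance` without holding its own reference.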
The unidirectional Redux state pattern makes the most sense in this type of project, so working towards it will help keep the project scalable and maintainable. The InputManager prefab you find in the HoloToolkit exposes a series of events for actions such as air-tap, click, and gaze.
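As a rough sketch of subscribing to those events (interface and method names follow the HoloToolkit release we used; given how quickly these APIs are changing, they may differ in yours), a script on a gaze-targetable GameObject implements the relevant handler interfaces:

```csharp
using UnityEngine;
using HoloToolkit.Unity.InputModule;

// Sketch of handling HoloToolkit input events on a GameObject.
// Requires the InputManager prefab in the scene and a collider
// on this object so the gaze raycast can hit it.
public class TapResponder : MonoBehaviour, IInputClickHandler, IFocusable
{
    // Fired when the user air-taps while gazing at this GameObject.
    public void OnInputClicked(InputClickedEventData eventData)
    {
        // Dispatch an action to the app-level state here (Redux-style).
        Debug.Log(gameObject.name + " was air-tapped.");
    }

    // Fired when the gaze cursor enters this GameObject.
    public void OnFocusEnter()
    {
        GetComponent<Renderer>().material.color = Color.yellow;
    }

    // Fired when the gaze cursor leaves this GameObject.
    public void OnFocusExit()
    {
        GetComponent<Renderer>().material.color = Color.white;
    }
}
```

The InputManager routes events to whichever object is currently focused, so no explicit subscription call is needed beyond implementing the interfaces.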
An Example Project
This is a quick project we put together here at the Foundry. It was a proof of concept / exploration of AR applications in real-world settings such as the grocery store. More information can be found here.
The next steps are to use the updated Spatial Understanding API to automate the placement of the holograms, and to experiment with computer-vision APIs such as Vuforia for more interesting tasks such as inventory logging and item detection.
Github repository: https://github.com/cornelltech/arlane
Additional Resources
- Unity Official Tutorials Docs: The official learning material put together by the people at Unity. If you have never worked in a game engine / 3D environment, start here to get your head around a typical Unity project’s structure and components.
- Microsoft’s Unity HoloToolkit: An invaluable set of APIs and resources compiled by Microsoft’s various studios and teams. Documentation is sparse; the most relevant files are contained in Assets/HoloToolkit/.
- Microsoft’s Mixed-Reality Academy: The official docs and tutorials for building HoloLens applications, from Microsoft. As of this writing, this resource is out of date with respect to the official libraries, but I am sure Microsoft will update its documents once the API is more stable.
- Foundry / Cornell Tech / AOL Collaboration: This is the project page for the AR exploration we did in collaboration with Cornell Tech’s Connected Experience Lab and AOL.