Overview
Background
In October 2017, the Windows 10 Fall Creators Update introduced the new Windows Mixed Reality (WMR) immersive headsets and added virtual reality (VR) support to Windows. This was Microsoft's first foray into the VR space after shipping the HoloLens a year prior.
I was part of the Mixed Reality design team that had just finished work on the HoloLens, and shortly thereafter transitioned into pre-production efforts for VR.
My Role
Within the Mixed Reality design team, there was an even smaller prototyping team. As a member of that team, I was responsible for creating prototypes that explored new designs for the Mixed Reality platform. One of our challenges was to adapt the Holographic Shell from the HoloLens to VR headsets running Windows.
Projects
I've selected several prototypes from my time on the Mixed Reality design team that best represent my work.

VR Locomotion
Problem
We needed to create our own version of 6DoF (six degrees of freedom) controller teleportation for the WMR shell and all of our prototypes. Our intent was to mimic Valve's VR locomotion methods and build a Unity demo that would run on SteamVR.
🎯The Goal: Create a functioning teleportation system and start to explore different ideas for locomotion in VR.
Prototype
This was an essential prototype because we needed teleportation mechanics to traverse space more easily in VR. While smooth movement (or walking) was possible, it can cause motion sickness for many users. The prototype features:
- Arc teleportation, enabling users to teleport onto flat surfaces located above them.
- The ability to target and teleport to objects in the distance.
- Visual and haptic feedback when teleportation cannot be used, and the option to rotate your orientation while teleporting.
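To give a sense of how arc teleportation targeting can work, here is a minimal sketch of the underlying math: sample points along a ballistic arc launched from the controller until one drops below the floor plane, and use that point as the teleport target. This is an illustration only; the function name, constants, and the assumption of a flat floor at y = 0 are mine, not the actual WMR shell implementation (which would typically raycast against Unity colliders instead).

```python
def arc_teleport_target(origin, direction, speed=8.0, gravity=9.8,
                        step=0.02, max_steps=500):
    """Return the (x, y, z) point where the arc meets the floor, or None.

    origin: controller position; direction: normalized launch direction.
    All names and constants are hypothetical, for illustration.
    """
    x0, y0, z0 = origin
    dx, dy, dz = direction
    prev = origin
    for i in range(1, max_steps + 1):
        t = i * step
        # Ballistic motion: constant horizontal velocity, gravity pulls y down.
        point = (x0 + dx * speed * t,
                 y0 + dy * speed * t - 0.5 * gravity * t * t,
                 z0 + dz * speed * t)
        if point[1] <= 0.0:  # the arc crossed the floor plane
            # Interpolate between the last two samples for a cleaner hit point.
            frac = prev[1] / (prev[1] - point[1])
            return tuple(p + (q - p) * frac for p, q in zip(prev, point))
        prev = point
    return None  # arc never reached the floor within the sampled range
```

Because the arc curves over obstacles, this style of targeting is what lets users reach flat surfaces above them, as described in the feature list.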
Results
The teleportation system created for this prototype was reused in all of our future VR prototypes, and the core functionality made its way into the shell. After this prototype, there were several others that pushed the teleportation mechanics further, and the final version received a great deal of visual design polish.

Portal VFX
Problem
When users first launch the Windows Mixed Reality Portal app, they are taken through a device setup process and an introductory tutorial, called the "First Experience" or the out-of-box experience (OOBE).
The First Experience was designed to have a final portal that would transport users to their home environment (known as the Cliffhouse). I was asked to work on pre-visualization for this final portal.
🎯The Goal: Create several different VFX explorations for the First Experience's final portal.
Design
I collaborated with Scott Petill, who designed the entire user flow of the First Experience. It was meant to be a self-contained experience that teaches users how to interact and get around in VR.
Prototype
I created a few different visual effects (VFX) concepts that experimented with the look of the First Experience's final portal. I also worked on the interaction and animation that occurs when a user walks into the portal.
Results
The First Experience that's currently accessible in the Mixed Reality Portal has a final portal that's fairly reminiscent of the ones that I created for the prototype. You can see this final result in the video below:

Instinctual Interactions
Problem
The purpose of this prototype was to evaluate the use of articulated hand tracking to touch and interact with virtual objects. We wanted to figure out how much feedback, or assistance, the shell should give when placing objects, and what level of precision we could expect from hand tracking.
🎯The Goal: Create a 3-game hand tracking prototype that has difficulty modes with varying amounts of feedback and assistance.
Design
The design for this prototype was created by Scott Petill. It consists of three games with easy, normal, and hard modes that respectively adjust the amount of feedback and assistance given to users. Each game was designed to support grabbing, holding, releasing, and nudging (pushing) objects.
Interacting with objects, or holding them near their objective, triggers both visual and audio feedback. To assist users, objects that are released within proximity of their target destination automatically snap to it.
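The snapping assistance described above can be reduced to a simple distance check on release. The sketch below is a hypothetical illustration of that idea, assuming the per-difficulty snap radii are tunable values; the names and numbers are mine, not from the actual prototype.

```python
import math

# Hypothetical snap radii (in meters) per difficulty mode; "hard"
# disables snapping entirely, matching the "less assistance" design.
SNAP_RADIUS = {"easy": 0.20, "normal": 0.10, "hard": 0.0}

def on_release(position, target, difficulty="normal"):
    """Snap a released object to its target if it lands close enough."""
    radius = SNAP_RADIUS[difficulty]
    if math.dist(position, target) <= radius:
        return target  # close enough: snap into place
    return position    # otherwise leave the object where it was dropped
```

Keeping the snap radius generous on easier modes is what alleviates the precision frustration noted in the results below.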
Prototype
We used the Leap Motion for hand tracking in this prototype because it already had support for Unity. I modeled and set up all of the assets to provide an adjustable level of feedback (color and audio) and assistance (object snapping) when manipulating objects.
The prototype is made up of three different mini-games that require increasing levels of hand precision:
- Object Evaporator - Grab and drop objects into the bin.
- Pyramid Builder - Stack puzzle pieces in the correct order.
- Shape Sorter - Place shapes in their corresponding receptacle.
Results
The prototype was handed over to user research, and the findings carried over to the hand tracking and instinctual interactions developed for the HoloLens 2. Researchers found that:
- While picking up palm-sized objects is fairly easy, hand tracking struggles with grabbing smaller objects.
- Feedback outlines help when hand tracking fails to register a grab, or when objects are very close together.
- Placing objects with precision can be frustrating, but generous snapping distances help alleviate that.

People First Menu
Problem
This project was started during a time when the team was working on multi-user experiences. There was a desire to revisit the shell's UI, which just showed apps, and expand it to include contacts (people or friends), events (meetings), places, groups, and invites. We were asked to take the Start Menu and push it in a new direction, where people, or social interactions, were the primary focus.
🎯The Goal: Design and prototype a new 3D Start Menu that expands upon its base functionality and puts people first.
Research
Before designing the new menu, I spent some time gathering references and looking around to see what concepts other teams had already come up with.
Ideation
We created a number of sketches and mockups trying to figure out what the menu should look like. Special attention was paid to the layout of the contacts list. We also began to lean towards encapsulating the UI in volumetric (3D) bubbles.
Design
The People First Menu was designed around the friend invite scenario, where a user can invite their contacts into a group, change the group's current activity (app or place), and send notifications to group members.
I was responsible for designing and modeling the menu. I wanted to embrace VR by creating a more volumetric UI that supported direct interactions. Additional effort went into simplifying the menu and aligning it with the visual style of the shell.

Prototype
I worked with Jonathan Palmer on this prototype. It was built as a proof of concept, where there's no real data behind its functionality. Our goal was to see how the volumetric UI looks and feels to interact with. Users can send invites to people by dragging and dropping them into an activity, meeting, or group (the physical tray).
Results
We hoped to use our learnings from this project to guide the shell's UI redesign. Unfortunately, most of our team's multi-user efforts never went anywhere, but I did eventually move over to the AltspaceVR (a social VR app) team, where I was able to have an impact on their UI instead.

Desktop Portal
Problem
Another team was working on a new mixed reality app that would be capable of streaming a user's desktop to a window (i.e. portal or slate) in VR. They wanted to see how it felt to move mixed reality apps between the 3D environment and the 2D desktop window. I was asked to turn their concepts into a working prototype.
🎯The Goal: Create a prototype that allows users to move apps between the mixed world and the virtual desktop portal.
Design
I collaborated with Noe Barragan, the designer on this project. In the designs, if an app is dragged out of the desktop window, it will be converted into a 3D app slate. If an app has no VR capabilities, it will remain within the virtual desktop and provide negative feedback. The opposite is true for VR apps that don't support the desktop.
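The drag rules described above boil down to a capability check at the boundary: an app only converts if it supports the destination, and otherwise bounces back with negative feedback. The sketch below is a hypothetical illustration of that logic; the app structure and names are invented for the example.

```python
def try_move(app, destination):
    """Resolve a drag to 'desktop' or 'world'; return (location, feedback).

    Apps are assumed to support both sides unless flagged otherwise.
    """
    supports = {"desktop": app.get("supports_desktop", True),
                "world": app.get("supports_vr", True)}
    if supports[destination]:
        return destination, "converted"        # e.g. becomes a 3D app slate
    return app["location"], "negative_feedback"  # stays put, signals refusal

# A desktop-only app refuses to leave the virtual desktop:
legacy = {"name": "Legacy App", "location": "desktop", "supports_vr": False}
print(try_move(legacy, "world"))
```

The edge-case apps mentioned in the prototype section would simply be the ones with one of these capability flags set to False.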
Prototype
This prototype imitates some of the virtual desktop's functionality. Users can spawn apps into the mixed world and freely move them in and out of the desktop window. Certain apps were set up as edge cases, where they're unable to enter or leave the desktop.
Results
The team working on the virtual desktop app was able to make use of my prototype, but the product never shipped. Fortunately, other solutions now exist that let you see your desktop in VR, like the Virtual Desktop app.


Conclusion
Reflection
Being a part of the Windows Mixed Reality design team was a great experience. It was nice to work on VR again after spending two years on the HoloLens. I enjoyed having the opportunity to create a variety of prototypes, each exploring some sort of new interaction.
Ultimately, I was able to help the team ship the WMR shell as a part of the Windows 10 Fall Creators Update.
Takeaways
- It's helpful to build VR locomotion that supports both teleportation and walking or dashing around.
- Motion controllers and hand tracking allow you to create more natural interactions that people intuitively understand.
- VR is a relatively new space, so why not break away from what's already been done and try something different?
Credits
Engineers
- Patrick Sebring
- Joshua Neff
- Wanni Busch
- Kevin Katona
- Jonathan Palmer
Designers
- Scott Petill
- Jenny Kam
- Noble Woods (Me)
- Noe Barragan
Project Managers
- Maria Cameron
- Miron Vranjes
- Chaitanya Sareen
Artists
- Rudy Vessup