HoloLens - Hero Image

HoloLens

AR prototypes exploring core interactions within the HoloLens shell.

Company

Microsoft

Duration

2 Years • Nov 2014 - Aug 2016

Role

Prototyping, Gameplay Programming, Animation

Tools
Unity
Maya
Photoshop
Illustrator
01

Overview

Background

The Microsoft HoloLens is a mixed reality headset that allows users to see and interact with holographic content. When wearing a HoloLens and experiencing "mixed reality," you're able to see both the real world and virtual content at the same time. It's essentially augmented reality that can read your surroundings and add digital objects to them.

The first HoloLens Development Edition shipped on March 30, 2016. I joined the HoloLens design team in late 2014, prior to the product's public announcement.

My Role

I was one of the designers responsible for building prototypes to validate designs for the Windows Holographic Shell (or Holographic OS). Our core prototyping team was focused on iterating quickly and evaluating a variety of design solutions.

Much of what exists in the HoloLens shell passed through prototyping and user research before moving on to the visual design and engineering teams.

HoloLens - Marketing image
HoloLens - Marketing image #2
HoloLens - Marketing image #3
HoloLens - Marketing image #4

Projects

I was involved in a number of prototypes during my time on the HoloLens design team, but I've chosen a handful that best showcase my work.

02

World Placement

Problem

App placement on surface reconstruction (a 3D map of the environment) was initially rough and inaccurate. We needed to improve it so that placing apps felt smoother and easier.

🎯The Goal: Improve app placement on surface reconstruction (i.e. spatial mesh) and recreate app behavior from the shell.

Design

Before we could really test the placement logic, I needed to recreate parts of the HoloLens shell in Unity. This involved reconstructing the start menu, slates (2D app windows), livecubes (3D app launchers or holograms), and immersive apps (virtual environments).

Once they were built, we attached a tag-a-long behavior to the start menu and apps, which caused them to slowly follow your gaze direction and adjust to the surfaces behind them.
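
To give a sense of how a tag-along behavior like this can be wired up, here's a minimal Unity sketch. The parameter names and the spatial mesh layer are illustrative assumptions, not the shell's actual implementation.

```csharp
using UnityEngine;

// Minimal sketch of a tag-along behavior: the object drifts toward a point in
// front of the user's gaze and backs off any surface behind it.
public class TagAlong : MonoBehaviour
{
    public float preferredDistance = 2.0f; // how far in front of the user to float
    public float followSpeed = 2.5f;       // higher = snappier follow
    public float surfaceOffset = 0.05f;    // keep a small gap from walls
    public LayerMask spatialMeshLayer;     // layer the surface reconstruction lives on

    void LateUpdate()
    {
        Transform head = Camera.main.transform;

        // Default target: a point along the gaze ray at the preferred distance.
        Vector3 target = head.position + head.forward * preferredDistance;

        // If the spatial mesh is closer than that, sit just in front of it instead.
        if (Physics.Raycast(head.position, head.forward, out RaycastHit hit,
                            preferredDistance, spatialMeshLayer))
        {
            target = hit.point - head.forward * surfaceOffset;
        }

        // Ease toward the target so the app lags slightly behind head motion.
        transform.position = Vector3.Lerp(transform.position, target,
                                          followSpeed * Time.deltaTime);

        // Keep the app facing the user.
        transform.rotation = Quaternion.LookRotation(transform.position - head.position);
    }
}
```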

Prototype

SR Placement

I worked alongside other engineers, like David Calabrese, who did most of the heavy lifting for the math involved in app placement along surface reconstruction (SR). Together as a team we:

  • Smoothed app placement against rough, jagged spatial meshes.
  • Configured the bubble where an app sits if there is no SR.
  • Added angular scaling, where the relative size and resolution of an app are preserved regardless of your distance from it (see the sketch after this list).
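
Angular scaling boils down to rescaling the app linearly with its distance from the user so it always subtends the same visual angle. A minimal Unity sketch, with referenceDistance as an assumed tuning parameter:

```csharp
using UnityEngine;

// Minimal sketch of angular scaling: the app is rescaled in proportion to its
// distance from the user so its apparent size and resolution stay constant.
public class AngularScale : MonoBehaviour
{
    public float referenceDistance = 2.0f; // distance at which the app has its base scale
    private Vector3 baseScale;

    void Start()
    {
        baseScale = transform.localScale;
    }

    void LateUpdate()
    {
        float distance = Vector3.Distance(Camera.main.transform.position, transform.position);

        // Scaling linearly with distance keeps the subtended angle roughly constant
        // as the app moves nearer or farther.
        transform.localScale = baseScale * (distance / referenceDistance);
    }
}
```
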
Prototype

Suspended Apps

One part of recreating the HoloLens shell was getting apps to behave as they do in the real shell: only one app can be active at a time, and selecting an app activates it while suspending all other apps.

In computing, a "shell" is the outermost layer around an operating system, otherwise known as the user interface.
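
As a rough illustration of that single-active-app rule, here's a minimal Unity sketch. AppManager and ShellApp are hypothetical stand-ins, not the shell's real classes.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Minimal sketch of the single-active-app rule: selecting an app activates it
// and suspends every other registered app.
public class AppManager : MonoBehaviour
{
    private readonly List<ShellApp> apps = new List<ShellApp>();

    public void Register(ShellApp app) => apps.Add(app);

    // Called when the user air taps an app.
    public void Activate(ShellApp selected)
    {
        foreach (ShellApp app in apps)
        {
            if (app == selected) app.Resume();   // bring the chosen app to life
            else app.Suspend();                  // freeze everything else
        }
    }
}

// Stand-in for a slate or livecube that can pause and resume its content.
public class ShellApp : MonoBehaviour
{
    public virtual void Resume()  { /* restart rendering, audio, input */ }
    public virtual void Suspend() { /* pause content, keep last frame visible */ }
}
```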

Prototype

Launching Apps

In the shell, launched apps follow your gaze until you tap to pin them in place. When a pinned app is selected, it will either open within a 2D window or take over your space as an immersive app view.

Results

The new methods we developed for app placement and scaling eventually made their way into the product, and the replica shell we created was reused across future prototypes.

03

Content Scrolling

Problem

The HoloLens shell already had a default mode of scrolling called "joystick" scrolling. We needed to recreate this form of scrolling in Unity for user research, and were also asked to explore alternative scrolling methods.

🎯The Goal: Explore alternative types of drag based scrolling. Compare the scrolling methods in a user research study.

Design

The HoloLens is capable of using voice commands, head gaze, controllers, and hand gestures for input. Any app or element can be selected by air tapping your fingers (equivalent to a mouse click). We extended that air tap further by allowing users to hold the gesture and drag their hand to scroll or manipulate objects.

The concept below (created by Scott Petill) shows what joystick scrolling looks like, where the scroll direction and speed are determined by the offset of your hand from its air tap origin.
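
A minimal sketch of that joystick behavior, assuming simple hand-input callbacks (OnAirTapDown, OnHandMoved, OnAirTapUp) that a gesture system would provide:

```csharp
using UnityEngine;

// Minimal sketch of joystick scrolling: while the air tap is held, the content
// scrolls at a speed proportional to how far the hand has moved from where the
// tap began. The hand-input callbacks are assumptions, not a real HoloLens API.
public class JoystickScroll : MonoBehaviour
{
    public Transform content;   // the scrollable content inside a slate
    public float gain = 2.0f;   // converts hand offset (meters) into scroll speed

    private Vector3 dragOrigin;
    private Vector3 handPosition;
    private bool dragging;

    public void OnAirTapDown(Vector3 hand)
    {
        dragOrigin = handPosition = hand;
        dragging = true;
    }

    public void OnHandMoved(Vector3 hand) => handPosition = hand;

    public void OnAirTapUp() => dragging = false;

    void Update()
    {
        if (!dragging) return;

        // Offset from the tap origin acts like a joystick deflection:
        // the farther the hand moves, the faster the content scrolls.
        Vector3 offset = handPosition - dragOrigin;
        content.localPosition += new Vector3(offset.x, offset.y, 0f) * gain * Time.deltaTime;
    }
}
```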

Prototype

Scroll Velocity

We introduced a new form of scrolling based on hand movement and velocity, called "drag" scrolling (similar to fling scrolling on touch screens). We added this to a fake web browser and a map that you could pan (both vertical and horizontal scrolling). The 3rd window used the same core hand tracking system, but for drawing instead.
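
Drag scrolling, in contrast, moves the content 1:1 with the hand and keeps the release velocity for a fling. A minimal sketch, with the same assumed hand-input callbacks as above:

```csharp
using UnityEngine;

// Minimal sketch of drag ("fling") scrolling: content tracks the hand while the
// gesture is held, then coasts to a stop using the release velocity, similar to
// touch-screen scrolling.
public class DragScroll : MonoBehaviour
{
    public Transform content;
    public float friction = 4.0f;      // how quickly the fling decays

    private Vector3 lastHandPosition;
    private Vector3 velocity;
    private bool dragging;

    public void OnAirTapDown(Vector3 hand)
    {
        lastHandPosition = hand;
        dragging = true;
    }

    public void OnHandMoved(Vector3 hand)
    {
        if (!dragging) return;

        Vector3 delta = hand - lastHandPosition;
        lastHandPosition = hand;

        // Content follows the hand directly; remember the speed for the fling.
        content.localPosition += new Vector3(delta.x, delta.y, 0f);
        velocity = delta / Time.deltaTime;
    }

    public void OnAirTapUp() => dragging = false;

    void Update()
    {
        if (dragging) return;

        // After release, keep moving with the last hand velocity and slow down.
        content.localPosition += new Vector3(velocity.x, velocity.y, 0f) * Time.deltaTime;
        velocity = Vector3.Lerp(velocity, Vector3.zero, friction * Time.deltaTime);
    }
}
```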

Prototype

User Research

The primary purpose of this study was to compare joystick and drag scrolling to determine which method users preferred. We worked with user researchers to create a survey and track metrics that could tell us which scrolling method was faster for users.

Results

While drag scrolling did not become the new default method of scrolling, it did eventually make its way into the HoloLens 2 in a different form. On the HoloLens 2, users can scroll content by swiping ("dragging") on the surface of the content with their finger.

04

Portable Workspace

Problem

The "Portable Workspace" is actually a collection of related prototypes. At the core of it is the goal of proving out the "carry mode", which would later be renamed to "interactive float" or "follow me."

Carrying an app makes it follow the user as they walk around. An app that's being carried can still be interacted with, even while on the move. On top of this idea of carry was the need for app management via some sort of App Switcher, or what we called the "Summoner."

🎯The Goal: Reintroduce and improve Carry Mode to demonstrate its utility and build the Summoner to manage opened apps.

Design

The Summoner was designed for easy access to and control of apps. When a user opens the Summoner, they will see the app switcher menu showing all of the apps placed ("opened") in the world.

Another concept was the idea of separate home and work spaces, where you could summon an app placed in the other space or send it back ("return" it).

Summoner - Design (by Jenny Kam)
Prototype

Summoner

The Summoner prototype added the app switcher menu, with support for app placement and management. From this menu, users can "summon" or close an app from afar. Summoning an app calls it to you, where it can be interacted with, placed, or "returned" back to its original location.
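
The summon/return behavior boils down to saving and restoring an app's world pose. Here's a minimal Unity sketch with illustrative names, not the shell's actual API:

```csharp
using UnityEngine;

// Minimal sketch of summon/return: summoning stores the app's world pose and
// brings it in front of the user; returning restores the stored pose.
public class SummonableApp : MonoBehaviour
{
    public float summonDistance = 1.5f;

    private Vector3 savedPosition;
    private Quaternion savedRotation;
    private bool summoned;

    public void Summon()
    {
        if (summoned) return;
        savedPosition = transform.position;   // remember where it was placed
        savedRotation = transform.rotation;

        Transform head = Camera.main.transform;
        transform.position = head.position + head.forward * summonDistance;
        transform.rotation = Quaternion.LookRotation(transform.position - head.position);
        summoned = true;
    }

    public void Return()
    {
        if (!summoned) return;
        transform.position = savedPosition;   // send it back to its original spot
        transform.rotation = savedRotation;
        summoned = false;
    }
}
```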

Prototype

Interactive Float

For the Interactive Float prototype, most of my time was spent creating semi-functional Netflix, Holograms, Edge Browser, and Outlook apps.

I had already implemented the float mode, but needed to allow people to interact more meaningfully with apps while on the move. Before people could buy into the idea of floating apps, they needed to see how it felt to use them.

Prototype

Follow Belt

In the Follow Belt prototype, "float" was renamed to "follow." When a user selects "Follow" on an app's holobar (i.e. toolbar), that app will follow and maintain its position relative to the user, revolving around them along an invisible follow belt.

By grabbing an app's holobar, users can move and rearrange apps along their follow belt. The follow belt acts as a "portable workspace," allowing you to take multiple apps with you while on the go.
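
Conceptually, the follow belt gives each following app a slot (an angle) on a ring around the user and keeps that slot aligned with the user's facing direction. A minimal Unity sketch, with the radius and slot assignment as assumptions:

```csharp
using UnityEngine;

// Minimal sketch of the follow belt: each following app keeps a fixed angle and
// height on an invisible ring around the user, so the arrangement travels with them.
public class FollowBeltItem : MonoBehaviour
{
    public float beltRadius = 1.2f;   // distance of the belt from the user
    public float beltAngle = 0f;      // this app's slot on the ring, in degrees
    public float heightOffset = -0.2f;
    public float followSpeed = 3.0f;

    void LateUpdate()
    {
        Transform head = Camera.main.transform;

        // Place the slot relative to the user's current facing direction,
        // rotated around the vertical axis by this app's belt angle.
        Vector3 flatForward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        Vector3 slotDirection = Quaternion.AngleAxis(beltAngle, Vector3.up) * flatForward;
        Vector3 target = head.position + slotDirection * beltRadius + Vector3.up * heightOffset;

        transform.position = Vector3.Lerp(transform.position, target, followSpeed * Time.deltaTime);
        transform.rotation = Quaternion.LookRotation(transform.position - head.position);
    }
}
```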

Results

The Summoner never made it into the HoloLens shell, but carry/float/follow went on to become the "Follow me" mode, with a button located in the top right corner of every app window.

05

Targeting Game

Problem

We were asked to create a prototype for user research that would record telemetry data on user air tap accuracy (i.e. finger press or "click"). We decided that a simple targeting game would be the best way to gather data from users.

🎯The Goal: Create a targeting game that uses telemetry to gather information on user air tap accuracy.

Design

On the HoloLens, gaze input tracks where a person's head is turned and whether they're looking at ("targeting") any objects. Hand gestures work in conjunction with gaze to select or interact with targeted items.

There were concerns about gaze and air tap accuracy, so we wanted the prototype to reveal just how difficult it is for users to gaze at and select targets of varying sizes.

Targeting Game - Play Space

Prototype

I worked with David Calabrese to build the Targeting Game prototype. For the game's theme, we went with a balloon popping carnival game.

There are three rounds with three waves of balloons in each round. Smaller balloons are worth more points (harder targets to hit) and bonus points are awarded for faster targeting. After the final round, users are shown their score and can enter their initials.
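
A minimal sketch of the scoring and telemetry idea: smaller balloons score higher, fast pops earn a bonus, and each pop logs how far the tap landed from the balloon's center. The constants and log format are assumptions:

```csharp
using UnityEngine;

// Minimal sketch of a balloon target with size-based scoring, a speed bonus,
// and per-pop accuracy telemetry.
public class Balloon : MonoBehaviour
{
    public float radius = 0.15f;          // smaller balloons = harder targets
    public float basePoints = 100f;
    public float bonusWindow = 2.0f;      // seconds to earn the speed bonus

    private float spawnTime;

    void Start() => spawnTime = Time.time;

    // Called when the user air taps while gazing at this balloon.
    public int Pop(Vector3 gazeHitPoint)
    {
        float timeToHit = Time.time - spawnTime;

        // Smaller radius scales the score up; a fast pop adds a bonus.
        float sizeMultiplier = 0.3f / Mathf.Max(radius, 0.01f);
        float speedBonus = timeToHit < bonusWindow ? 50f : 0f;
        int score = Mathf.RoundToInt(basePoints * sizeMultiplier + speedBonus);

        // Telemetry: record how accurately the tap landed relative to the center.
        float missDistance = Vector3.Distance(gazeHitPoint, transform.position);
        Debug.Log($"balloon popped: radius={radius} time={timeToHit:F2}s miss={missDistance:F3}m score={score}");

        Destroy(gameObject);
        return score;
    }
}
```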

Results

There wasn't really an end result to this prototype beyond handing it off to user research, who ran their tests and hopefully put the results to good use. I was just happy for the excuse to make a small game.

06

Virtual Assistant

Problem

The Virtual Assistant was a challenging series of prototypes to create. Its purpose was to test, and either validate or reject, a number of design ideas for how the Cortana virtual assistant should behave.

🎯The Goal: Explore interactions with the Cortana virtual assistant and its behavior, locomotion, and responses to the user.

Design

The Cortana virtual assistant is the physical embodiment of the Cortana voice assistant, but built specifically for the different types of input available in the Holographic shell (gaze, voice, gestures, environment). The primary means of interacting with the assistant is through voice commands.

Scott Petill was the lead designer I worked with; his documentation served as a guide for the prototypes. The virtual assistant was designed to help users with various tasks, answer questions, and act as their companion.

Prototype

Cortana Scale

One of the earliest prototypes was for figuring out the ideal scale of the virtual assistant. Users can tap to move the assistant around or scale them up and down by dragging their hand.

Prototype

Cortana Motion

This prototype explored the different types of locomotion the virtual assistant is capable of. The assistant can navigate the space intelligently, avoiding collisions as it walks along the surface mesh, or switching to hover in the air when there are gaps or large height differences. Users are able to issue different voice commands (routed roughly as sketched after this list):

  • "Come here" - Summons the assistant to you.
  • "Move/go there" - Makes the assistant walk or fly to the point you're looking at.
  • "Teleport there" - Causes the assistant to instantly teleport to where you're looking.
Prototype

Engagement Sandbox

This prototype was primarily focused on enabling conversations with the virtual assistant. I worked with Jean-Louis Villecroze to hook up Azure Cognitive Services, granting the assistant natural language understanding and the ability to respond to your questions (albeit in a pre-scripted or sequenced manner).

Additional work was put into the assistant's animations and different states, such as when it's listening, thinking (processing speech and generating a response), responding, or idle.
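
Those states map naturally onto a small state machine. Here's a minimal Unity sketch; the Animator triggers and event names are assumptions:

```csharp
using UnityEngine;

// Minimal sketch of the assistant's conversational states and how animation
// might switch between them.
public class AssistantStates : MonoBehaviour
{
    public enum State { Idle, Listening, Thinking, Responding }

    public Animator animator;
    public State Current { get; private set; } = State.Idle;

    public void SetState(State next)
    {
        if (next == Current) return;
        Current = next;

        // One trigger per state keeps the animation graph simple to reason about.
        animator.SetTrigger(next.ToString());
    }

    // Typical flow: speech detected -> Listening, request sent to the language
    // service -> Thinking, reply received -> Responding, playback done -> Idle.
    public void OnSpeechDetected()   => SetState(State.Listening);
    public void OnQuerySubmitted()   => SetState(State.Thinking);
    public void OnResponseReady()    => SetState(State.Responding);
    public void OnResponseFinished() => SetState(State.Idle);
}
```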

Prototype

Interaction Sandbox

This prototype was an extension of all the previous ones. New functionality was added to the virtual assistant for opening apps like Edge (a fake version), and responding to questions like "How are you doing?" or "What's the meaning of life?"

The assistant's interaction modes were also expanded upon. These covered things like whether to show UI around the assistant, or whether you could look at it and speak without needing to say "Hey Cortana/Lumen" first.

Results

The Cortana virtual assistant never made it into the HoloLens shell, but work on it continued in some form or another for several years, passing between different teams and evolving over time.

07

Holobar

Problem

The Holobar was one of the first prototypes that I worked on when I started at Microsoft. Users needed a way to easily access controls for content placed in mixed reality space. The solution was the holobar, a toolbar designed for both slates (2D app windows) and live cubes (3D holograms).

🎯The Goal: Implement the holobar and all of its functionality for slates and live cubes.

Design

The holobar's purpose was to add affordances to each object, improving usability and letting users adjust their content. The holobar contains a collection of controls that perform actions such as 'close', 'organize' (move or scale the object), 'carry' (object follows your gaze), and 'back'.

All of the designs for this project were created by Yasaman Sheri, and I was responsible for building them in Unity.

Prototype

This prototype added a functioning holobar to both slates and live cubes. The buttons along the holobar enable the user to carry or remove an object from their world. Organize mode was only visually stubbed in, and would later receive its own prototype.
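
A minimal sketch of how the holobar's buttons could drive those actions on a slate or live cube; the method names and message passing are assumptions, not the shell's actual API:

```csharp
using UnityEngine;

// Minimal sketch of a holobar wired to the controls listed above. The owner
// object stands in for a slate or live cube.
public class Holobar : MonoBehaviour
{
    public GameObject owner;              // the slate or live cube this bar belongs to
    public MonoBehaviour carryBehavior;   // e.g. a gaze-follow script like the earlier tag-along sketch

    public void OnClose()
    {
        // Remove the object (and its holobar) from the world.
        Destroy(owner);
    }

    public void OnCarry(bool enabled)
    {
        // While carrying, the object follows the user's gaze.
        if (carryBehavior != null) carryBehavior.enabled = enabled;
    }

    public void OnOrganize()
    {
        // Organize (move/scale) was only visually stubbed in at this stage.
        Debug.Log("Organize mode not implemented in this prototype.");
    }

    public void OnBack()
    {
        // Back is forwarded to the app's own navigation.
        owner.SendMessage("NavigateBack", SendMessageOptions.DontRequireReceiver);
    }
}
```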

Results

The Holobar prototype was a success and its functionality was reused across future prototypes. The work that we started evolved and made its way onto every object in the shell.

08

Transitions

Problem

The purpose of the Transitions prototype was to test how users respond to wait times while using the HoloLens. The team also wanted to know if the bounding box animations were clear enough for users to tell that their content was loading.

🎯The Goal: Recreate the loading animation design and add transitions between immersive apps and the shell.

Design

The visual design team created a couple of videos demonstrating the loading animation as it plays across an app's bounding box. Apps launch in placement mode, and after tapping to place them, they play their loading animation before opening up.
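
That launch flow (placement, then loading, then open) maps onto a simple state machine. Here's a minimal Unity sketch with assumed Animator trigger names:

```csharp
using UnityEngine;

// Minimal sketch of the launch flow described above: an app starts in placement
// mode, plays its loading animation once placed, then opens.
public class AppLaunchFlow : MonoBehaviour
{
    public enum Phase { Placement, Loading, Open }

    public Animator boundingBoxAnimator;  // drives the animation along the bounding box
    public Phase CurrentPhase { get; private set; } = Phase.Placement;

    // Called when the user air taps to pin the app in place.
    public void OnPlaced()
    {
        if (CurrentPhase != Phase.Placement) return;
        CurrentPhase = Phase.Loading;
        boundingBoxAnimator.SetTrigger("Loading");
    }

    // Called when the app's content has finished loading.
    public void OnContentReady()
    {
        if (CurrentPhase != Phase.Loading) return;
        CurrentPhase = Phase.Open;
        boundingBoxAnimator.SetTrigger("Open");
    }
}
```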

Prototypes

In the first prototype I implemented the animations for launching and loading apps. In the second, we wanted to get a better feel for the transition effect that ripples across the surface mesh when launching or closing immersive apps (which take over your entire view).

Results

The loading animations and transitions received more iteration from the visual design team. Mixed reality apps today feature a circular progress indicator when loading content. I never found out how users reacted to the wait time, but I'm sure they weren't thrilled.

09

Conclusion

Reflection

My time on the HoloLens design team was a whirlwind experience. It was my first job in big tech out of college, so there was a lot to learn and adjust to.

I came in as a designer, yet performed what was mostly an engineering role for a couple of years. Thanks to my team and their patience, I was able to ramp up quickly and make a contribution to the HoloLens.

Takeaways

  1. One of the biggest selling points for AR is taking your work on the go, freeing yourself from the constraints of desks or screens.
  2. Decorating a space with holograms is a compelling experience that will only get better as more devices gain spatial awareness.
  3. There's still a lot of untapped potential in AR for virtual assistants, especially given the recent advancements in natural language understanding.

Credits

Engineers

Project Managers

Designers

HoloLens - Design Team