Keep It Simple: Thoughts on Designing for Augmented Reality

We know that mobile is here to stay. But as a designer, I ask myself, “What now? What will be the next big thing?” What if designers could take their interface designs and put them in the real world? That would be pretty cool, right?

Well, the ability to have a Mission: Impossible-style interface has arrived through what are called “first-person” user interfaces.

So, what are they? Luke Wroblewski describes first-person interfaces as follows:

“First person user interfaces can be a good fit for applications that allow people to navigate the real world, ‘augment’ their immediate surroundings with relevant information, and interact with objects or people directly around them.”

Some predict that first-person user interfaces will one day be as popular as graphical user interfaces. These interfaces leverage the capabilities of mobile devices, such as GPS, the camera, and an Internet connection. Luke’s graphic below shows where this emerging interface stands in relation to other types of user interfaces; it suggests that mobile devices are in a period of growth, while wearables are an emerging trend.

Source: www.lukew.com

So what should a designer focus on when creating these types of interfaces?

A first-person interface mirrors your perspective of the world. Luke says these types of interfaces are typically used for applications that involve GPS or navigation. The natural, camera-based view of the world is then augmented with heads-up display (HUD) elements that emphasize the first-person perspective.

He also suggests that designers use colors and graphics that closely match the real surroundings. This is vital to the user experience, since the user is interacting with the world as they experience it. And, as always when designing for mobile, designers should rely on the gestures that are standard for the device: if the standard action is swiping, use a swipe.
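As a rough illustration, assuming an iOS app built with UIKit (the view controller and handler names here are mine, purely for the sketch), wiring up the platform’s standard swipe gesture might look like this:

```swift
import UIKit

class AROverlayViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Use the platform's standard swipe gesture rather than inventing a custom one.
        let swipe = UISwipeGestureRecognizer(target: self, action: #selector(showNextLayer))
        swipe.direction = .left
        view.addGestureRecognizer(swipe)
    }

    // Hypothetical handler: advance to the next set of AR markers.
    @objc private func showNextLayer() {
        print("Swiped left: show the next layer of markers")
    }
}
```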

What are the typical HUD elements found in first-person interfaces? Luke says that data windows, markers, orientation, transparent UIs, and spatial grids are used within the HUD.

Let’s look at the Layar app. It is a free augmented reality application that overlays digital information on what you see through your camera.

 

Source: http://5election.com/2010/06/18/augmented-reality-a-short-history-of-mobile-ar/

The designer needs to consider the negative space between the HUD elements and whatever sits behind them; in the Layar app, for example, the underlying camera imagery can be almost anything. For that reason, the graphical HUD elements should be non-photographic and flat on top of the real-world imagery. Augmented reality systems superimpose graphics for every perspective and adjust them to every movement of the user’s head and eyes. In the upper-right area of the Layar interface, an indicator displays the user’s current location.
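As a loose sketch of that flat, semi-transparent styling (the class name, copy, and styling values are assumptions for illustration, not Layar’s actual code), a location badge meant to sit on top of a live camera view might look like this:

```swift
import UIKit

/// A flat, non-photographic location indicator meant to sit on top of the live camera view.
final class LocationBadge: UILabel {
    override init(frame: CGRect) {
        super.init(frame: frame)
        text = "Downtown · 12 results"   // placeholder copy
        textColor = .white
        textAlignment = .center
        font = .boldSystemFont(ofSize: 14)
        // A solid color with partial transparency keeps the badge readable
        // without completely hiding the real-world imagery behind it.
        backgroundColor = UIColor.black.withAlphaComponent(0.6)
        layer.cornerRadius = 8
        layer.masksToBounds = true
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}
```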

Imagery and content also live within that space to inform the user about a specific location. In the Layar interface, for example, direction symbols and contact information appear once an icon is selected. Legibility against light or dark surroundings matters too, so the device’s awareness of the ambient environment must be taken into account. Layar keeps the icons close to the user at full opacity and fades the ones receding into the distance. Minimal navigation and careful use of transparency help convey the information to the user with ease.
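Here is a minimal sketch of that fading behavior, assuming we already know each marker’s distance from the user (the function name, distance cutoffs, and minimum alpha are illustrative, not Layar’s actual values):

```swift
import UIKit

/// Illustrative opacity ramp: markers near the user stay fully opaque,
/// and markers receding into the distance fade toward a minimum alpha.
func markerAlpha(distanceInMeters: Double,
                 nearLimit: Double = 50,      // fully opaque inside this radius (assumed value)
                 farLimit: Double = 500,      // maximally faded beyond this radius (assumed value)
                 minimumAlpha: CGFloat = 0.3) -> CGFloat {
    if distanceInMeters <= nearLimit { return 1.0 }
    if distanceInMeters >= farLimit { return minimumAlpha }
    // Linear falloff between the near and far limits.
    let t = (distanceInMeters - nearLimit) / (farLimit - nearLimit)
    return 1.0 - CGFloat(t) * (1.0 - minimumAlpha)
}

// Example: a point of interest 300 m away gets roughly 0.6 alpha.
let alpha = markerAlpha(distanceInMeters: 300)
```

A linear ramp is the simplest option; any easing curve would do, as long as nearby markers remain fully opaque and distant ones never fade away entirely.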

Below is another app, Nearest Places (which has since been folded into Augmented Reality Browser). It uses appropriately sized targets and maintains spacing between them very well.

Note the great use of contrast between the brightly colored icons and the dark backgrounds of the markers. Since this app is used in a city setting, the designer anticipated that the background environment would likely appear textured and muted. That is why the solid, bright, non-textural markers and icons work so well within the application.

You can also see that the designers focused on navigation first and then on the interaction design, which is vital to the application’s success. The capabilities of the device, such as positioning, motion, orientation, and proximity, were all considered. The app shows the user’s position and direction, treating the device as a true extension of the user.

 

Source: https://itunes.apple.com/us/app/new-york-nearest-subway/id323100520?mt=8
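To make that use of position and direction concrete, here is a rough sketch, assuming CoreLocation supplies the user’s coordinate and compass heading (the helper names, field-of-view value, and screen width are mine, not taken from either app), of computing the bearing to a point of interest so its marker can be offset horizontally on screen:

```swift
import CoreLocation
import Foundation

/// Bearing from the user to a point of interest, in degrees clockwise from true north.
func bearing(from user: CLLocationCoordinate2D, to poi: CLLocationCoordinate2D) -> Double {
    let lat1 = user.latitude * .pi / 180, lon1 = user.longitude * .pi / 180
    let lat2 = poi.latitude * .pi / 180, lon2 = poi.longitude * .pi / 180
    let dLon = lon2 - lon1
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    let degrees = atan2(y, x) * 180 / .pi
    return (degrees + 360).truncatingRemainder(dividingBy: 360)
}

/// Horizontal offset (in points) of a marker relative to the center of the screen,
/// given the device's compass heading and the camera's horizontal field of view.
func horizontalOffset(bearingToPOI: Double, deviceHeading: Double,
                      fieldOfView: Double = 60, screenWidth: Double = 390) -> Double {
    // Normalize the angular difference to the range -180...180 degrees.
    var delta = bearingToPOI - deviceHeading
    if delta > 180 { delta -= 360 }
    if delta < -180 { delta += 360 }
    // Map the angle onto screen points; markers outside the field of view fall off screen.
    return (delta / fieldOfView) * screenWidth
}
```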

I find that iconography is vital to the success of this type of interface. If designers neglect symbolism and the research behind semiotics, these interfaces may never succeed. Consider the enormous amount of research that went into designing highway road signs; I have no doubt we will need to do the same for first-person user interfaces so that they are immediately clear to the user.

The beauty of a first-person interface lies in its ability to directly translate real-life perspectives. After reading about first-person interfaces, I kept coming back to the old acronym KISS: keep it simple. Simplify, simplify, simplify.

A designer needs to remember that content comes first when designing for mobile and to shed unnecessary details. Think of the device as an extension of yourself.

 
