A Quick Guide to Designing for Augmented Reality on Mobile (Part 2)

Bushra Mahmood
5 min read · Feb 22, 2018

This article is part 2 of an ongoing series; catch up on Part 1, Part 3, and Part 4.

Before We Start…

I often get asked about the skills and tools required for designing AR experiences. The truth is, with the technology still in its infancy, there is no straightforward answer. Resources for designers are slowly starting to surface, and the exact skills depend on how motivated a designer is. Those skills will be covered at a later date.

The First Skill

However, there is one skill worth mentioning now, since it is the foundation of every experience:

Storytelling: The ability to explain a sequence of events or actions over time effectively.

It’s a step that is crucial for any motion or animation work, and it applies to AR design, too. The more detailed and specific the descriptions are, the easier they are to actually implement and eventually test.

“The user puts an object in the room”

vs

“The user drags and drops a 3D object onto a ground plane.”

The Basics

The following is a collection of basic interface patterns and behaviors that have started to emerge and are worth considering when designing an AR experience. For the sake of starting with the absolute basics, this will not cover object or facial recognition.

Adding content

How an object gets introduced into space can set the pace for the rest of the experience.

AUTO VS MANUAL

Should your content automatically place itself in the environment? Or should the user place the content manually? (A code sketch of option B follows the list below.)

A) Automatically insert content (e.g., candle appears on top of a table when the app launches).

B) Interacting with the target/space determines where the content gets placed (e.g., tap on a table to add a candle).

C) The user places the content freely, choosing exactly where it goes (e.g., drag a candle onto a table).
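To make option B concrete, here is a minimal Swift sketch using ARKit and SceneKit, assuming an ARSCNView running a world-tracking session with horizontal plane detection; the view controller, candle geometry, and sizes are placeholder assumptions rather than a recommended implementation.

```swift
import UIKit
import ARKit

class PlacementViewController: UIViewController {
    // Assumed to be an ARSCNView running a world-tracking session
    // with horizontal plane detection enabled.
    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        sceneView.addGestureRecognizer(tap)
    }

    // Option B: the user taps a detected surface to decide where the candle goes.
    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)
        // Hit-test the tap against detected horizontal planes (e.g., a tabletop).
        guard let hit = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first else { return }

        // Placeholder "candle": a small cylinder dropped at the hit location.
        // (A real app would offset the node by half its height so it sits on the plane.)
        let candle = SCNNode(geometry: SCNCylinder(radius: 0.03, height: 0.15))
        candle.simdTransform = hit.worldTransform
        sceneView.scene.rootNode.addChildNode(candle)
    }
}
```

Option C would run the same hit test continuously during a pan gesture while the user drags, and option A would skip the gesture entirely and place the node as soon as a suitable plane is detected.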

WWDC 2017 Apple ARKit demo

PLACEMENT INDICATORS

Use placement indicators to call out areas where a user can place content. How the indicator is visualized can also help solve several tasks at once. A square indicator, for example, is a strong pattern since it references the angle and perspective of the ground plane. It may also reference the scale of the content relative to the space. The indicator may also animate or play a compelling sequence that makes the interaction more meaningful (e.g., the object appears vs. the object unfolds out of a box).
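One way to prototype such an indicator, continuing the hypothetical ARKit/SceneKit setup above: hit-test from the center of the screen every frame and pin a flat square node to the result. Here `indicatorNode` is an assumed, pre-built flat square (e.g., an SCNPlane rotated to lie on the ground), and the method lives on a view controller that conforms to ARSCNViewDelegate.

```swift
// Called every frame by SceneKit; keeps a square placement indicator pinned
// to whatever detected plane sits under the center of the screen.
func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    DispatchQueue.main.async {
        let center = CGPoint(x: self.sceneView.bounds.midX, y: self.sceneView.bounds.midY)
        if let hit = self.sceneView.hitTest(center, types: .existingPlaneUsingExtent).first {
            // A plane was found: show the indicator at that spot, matching the
            // plane's position and orientation so it reads as "on the ground".
            self.indicatorNode.isHidden = false
            self.indicatorNode.simdTransform = hit.worldTransform
        } else {
            // No surface under the reticle yet: hide the indicator.
            self.indicatorNode.isHidden = true
        }
    }
}
```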

(left) Lowe’s Vision (right) Wayfair App

ALERTS & REWARDS

Rewards motivate users to continue a story, move on to the next step, or complete an achievement. Alerts can grab their attention when they’re facing the wrong direction. It is important to visualize significant moments and actions in a way that considers the whole environment (e.g., confetti exploding everywhere after an achievement).
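As a rough sketch of a whole-environment reward, the snippet below bursts a short-lived particle system at a given position in a SceneKit-backed AR scene; the helper name and all particle values are untuned placeholder assumptions.

```swift
import SceneKit
import UIKit

// A minimal "confetti burst" reward for a SceneKit-backed AR scene.
// All particle values below are placeholder assumptions, not tuned settings.
func emitConfetti(at position: SCNVector3, in scene: SCNScene) {
    let confetti = SCNParticleSystem()
    confetti.birthRate = 400              // particles per second
    confetti.emissionDuration = 0.3       // short burst rather than a stream
    confetti.loops = false
    confetti.particleLifeSpan = 2
    confetti.particleSize = 0.01          // meters
    confetti.particleVelocity = 1.5
    confetti.spreadingAngle = 180         // scatter in every direction
    confetti.particleColor = .magenta

    // Attach the emitter to a node at the achievement's location and let it fire once.
    let emitter = SCNNode()
    emitter.position = position
    emitter.addParticleSystem(confetti)
    scene.rootNode.addChildNode(emitter)
}
```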

In Conduct AR, once a level is completed, fireworks light up the space to make sure the user doesn’t miss the moment

OFF SCREEN INDICATORS

Plan and scope for ways to alert a user if an object moves or lands outside of the device’s view. Video games often use this convention with a dynamic arrow at the edge of the screen, or a compass that points in the direction of the obscured content.
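One common way to drive such an arrow is to express the target’s position in the camera’s own coordinate space and derive an angle from it. A small SceneKit sketch follows; the function name and surrounding setup are assumptions.

```swift
import SceneKit
import ARKit

// Returns the angle (in radians) that an edge-of-screen arrow should point
// toward a target node, or nil if the node is already in the camera's view.
func offscreenArrowAngle(to target: SCNNode, in sceneView: ARSCNView) -> CGFloat? {
    guard let pov = sceneView.pointOfView,
          !sceneView.isNode(target, insideFrustumOf: pov) else { return nil }

    // Express the target's position in the camera's own coordinate space:
    // +x is to the camera's right, +y is up.
    let local = pov.convertPosition(target.worldPosition, from: nil)

    // Negate y because UIKit screen coordinates grow downward.
    return atan2(CGFloat(-local.y), CGFloat(local.x))
}
```

The resulting angle can rotate a 2D arrow pinned to the screen edge, or drive a compass-style overlay pointing toward the obscured content.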

Interacting with content

Since current mobile AR involves touching a glass screen, users will always have some form of friction between what they want to do and what they can do. There is no guarantee that all users will instantly know how to interact with 3D space on mobile. The following are some patterns that help inform users and give them clues throughout the experience.

SNAPPING

Snapping refers to automatically aligning content to a reference or guide. It’s as if everything had a weak magnetic force that gets stronger the closer it is to another magnet.

If you push a couch against a wall in the real world, gravity and physics create constraints that prevent the furniture from floating or passing through the wall. We understand this inherently because it is how the physical world works. In AR the rules change a bit, since you have to design these constraints and this logic yourself. The following are some examples, with a minimal snapping sketch after the list:

A) Snap to object (e.g., a couch fits evenly next to the edge of a table).
B) Snap to environment (e.g., a couch sits evenly on top of a rug).
C) Snap to guide (e.g., a table is centered in the middle of a rug).
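A minimal snapping rule can be as simple as a distance check against candidate guide points, as in this Swift sketch; the snap radius and the guide positions are illustrative assumptions.

```swift
import simd

// If the dragged object is within `snapRadius` of a guide point
// (e.g., the center of a rug), jump it onto that point; otherwise leave it alone.
// `snapRadius` and the guide positions are illustrative assumptions.
func snappedPosition(for dragged: simd_float3,
                     guides: [simd_float3],
                     snapRadius: Float = 0.08) -> simd_float3 {
    let nearest = guides.min { simd_distance($0, dragged) < simd_distance($1, dragged) }
    if let guide = nearest, simd_distance(guide, dragged) < snapRadius {
        return guide      // close enough: snap onto the guide
    }
    return dragged        // free placement everywhere else
}
```

Easing the object toward the guide over a few frames, rather than teleporting it, keeps the “magnetic” feel described above.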

(right) Wayfair App

GUIDES

Consider visualizing shadows, cages, or guides to show an object’s potential placement and behavior; a rough cage sketch follows the list below.

  • Cage: A 2D or 3D boundary that references the volume, scale, and shape of something.
  • Shadow: Represents proximity, i.e., how close or far something is.
  • Guide: Lines or cues that represent alignment and proximity.
  • Gizmo: A visual indicator for movement, rotation, or scale; useful when a user can only perform a single action.
  • Parameter: A reference to the degree and value of a change.
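As an illustration of the cage idea, the sketch below wraps a SceneKit node in a wireframe box sized to its bounding volume; the helper name and color are assumptions, not a prescribed approach.

```swift
import SceneKit
import UIKit

// Adds a rough "cage" guide: a wireframe box sized to the node's bounding box,
// attached as a child so it follows the node around. Purely illustrative.
func addCage(to node: SCNNode) {
    let (minBound, maxBound) = node.boundingBox
    let box = SCNBox(width: CGFloat(maxBound.x - minBound.x),
                     height: CGFloat(maxBound.y - minBound.y),
                     length: CGFloat(maxBound.z - minBound.z),
                     chamferRadius: 0)
    box.firstMaterial?.fillMode = .lines              // render as lines instead of filled faces
    box.firstMaterial?.diffuse.contents = UIColor.cyan

    let cage = SCNNode(geometry: box)
    // Center the cage on the node's bounding volume.
    cage.position = SCNVector3((minBound.x + maxBound.x) / 2,
                               (minBound.y + maxBound.y) / 2,
                               (minBound.z + maxBound.z) / 2)
    node.addChildNode(cage)
}
```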

Stay tuned for Part 3, where I discuss style guides and UI elements.
Also, an extra special thanks to Devon Ko for her style and editing help.
