What Problem Does This Solve?
Relate and visualize nearby features to the user's current location by superimposing digital information on whatever the user is looking at through the phone's camera.
When to Use This Pattern
Augmented Reality is useful whenever features are directly related to the user's current location. It's easy to look through a camera and see where the nearest exit is, even if it is obstructed or hard to find because of bad lighting. Utilities and natural-resource companies can visualize assets and critical infrastructure, such as water pipes and electrical cables that are behind walls or buried beneath the street surface, and guide fieldworkers in locating these underground utilities.
AR is best used outside so that highlighted features and points of reference are visible. It’s more difficult to achieve a similar experience inside a building where walls block the view and the device location is less reliable.
Using AR in conjunction with other sensors like the camera allows law enforcement to look at a suspect's car through the viewport of the phone, scan the license plate, retrieve the driver's information, and display it right next to the car. Pattern-recognition software may analyze faces or read barcodes to find additional content.
People interested in real estate may look at a building and see the apartments available for rent, including their location, price, and contact information.
What’s the Solution?
Provide location-based labels, sometimes called billboards, with basic information about the feature or features in question. Keep the title short and descriptive, with very little superfluous wording. Relate the location of the object to the user's location by displaying the distance in feet or meters.
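The distance shown on a billboard can be derived from the user's and the feature's coordinates with a great-circle (haversine) approximation. The sketch below is illustrative only; the function name `billboard_distance` and the 6,371 km mean Earth radius are assumptions, not part of any particular SDK:

```python
import math

def billboard_distance(user, feature, metric=True):
    """Haversine distance between two (lat, lon) pairs in degrees,
    formatted as a short label in meters or feet."""
    lat1, lon1 = map(math.radians, user)
    lat2, lon2 = map(math.radians, feature)
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    meters = 6371000 * 2 * math.asin(math.sqrt(a))  # mean Earth radius
    if metric:
        return f"{meters:.0f} m"
    return f"{meters * 3.28084:.0f} ft"
```

In practice the rounding should also adapt to magnitude (e.g. "1.2 km" rather than "1234 m") so the label stays short.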
Avoid overlapping labels through prioritization and by using call-outs. Popups that are farther away from the current location may be displayed smaller to indicate their distance.
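Prioritization can be as simple as a greedy pass: place the most important label in each overlapping cluster and demote the rest to small call-out markers. This is a minimal sketch under assumed inputs (a `priority` field where lower means more important, and a caller-supplied `overlaps` test on screen rectangles):

```python
def place_labels(labels, overlaps):
    """Greedy label de-confliction: keep the highest-priority label in each
    overlapping cluster; demote the rest to call-outs.
    `labels` is a list of dicts with a 'priority' key (lower = more important);
    `overlaps(a, b)` reports whether two labels' screen footprints collide."""
    placed, callouts = [], []
    for label in sorted(labels, key=lambda l: l["priority"]):
        if any(overlaps(label, p) for p in placed):
            callouts.append(label)
        else:
            placed.append(label)
    return placed, callouts
```

A real AR view would re-run this every frame, since label positions shift as the camera moves.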
Allow users to interact with an object by selecting it to retrieve additional information like photos, contact information, or reviews in a detail view. Sometimes it is beneficial for users to switch the view to 2D to get the 'full picture', or to show a small portion of the 2D map side by side with the AR view. Common actions invoked from the popup or the detail view include showing routing directions, adding a note, using the object as a starting point for a workflow, or sharing the object with friends or co-workers.
Why Use This Pattern?
Using a map is sometimes not practical because it takes too long to orient oneself, or because points of reference are hard to identify or locate. This is especially true during time-sensitive tasks, or in emergency situations where objects are obscured by smoke, flood, snow, or darkness.
The sensors of the phone can detect more than just the current location: they also report the direction the camera is facing and the inclination of the device in the user's hand. Combining these variables with GIS data helps make decisions on-site using the most up-to-date and reliable information available.
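Combining position and compass heading is what lets the app decide which features should appear on screen at all: a feature is drawn only if the bearing from the user to it falls within the camera's horizontal field of view. The sketch below uses the standard initial-bearing formula; the function name and the 60° default field of view are assumptions for illustration:

```python
import math

def in_camera_view(user, feature, heading_deg, fov_deg=60.0):
    """Return True if `feature` lies within the camera's horizontal field
    of view, given the user's (lat, lon), the feature's (lat, lon), and
    the compass heading the device reports (degrees clockwise from north)."""
    lat1, lon1 = map(math.radians, user)
    lat2, lon2 = map(math.radians, feature)
    dlon = lon2 - lon1
    # Initial bearing from the user to the feature, 0..360 degrees.
    x = math.sin(dlon) * math.cos(lat2)
    y = (math.cos(lat1) * math.sin(lat2)
         - math.sin(lat1) * math.cos(lat2) * math.cos(dlon))
    bearing = math.degrees(math.atan2(x, y)) % 360
    # Smallest signed angle between the bearing and where the camera points.
    offset = (bearing - heading_deg + 180) % 360 - 180
    return abs(offset) <= fov_deg / 2
```

The same offset value can then drive the label's horizontal screen position within the viewport.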
Consider inevitable fluctuations in GPS location and data accuracy, which may result in imprecise representations of objects' locations that are not always as reliable as they could be. The same holds true for measurements calculated through AR, which are mostly approximations.
It is important to note that walking or driving while looking through the lens of an upheld phone is not only tiring but also dangerous. Therefore, the app should take precautions to remind or warn users to watch their surroundings, or otherwise help them avoid injury.