The augmented reality (AR) pattern helps users identify their location and nearby features by superimposing digital information on the live view of the phone's camera.
Studying a map takes time, and it’s often difficult to orient yourself. It’s hard to identify and locate points of reference in relation to your current location. This is especially true during time-sensitive tasks or in emergencies in which objects are obstructed by smoke, flood, snow, or nighttime darkness. In these situations, the phone’s sensors can detect more than just the current location: they also sense the direction the camera is facing and the inclination of the device in the user’s hand. Combining these variables with reliable GIS data reduces the user’s cognitive load and speeds up decision-making while the user is on-site.
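To make this concrete, the core calculation behind the pattern can be sketched as follows: given the device's location and compass heading, compute the bearing to a feature and check whether it falls inside the camera's horizontal field of view. This is a minimal illustration rather than any particular AR SDK; the function names and the simple spherical-Earth model are assumptions.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing (degrees, 0 = north) from the
    user's position (lat1, lon1) to a feature (lat2, lon2)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return (math.degrees(math.atan2(x, y)) + 360) % 360

def in_camera_view(user_lat, user_lon, heading, fov, feat_lat, feat_lon):
    """True if the feature lies within the camera's horizontal
    field of view (fov, degrees) around the device heading."""
    b = bearing_deg(user_lat, user_lon, feat_lat, feat_lon)
    diff = (b - heading + 180) % 360 - 180  # signed angle, -180..180
    return abs(diff) <= fov / 2
```

Only features for which `in_camera_view` is true would then be rendered as overlays on the camera image.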
AR is useful whenever features are directly related to the user’s current location. It’s easy to look through the lens of a camera and see where the nearest exit is, even if the exit is temporarily hard to find. Utility workers, for instance, can visualize obstructed assets and critical infrastructure, such as water pipes and electrical cables that are behind walls or buried underneath the street’s surface. AR helps guide on-site workers in locating these otherwise invisible underground utilities.
Using AR in conjunction with other sensors such as the camera allows law enforcement to look at a suspect’s car through the viewport of the phone, scan the license plate, retrieve the driver information, and display it right next to the car. People interested in real estate can look at a building and see the apartments available for rent, including their location, price, and contact information.
AR is best used outside so that highlighted features and points of reference are visible. It’s more difficult to achieve a similar experience inside a building where walls block the view, and the device location may be less reliable.
Provide location-based labels, sometimes called billboards, with basic information about the feature or features in question. Keep titles short and descriptive, without superfluous wording. Relate the object's location to the user's by displaying the distance in feet or meters. Avoid overlapping labels by prioritizing them, and use callouts. Callouts that are farther from the current location may be displayed smaller to indicate their distance from the viewer.
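A minimal sketch of how a billboard's distance readout and size might be derived, assuming plain latitude/longitude inputs and an illustrative near/far scaling range (the function names and thresholds are hypothetical, not from any SDK):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def distance_m(lat1, lon1, lat2, lon2):
    """Haversine distance in meters between the user and a feature."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def billboard_label(title, meters, metric=True):
    """Short title plus a distance readout in meters or feet."""
    if metric:
        return f"{title} · {round(meters)} m"
    return f"{title} · {round(meters * 3.28084)} ft"

def billboard_scale(meters, near=10.0, far=500.0):
    """Scale factor from 1.0 (at or closer than `near`) down to 0.4
    (at or beyond `far`), so distant callouts render smaller."""
    t = min(max((meters - near) / (far - near), 0.0), 1.0)
    return 1.0 - 0.6 * t
```

In a real app, the distance would also feed the prioritization that decides which overlapping labels to suppress.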
Allow users to interact with an object by selecting it to open an info pop-up or info panel. The pop-up can show additional information such as photos, contact information, or reviews. Sometimes it is beneficial to let users switch from the AR view to a map to get the full picture, or to show a small portion of the map side by side with the AR view. Common actions available in the pop-up are getting route directions, adding a note, using the object as a starting point for a workflow, and sharing the object with friends or coworkers.
Consider inevitable fluctuations in GPS location and data accuracy, which may result in imprecise representations of the object’s location. The same applies to measurements calculated through AR that are mostly only approximations.
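One simple way to damp such fluctuations is to average the last few GPS fixes before positioning overlays; production apps often weight fixes by their reported horizontal accuracy or use a Kalman filter instead. A hypothetical sketch:

```python
from collections import deque

class LocationSmoother:
    """Damps GPS jitter with a sliding-window average of recent fixes.
    Illustrative only: a real app might weight each fix by its
    reported accuracy rather than averaging them equally."""

    def __init__(self, window=5):
        self.fixes = deque(maxlen=window)  # keeps only the last `window` fixes

    def update(self, lat, lon):
        """Record a new fix and return the smoothed (lat, lon)."""
        self.fixes.append((lat, lon))
        n = len(self.fixes)
        return (sum(f[0] for f in self.fixes) / n,
                sum(f[1] for f in self.fixes) / n)
```

Even with smoothing, rendered positions and AR-derived measurements should be treated as approximations, as noted above.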
Walking or driving while looking through the lens of an upheld phone is not only tiring but also dangerous. The app should therefore take precautions to remind or warn users to watch their surroundings and prevent injuries.
AuGeo is an Esri Labs initiative to use ArcGIS data in an AR environment. While using the app, users hold up their phone to see data from an ArcGIS point feature layer superimposed on their screen. The example below shows how the app can help you identify where the ski lifts are located. It’s easier to look through the lens of your readily available phone than to study a paper map and figure out where the lifts are in relation to your current location.