Minimize abstractions

Design tools with volume and affordances

Now that we understand how to arrange objects in space, how should we design the objects themselves? Replace abstract representations (like flat icons) with volumetric tools featuring physical characteristics that suggest their use without explicit instruction, leveraging the Neuroscience of Affordance.

UI design suggestions

1. Design with affordances in mind
Design tools with “affordances”: physical characteristics that suggest how the tool should be grasped or used. For example, a holographic eraser might have grooves to invite grasping it on one side, and a flat surface for erasing on the other.

2. Build on the user’s prior knowledge of a tool instead of defining your own
The design of a tool or object should reflect the user’s contextual understanding of its behavior in the real world (or its closest real-world counterpart). This reduces the effort required by the user to understand how it functions in AR. For example, don’t reinvent the paintbrush in some radically new way—instead, base its design on the brushes your users are likely to have seen and used in the real world.

3. Use tools only if direct hand manipulation is insufficient or biomechanically challenging
For the reasons described in Principle 5 (“Touch to See”), direct hand interaction is preferable for most simple tasks, such as moving, rotating, or scaling an object. However, if the task requires holding the hand raised in mid-air for extended periods, such as sculpting, or demands more than basic manipulation, such as slicing or chiseling, then tools, designed with universally understood priors2, are recommended.

4. Avoid the use of buttons in AR as much as possible
The functionality of buttons is often better achieved in AR using alternate methods of design. For instance, use volumetric tools instead—designed with forms that suggest their function—to ensure an intuitive experience (see the next principle for a more detailed explanation of how to construct such tools). Furthermore, momentary points of decision or confirmation, such as dialog boxes, OK/cancel buttons, and so on, can instead take place through experiences in which the user naturally conveys their intent through their actions. For instance, rather than presenting a “send” button when the user writes an email, provide a mailbox that the user can drop it into.
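As a minimal sketch of this intent-through-action pattern, the mailbox example above can be reduced to a drop-volume check: the position where the user releases the email object, rather than a button press, determines what happens next. The types, names, and coordinates here are illustrative assumptions, not part of any real AR SDK.

```python
Vec3 = tuple[float, float, float]

def inside_box(point: Vec3, box_min: Vec3, box_max: Vec3) -> bool:
    """Axis-aligned bounds check for the mailbox's drop volume."""
    return all(lo <= p <= hi for p, lo, hi in zip(point, box_min, box_max))

def on_object_released(position: Vec3,
                       mailbox_min: Vec3,
                       mailbox_max: Vec3) -> str:
    """Interpret the drop location as the user's intent: no OK/cancel
    dialog is shown, because the action itself conveys confirmation."""
    if inside_box(position, mailbox_min, mailbox_max):
        return "send"       # dropped into the mailbox -> send the email
    return "keep_draft"     # released elsewhere -> nothing is sent
```

The same pattern generalizes to other momentary decisions: replace the confirmation widget with a spatial action whose outcome the user already understands from the physical world.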

5. Compensate for lack of haptics with other cues
For instance, consider synchronized audio cues when the user’s hand interacts with tools and content, or build hybrid tools that mix digital interfaces with real world objects (e.g., a UI that is placed against real world objects), or use physical markers that permit holographic drawing in space. These kinds of peripherals are considered “natural” since they leverage priors from existing physical tools such as paint brushes.
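One way to sketch the synchronized-audio idea: map each hand-tool contact to an audio cue whose volume scales with contact speed, giving the user a rough stand-in for impact force. The event structure and field names are assumptions for illustration only; a real system would feed this from its hand-tracking pipeline and route the result to an audio engine.

```python
from dataclasses import dataclass

@dataclass
class ContactEvent:
    """Hypothetical hand-tool contact report (illustrative, not a real SDK)."""
    tool_id: str
    speed: float  # hand speed at the moment of contact, in m/s

def select_audio_cue(event: ContactEvent) -> tuple[str, float]:
    """Choose an audio cue and volume for a contact event.

    Louder cues for faster contacts partially compensate for the
    missing haptic feedback. Speeds of 2 m/s and above are clamped
    to full volume (the threshold is an arbitrary assumption).
    """
    volume = min(1.0, event.speed / 2.0)
    cue = f"{event.tool_id}_contact"  # e.g. a per-tool sound asset name
    return cue, volume
```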

…Before you object! Holographic Tool Making

Ideal holographic tool making requires ideal hand tracking and pose estimation, which is not yet available as of 2017. For example, the paintbrush in the visual example above requires an understanding of the full pose of the brush in space. We recommend identifying the strengths and limitations of the tracking technology and working within them. For example, the location of the palm and the ability to detect a squeeze are reliable, so a paintbrush tool can be “attached” to the palm and drawn between the palm and the index finger. Painting might then occur when the user pinches.
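A minimal sketch of this fallback strategy, assuming a hypothetical tracker that reliably reports palm position, index-tip position, and a pinch flag (none of these names come from a real SDK):

```python
from dataclasses import dataclass

Vec3 = tuple[float, float, float]

@dataclass
class HandFrame:
    """One frame of (assumed) reliable hand-tracking data."""
    palm_position: Vec3
    index_tip_position: Vec3
    is_pinching: bool

def brush_update(frame: HandFrame, stroke: list[Vec3]) -> list[Vec3]:
    """Attach the brush between palm and index tip; paint while pinching.

    The brush tip is rendered at the midpoint of the palm-to-index
    segment, so only coarse, reliable features drive the tool.
    """
    tip = tuple((p + i) / 2 for p, i in zip(frame.palm_position,
                                            frame.index_tip_position))
    if frame.is_pinching:
        stroke.append(tip)  # lay down paint only while the user pinches
    return stroke
```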


Other examples of reliable hand features include the “hand-openness” metric, which provides a real-time measure of how open or closed the hand is. Using this metric, the brush can produce strokes of color whose thickness varies with how tightly the user squeezes. Painting ceases when the user’s hand is completely open.
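The squeeze-to-thickness mapping described above can be sketched as a simple function of the hand-openness metric; the thickness bounds (in meters) are illustrative assumptions:

```python
def stroke_thickness(openness: float,
                     min_thickness: float = 0.002,
                     max_thickness: float = 0.02) -> float:
    """Map hand-openness (0 = fist, 1 = fully open) to stroke thickness.

    A tighter squeeze produces a thicker stroke; a fully open hand
    stops painting entirely (thickness 0).
    """
    if openness >= 1.0:
        return 0.0  # painting ceases when the hand is completely open
    squeeze = 1.0 - openness
    return min_thickness + squeeze * (max_thickness - min_thickness)
```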

The neuroscience behind it

By drawing on real-world experience, holographic interfaces designed
to look and function like physical objects can be understood by a user immediately. Meta advises leveraging such instincts wherever possible instead of relying on abstractions that must be deciphered or explained—creating a true “no learning curve” interface.

Objects and tools designed with affordances—physical characteristics that suggest their use and function, such as the handle on an axe—are understood more quickly and with greater depth than flat icons that must first be decoded as abstract symbols. The brain’s action planning systems then automatically formulate a sequence of movements to grasp and use the tool based on these cues.

Affordances play a major role in the design of an ideal Spatial Interface, as they directly leverage the user’s priors to reduce the learning curve. An understanding of the user’s likely priors is therefore critical to putting this principle to use effectively.

Further Study

The mechanism for recognizing affordances and reacting appropriately lies in Area F5 of the premotor cortex, where a set of neurons known as “canonical” neurons “match the shape and size of an observed object with prehension (grasping) or other actions” (Rizzolatti and Luppino, 2001).

Paper: Rizzolatti G., Luppino G. (2001) The cortical motor system. Neuron. 31(6):889-901.

2 So universal are tools that our genus, Homo, was in part defined by their use. Homo habilis began using a simple stone tool, known as a “Mode 1” tool, in the African savanna over 2.5 million years ago. This crude stone tool, with its chipped edge, gave our ancestors greater control over their environment. We have co-evolved with such stone tools for so long that it is no surprise the human nervous system is well specialized for their use. In contrast, we have spent only a few decades with the abstract menus and icons of the WIMP (Windows, Icons, Menus, Pointers) era. Some readers raise the point that more modern interfaces may be superior. However, two things should be mentioned in this context: first, until AR/VR computers, humans were unable to render digital volumetric tools in space, so such tools are technically more modern than buttons. Second, from the standpoint of Bayesian neuroscience, the longer a particular behavior has been practiced, the more neurocircuitry is specialized for it, and the less efficient it is to deviate from those priors when accomplishing a given task. If the task requires spinning an object, then in a very literal sense: why reinvent the wheel?