FOCUS | HUMAN-ROBOT INTERACTION

The uncanny valley of haptics


Science Robotics  18 Apr 2018:
Vol. 3, Issue 17, eaar7010
DOI: 10.1126/scirobotics.aar7010

Abstract

During teleoperation and virtual reality experiences, enhanced haptic feedback incongruent with other sensory cues can reduce subjective realism, producing an uncanny valley of haptics.

In the field of humanoid robotics, most people are familiar with the notion of an “uncanny valley” (1): the phenomenon whereby increasing the realism of a robot—its human-like appearance or movements—yields feelings of unease, or even revulsion, in people as its representation becomes more and more (but never quite fully) human-like.

We took this notion one step further by examining whether an uncanny valley also exists for human perception of forces (i.e., tactile sensations) that might be rendered during human-robot interaction, teleoperation, or other manipulation tasks in virtual environments (2). That is, do enhancements of the “actual” forces applied by robots (or other devices) necessarily lead to an improved subjective experience by the human operator?

We argue that the answer is no: The subjective perception of haptic sensations by a human operator critically depends on the fusion of haptic and visual stimuli as a unitary percept in the human brain (3). If the fidelity of the haptic sensation increases but is not rendered in concordance with other sensory feedback (such as visual and auditory cues), the subjective impression of realism actually gets worse, not better. We refer to this degradation as the uncanny valley of haptics (Fig. 1A).

Fig. 1 Uncanny valley of haptics.

(A) The theoretical uncanny valley of haptics, drawn by analogy with the classic humanoid-robot uncanny valley (1). (B) Empirical data from our experiments; the subjective experience corresponds to the Presence Questionnaire score, and error bars represent SEM. (C) A diagram of the stimulation paradigm for producing the illusion of spatialized haptic feedback via funneling. In the generic haptic stimulation condition, vibrations of the same amplitude were delivered to both controllers on every trial, so no funneling occurred. In the spatialized and visual + spatialized conditions, the funneling effect was achieved by varying the vibrotactile amplitude delivered at each controller, producing a change in the perceived haptic location. (D) Inside the VR headset, the participant sees a (virtual) wooden dowel that bridges their hands (as sensed by the position and orientation of the controllers). In the passive and causal experiments, the participant held the dowel in a specific “activation area” to receive the haptic stimuli (represented by a “cloud” that looked like a smoky cylinder). During the visual + spatialized stimulation, participants saw a white marble cue that visually reinforced the location of the haptic feedback.

To demonstrate this effect and its implications, we used a virtual reality (VR) system as an experimental test bed, with haptic sensations delivered via a handheld controller in each hand. We elicited a phantom touch illusion using a technique known as funneling, which provides the user with synchronous vibrotactile stimuli of different amplitudes from controllers that are physically (or, in our case, virtually) linked (Fig. 1C). When human participants hold a controller in each hand with vibrotactile haptics rendered in this manner (Fig. 1D), they experience the haptic sensation as localized in space (“spatialized”). Paradoxically, it “feels like” it originates in the empty space between the two hands (4). What is happening is that, upon the arrival of two near-synchronous tactile cues, the human brain integrates the stimuli: It assumes that the two stimuli have a common source, not just in time but also in space (5).
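
To make the funneling paradigm concrete, the sketch below computes per-controller vibration amplitudes for a desired phantom location along the virtual dowel. This is a minimal illustration assuming a simple linear panning law; the paper does not specify the exact amplitude mapping used in the experiments, and the function name is ours.

```python
# Illustrative sketch of amplitude "funneling" between two handheld controllers.
# Assumption: a linear panning law; the study's actual amplitude mapping may differ.

def funneling_amplitudes(target, max_amplitude=1.0):
    """Return (left, right) vibration amplitudes for a desired perceived location.

    `target` is the normalized position of the phantom sensation along the
    virtual dowel: 0.0 = left hand, 1.0 = right hand, 0.5 = midpoint.
    Delivering both pulses near-synchronously with these amplitudes lets the
    brain "funnel" them into a single touch at the target location.
    """
    t = min(max(target, 0.0), 1.0)       # clamp to the dowel
    left = max_amplitude * (1.0 - t)     # stronger left vibration pulls the percept left
    right = max_amplitude * t            # stronger right vibration pulls it right
    return left, right


if __name__ == "__main__":
    for t in (0.0, 0.25, 0.5, 0.75, 1.0):
        l, r = funneling_amplitudes(t)
        print(f"target={t:.2f}  left={l:.2f}  right={r:.2f}")
```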

Note that this experimental setup serves as an ecologically valid proxy for a variety of teleoperation tasks, carefully designed to probe the potential influence of haptic stimuli with high sensitivity. This is important because augmenting such tasks with higher-fidelity haptic sensations may come with the (often unstated) assumption that such “improvements” will always yield more realistic and immersive virtual environments. Of course, realism and immersion are subjective perceptions (6), but we can formally assess and quantify them using scientifically established presence questionnaires (7).
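
As a rough illustration of how such a questionnaire yields a quantitative score, the sketch below averages Likert-scale responses into a single presence value per participant and condition. The item count, scale range, reverse-scoring, and example numbers here are hypothetical; the actual instrument and its factor loadings are described in the Supplementary Materials (Table S1).

```python
# Hypothetical scoring of a presence questionnaire from Likert responses.
from statistics import mean

def presence_score(responses, scale_max=7, reverse_items=frozenset()):
    """Average 1..scale_max Likert responses into one presence score,
    flipping any reverse-worded items first."""
    adjusted = [
        (scale_max + 1 - r) if i in reverse_items else r
        for i, r in enumerate(responses)
    ]
    return mean(adjusted)

# Example with made-up responses from one participant under two conditions.
print(presence_score([6, 5, 7, 6, 5, 6]))   # congruent haptics
print(presence_score([4, 3, 5, 4, 3, 4]))   # incongruent haptics
```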

We ran several experiments (see the Supplementary Materials) to better understand the dynamics of haptic perception and how to elicit the aforementioned uncanny valley of haptics—and perhaps more importantly, how to avoid it. These experiments contrasted passive haptic stimulation (i.e., the participant passively receives a haptic stimulus without moving their arms) with dynamic haptic stimulation (triggered by the movements of the participant). Research on humanoid robotics has shown that the feelings of unease (or even revulsion) associated with the classic notion of an uncanny valley can be shifted or eliminated (1) by manipulating various aspects of the simulations. For example, cartoonish features can reduce the mismatch between the human-likeness of a robot and its perceived realism (8). To see whether a participant’s top-down expectations influenced the results, we also probed causal haptic stimulation, a condition in which users could plausibly attribute the haptic sensation to an external cause. This took the form of an animated cloud that partially obscured the view of the funneling effect’s location, thereby “explaining away” any discrepancy in haptic sensations.

Our results show that participants could localize the vibrotactile stimuli at different locations (4), establishing the spatial haptic effects. However, the subjective experience—the overall sense of immersion—dipped when the increasing realism of the haptics outstripped the complementary cues from the other senses in the simulation (Fig. 1B). These findings therefore support the existence of an uncanny valley of haptics.

Our results also demonstrate techniques to reduce and recover from the uncanny valley of haptics. For example, in the dynamic haptic stimulation, asking the participants to perform a motor action was sufficient to provide a “reason” for the haptic sensation, bringing the subjective experience back into agreement with the perceived realism. In addition, in our probe of causal haptic stimulation, providing an animated feature (a moving cloud) that could plausibly “cause” the mismatch between senses was sufficient to preserve the subjective experience.

An uncanny valley of haptics means that designers of human-robot interactions cannot simply assume that more (or more realistic) haptics is better. As experiences move beyond purely visual displays and integrate richer feedback from multiple senses, including haptic and auditory sensations, mismatches become possible and may undermine “improvements” to haptic rendering.

Sensory incongruences produce conflicting percepts across multiple sensory channels. When the human brain subconsciously integrates these conflicting cues into a unified percept (3, 9), the result may be a degraded subjective experience (i.e., a decreased sense of immersion). Our finding of an uncanny valley effect for haptics calls for a shift in focus in the design of human-robot interactions from precision to context and suggests the need for a multimodal approach to haptic feedback—a holistic approach that incorporates multiple human sensory channels into the design, rendering, and evaluation of haptic sensations in the user experience.
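
One standard way to think about this integration is reliability-weighted cue combination, the model often used to describe visual-haptic fusion (cf. ref. 3). The sketch below is purely illustrative of that textbook model and is not the analysis performed in the paper: each cue's estimate of a quantity (e.g., contact location) is weighted by its inverse variance, so even discrepant cues get merged into a single percept.

```python
# Sketch of reliability-weighted (maximum-likelihood) cue combination.
# Illustrative only; values and variances below are made up.

def fuse_cues(visual, var_visual, haptic, var_haptic):
    """Combine two noisy estimates of the same quantity into one percept.

    Each cue is weighted by its reliability (inverse variance); the fused
    estimate has lower variance than either cue alone."""
    w_v = 1.0 / var_visual
    w_h = 1.0 / var_haptic
    fused = (w_v * visual + w_h * haptic) / (w_v + w_h)
    fused_var = 1.0 / (w_v + w_h)
    return fused, fused_var

# A large visual-haptic discrepancy is still merged into one percept,
# which is where the subjective sense of realism can break down.
print(fuse_cues(visual=0.50, var_visual=0.01, haptic=0.80, var_haptic=0.04))
```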

Although demonstrated in a VR test bed, these effects are rooted in human perception and as such could affect the perceived realism and immersion of many real-world applications, such as teleoperation scenarios, remote robotic manipulation, or even telesurgical tasks. Our study offers insights, methods, and results that may aid future efforts to render haptic effects that improve (rather than detract from) the overall user experience.

SUPPLEMENTARY MATERIALS

robotics.sciencemag.org/cgi/content/full/3/17/eaar7010/DC1

Materials and Methods

Results

Fig. S1. Reported spatial haptic perception.

Table S1. Questionnaire and factor loadings.

Table S2. Main experiment (passive) results.

Table S3. Summary of learnings and recommendations from the uncanny valley of haptics.

Movie S1. The uncanny valley of haptics.

Data file S1. Anonymized questionnaire responses for all experiments and conditions.

References (10–15)

REFERENCES AND NOTES

Funding: This study was funded by Microsoft Research. C.C.B. was supported by a grant from the Swedish Research Council (2017-00276). Author contributions: M.G.-F. and C.C.B. defined the studies, implemented the simulations, and analyzed the data. C.C.B. ran the user studies. E.O., K.H., M.G.-F., and C.C.B. wrote the paper and established the uncanny valley of haptics theory. Competing interests: The authors report their affiliation to Microsoft, an entity with a financial interest in the subject matter or materials discussed in this manuscript. However, the authors have conducted the studies following scientific research standards and declare that the current manuscript presents balanced and unbiased studies and theories. Data and materials availability: The anonymized data can be obtained from the Supplementary Materials. The principal component analysis was performed using https://notebooks.azure.com/margon/libraries/QuestionnairePCA.
