Research Article | HUMAN-ROBOT INTERACTION

Improving social skills in children with ASD using a long-term, in-home social robot


Science Robotics  22 Aug 2018:
Vol. 3, Issue 21, eaat7544
DOI: 10.1126/scirobotics.aat7544
  • Fig. 1 Robot-assisted intervention system.

    Our system consists of a social robot, a touchscreen monitor, and two RGB cameras. The system supports triadic interactions between the robot, the child, and the caregiver. Software running on the perception computer uses an elevated camera to track both the child’s and the caregiver’s attentional foci, whereas the other camera records the intervention session (Fig. 2). The main computer controls the flow of the intervention and the robot’s behavior to ensure presentation of a coherent, meaningful intervention.
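
    For concreteness, the sketch below shows one way a main-computer control loop of this kind might consume attention estimates from the perception computer and adjust the intervention flow. The class, method, and object names (AttentionEstimate, prompt_reengagement, advance_game_step, and so on) are illustrative assumptions, not the deployed system's API.

```python
# A minimal sketch of a main-computer control loop, assuming a hypothetical
# perception interface; names do not come from the deployed system.
from dataclasses import dataclass
import queue


@dataclass
class AttentionEstimate:
    """One frame of attention output from the perception computer."""
    child_target: str      # e.g., "robot", "screen", "caregiver", "away"
    caregiver_target: str


def run_intervention(attention_queue: "queue.Queue[AttentionEstimate]",
                     robot, screen) -> None:
    """Drive one session: present content and react to the child's attention."""
    while screen.has_remaining_content():
        estimate = attention_queue.get()      # latest perception output
        if estimate.child_target == "away":
            robot.prompt_reengagement()       # e.g., call the child's name
        else:
            screen.advance_game_step()        # continue the current game
            robot.comment_on(screen.current_step())
```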

  • Fig. 2 A typical interaction between the robot, the child, and the caregiver during our deployment.

    Our robot system was designed to engage and facilitate interactions between the child and the caregiver, thereby providing opportunities for the child to practice social skills in a fun, natural way.

  • Fig. 3 Robot-initiated joint attention.

    The robot models appropriate social gaze behavior by demonstrating context-contingent gaze and facilitates mutual gaze and experience sharing between the child and the caregiver. When the child is engaged with the robot (A), the robot directs the child’s attention to relevant task content on the screen (B). Once the child’s attention shifts to the robot-directed focus on the screen, the robot shifts its own gaze to the caregiver (C), in the hope of redirecting the child’s visual attention to the caregiver as well (D). (These demonstration images were recreated in the laboratory to show both robot and child behavior because this perspective was not recorded by the deployed system.)
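
    The gaze-redirection sequence in Fig. 3 can be read as a small prompting routine. The sketch below is one way to express it, assuming a hypothetical robot gaze API and a polling attention tracker; the function names, targets, and timeout are assumptions rather than the paper's implementation.

```python
# A minimal sketch of the Fig. 3 joint attention prompt, under assumed
# robot/perception interfaces (gaze_at, child_attention_target).
import time


def prompt_joint_attention(robot, perception, timeout_s: float = 5.0) -> bool:
    """Try to redirect the child's attention: robot -> screen -> caregiver."""
    robot.gaze_at("screen")                     # (B) cue task content on screen
    if not wait_for(perception, child_on="screen", timeout_s=timeout_s):
        return False
    robot.gaze_at("caregiver")                  # (C) model the gaze shift
    return wait_for(perception, child_on="caregiver", timeout_s=timeout_s)  # (D)


def wait_for(perception, child_on: str, timeout_s: float) -> bool:
    """Poll the attention tracker until the child looks at the target."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if perception.child_attention_target() == child_on:
            return True
        time.sleep(0.1)
    return False
```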

  • Fig. 4 Screenshots of social skills games.

    A set of interactive games was developed to allow children with ASD to practice social skills through play. The games were designed to support interactions between the caregiver and the child and between the robot and the child. The games targeted three social skills: social and emotional understanding (A, Story), perspective-taking (B, Rocket), and ordering and sequencing (C, Train).

  • Fig. 5 Proportion of maximum level achieved as a function of game session.

    Curves were modeled in a binomial generalized linear mixed model with session and game as fixed and random effects. The 95% confidence intervals are shown. Children advanced a level in a game when they answered more than 75% of its questions correctly, regressed a level when they answered fewer than 25% correctly, and remained at the same level when they answered between 25 and 75% correctly.
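
    The leveling rule described for Fig. 5 maps directly onto a small function. The sketch below encodes the 75%/25% thresholds stated in the caption; the function name and the clamping at the minimum and maximum levels are assumptions.

```python
# A minimal sketch of the Fig. 5 leveling rule; thresholds come from the
# caption, while level bounds and naming are assumptions.
def next_level(current_level: int, correct: int, total: int,
               max_level: int) -> int:
    """Advance above 75% correct, regress below 25%, otherwise stay put."""
    proportion = correct / total
    if proportion > 0.75:
        return min(current_level + 1, max_level)
    if proportion < 0.25:
        return max(current_level - 1, 1)
    return current_level


# Example: 8 of 10 correct at level 3 of a 5-level game advances to level 4.
assert next_level(3, 8, 10, max_level=5) == 4
```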

  • Fig. 6 Result of joint attention assessment.

    Probe scores for the child at four time points: 30 days before the robot intervention started, on the first day of the robot intervention, on the last day of the robot intervention, and 30 days after the end of the robot intervention. Joint attention scores increased significantly from before the robot intervention to after it. n.s., not significant; *P < 0.05. Error bars indicate SE.

  • Fig. 7 Result of caregiver survey.

    Caregivers reported increased eye contact, increased initiation of communication, and increased response to communication bids with them (A) and with other people (B). Ratings from the last day of the robot intervention (T2) were compared with those from its first day (T1). These results indicate that, over the 30-day period, caregivers observed improved communication abilities in the children beyond our robot-assisted intervention sessions. Error bars indicate SE.

  • Fig. 8 Diagram of software components.

    Our software system consists of several components responsible for tracking the participants’ attention, controlling the robot’s behavior, and presenting the intervention. Operating together within the ROS framework, these components create rich, engaging interactions for our robot-assisted autism therapy.
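
    As an illustration of how one such component could look within ROS, the sketch below shows a minimal ROS 1 (rospy) node that publishes the child's estimated attention target. The topic name, message type, publishing rate, and placeholder estimator are assumptions, not the system's actual interfaces.

```python
# A minimal sketch of an attention-tracking node (Fig. 8) in ROS 1 / rospy,
# under assumed topic and message conventions.
import rospy
from std_msgs.msg import String


def estimate_attention_target() -> str:
    """Placeholder for the camera-based gaze/attention estimate."""
    return "screen"


def attention_tracker_node():
    rospy.init_node("attention_tracker")
    pub = rospy.Publisher("/child/attention_target", String, queue_size=10)
    rate = rospy.Rate(10)  # publish the latest estimate at 10 Hz
    while not rospy.is_shutdown():
        pub.publish(String(data=estimate_attention_target()))
        rate.sleep()


if __name__ == "__main__":
    attention_tracker_node()
```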

Supplementary Materials

  • robotics.sciencemag.org/cgi/content/full/3/21/eaat7544/DC1

    Movie S1 (.mp4 format). The robot leads the child and the caregiver into an interactive barrier game in which the child builds a rocket and then explains the rocket to the caregiver.

    Movie S2 (.mp4 format). The robot tells a story and asks the child how the main character is feeling at a certain point in the story.

    Data file S3 (.csv format). Gameplay data set.

    Data file S4 (Microsoft Excel format). Joint attention data set.

    Data file S5 (.csv format). Caregiver survey data set.
