Research Article: Collective Behavior

Implicit coordination for 3D underwater collective behaviors in a fish-inspired robot swarm


Science Robotics, 13 Jan 2021:
Vol. 6, Issue 50, eabd8668
DOI: 10.1126/scirobotics.abd8668
  • Fig. 1 Blueswarm platform.

    Bluebot combines autonomous 3D multifin locomotion with 3D visual perception. (A) Two cameras cover a near-omnidirectional field of view (FOV). One caudal and two pectoral fins enable nearly independent forward and turning motions; a dorsal fin drives vertical diving for depth control. (B to D) Seven Bluebots with streamlined, fish-inspired bodies are used in Blueswarm experiments. (E and F) The fins are powered by a custom electromagnetic actuator (see Materials and Methods). (G) Information on neighboring robots extracted from images enables local decision-making. (H) Fast onboard image processing is achieved by configuring the cameras so that only the two posterior LEDs (and potential surface reflections) of neighboring robots appear in images. For illustration, pairs of LEDs belonging to the same robot are color coded and outlined in white; a pair of the same color without the outline marks the corresponding surface reflection. (I) Bluebots’ relative positions and distances are derived from pairs of LEDs assigned to individual robots (color-coded vectors), enabling self-organized behaviors such as visual synchronization, potential-based dispersion/aggregation, dynamic circle formation, and collective search.

    Photo credit (C): iStock
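
The range-and-bearing idea behind panels (H) and (I) can be sketched in a few lines: a vertical LED baseline of known length subtends fewer pixels the farther away a neighbor is. The Python sketch below assumes a pinhole camera and made-up constants (LED_BASELINE_M, FOCAL_PX, IMAGE_CENTER_U are all hypothetical); the actual system uses fisheye optics and the calibration described in Materials and Methods and section S1.

```python
import numpy as np

# Hypothetical constants for illustration; real values come from the
# calibration in Materials and Methods and section S1.
LED_BASELINE_M = 0.086  # assumed vertical spacing of the two posterior LEDs
FOCAL_PX = 180.0        # assumed focal length in pixels (pinhole stand-in)
IMAGE_CENTER_U = 128.0  # assumed principal-point column for a 256-px-wide image

def neighbor_range_and_bearing(led_top_px, led_bottom_px):
    """Estimate distance and bearing to one neighbor from its LED blob pair.

    Inputs are (u, v) pixel coordinates of the two LEDs, already matched
    to the same robot with surface reflections filtered out (Fig. 1, H
    and I). A pinhole camera is assumed for brevity; the Bluebot's
    near-omnidirectional optics require a fisheye model (section S1).
    """
    top = np.asarray(led_top_px, dtype=float)
    bottom = np.asarray(led_bottom_px, dtype=float)
    # A baseline of known length subtends fewer pixels the farther it is.
    sep_px = np.linalg.norm(top - bottom)
    distance_m = LED_BASELINE_M * FOCAL_PX / sep_px
    # Horizontal bearing from the image column of the pair's midpoint.
    mid_u = 0.5 * (top[0] + bottom[0])
    bearing_rad = np.arctan2(mid_u - IMAGE_CENTER_U, FOCAL_PX)
    return distance_m, bearing_rad
```
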
  • Fig. 2 Blueswarm’s distinctive features in comparison with other robot collectives.

    Blueswarm is a 3D underwater collective that uses only local, implicit, vision-based coordination to self-organize. Because it relies on no external assistance, Blueswarm is more autonomous than most aerial swarms. It advances fundamental research on decentralized, self-organized robot collectives from 2D to 3D space.

  • Fig. 3 Self-organization across time.

    (A) Fireflies flashing in unison. (B) Mirollo-Strogatz synchronization model: Firing agents pull observers closer to their own firing times, and the pull-up magnitude increases monotonically with the time an observer has already spent in its current firing cycle. Left: y fires and x is pulled up; right: x fires and y is pulled up; as a result, their phase difference (red) is reduced. After multiple such rounds, x and y fire in unison. (C) Seven Bluebots observed the LED flashes of neighboring robots and adjusted their flash cycles to achieve synchrony (solid) after three initial rounds of desynchronized flashing (dashed). The SD σ in flash times among robots vanished after four and seven rounds of synchronization for uniformly (blue) and randomly (red) distributed initializations, respectively. (D) Robots with randomly initialized flash times synchronized more slowly because they partitioned into two competing subgroups (rounds 5 to 7). (E) Stills from the randomly initialized experiment show uncoordinated (top, round 1) and synchronized (bottom, round 10) flashing.

    Photo credit (A): iStock
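
A minimal simulation conveys the pull-up mechanism of panel (B). This Python sketch is not the authors' controller: the coupling strength EPSILON and the linear pull-up EPSILON * phase are assumptions chosen only to show that the spread of flash times contracts over rounds, as in panels (C) and (D).

```python
import random

N_ROBOTS = 7
EPSILON = 0.15  # assumed coupling strength; not the authors' value
DT = 0.01       # time step, as a fraction of one flash cycle

def simulate_flash_sync(n_cycles=10, seed=1):
    """Minimal pulse-coupled oscillator sketch of the Fig. 3B mechanism.

    Each robot's phase ramps from 0 to 1 and the robot flashes at 1.
    Observers of a flash jump forward by EPSILON * phase, so the pull-up
    grows monotonically with the time already spent in the current cycle;
    an observer pulled to the threshold flashes on the next step.
    """
    random.seed(seed)
    phases = [random.random() for _ in range(N_ROBOTS)]
    t, flashes = 0.0, []
    while t < n_cycles:
        t += DT
        phases = [p + DT for p in phases]
        fired = {i for i, p in enumerate(phases) if p >= 1.0}
        if fired:
            flashes.extend((round(t, 2), i) for i in sorted(fired))
            phases = [0.0 if i in fired else min(1.0, p + EPSILON * p)
                      for i, p in enumerate(phases)]
    return flashes  # (time, robot) pairs; per-cycle spread shrinks to zero
```
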
  • Fig. 4 Self-organization across space.

    (A) A shoal of surgeonfish foraging in a reef. (B) During dispersion/aggregation, a Bluebot (black) calculates its next move (black vector) as the weighted average of all attractive (blue) and repulsive (red) forces from neighboring robots. (C) Interrobot forces (red) are calculated as the first derivative of the corresponding Lennard-Jones potential (blue) with standard parameters a = 12 and b = 6 and a tunable target distance dt (here equal to 2 BL). The force f depends on the distance d between robots: f = 0 for d = dt, f < 0 for d < dt (repulsive), and f > 0 for d > dt (attractive). The target distance dt defines the robot density of the collective. (D to F) 3D dispersion (blue markers, dt = 2 BL) and aggregation (red markers, dt = 0.75 BL) with seven Bluebots. Robot density ρ changed 80-fold; ρ = 455 m−3 equates to one individual per cube with a side length of 1 BL, a density commonly observed in fish schools (64). (G) Dynamically repeated aggregation and dispersion by changing dt between 0.75 and 2 BL. Black arrows indicate increases (up) and decreases (down) in dt.

    Photo credit (A): Uxbona/wiki/File:Maldives_Surgeonfish,_Acanthurus_leucosternon.jpg; used under CC BY 3.0
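
Panel (C)'s force law can be written down directly from the caption: with a = 2b, the potential minimum sits exactly at the target distance. In the sketch below, the potential depth DEPTH and the helper next_move are illustrative assumptions; only the exponents a = 12, b = 6 and the sign behavior around dt come from the figure.

```python
import numpy as np

A, B = 12, 6    # standard Lennard-Jones exponents, as in Fig. 4C
DEPTH = 1.0     # assumed potential depth; it only scales the move vector

def lj_force(d, d_target):
    """Signed force from the first derivative of a Lennard-Jones potential.

    Because a = 2*b, the potential minimum sits exactly at d_target,
    giving f = 0 at d = d_target, f < 0 (repulsion) below it, and f > 0
    (attraction) above it, as described in Fig. 4C.
    """
    r = d_target / d
    return (DEPTH / d) * (-A * r**A + 2 * B * r**B)

def next_move(own_pos, neighbor_positions, d_target):
    """Weighted average of per-neighbor forces (Fig. 4B), a hedged sketch.

    Each visible neighbor contributes its signed force along the unit
    vector pointing toward it; the mean is the robot's next move vector.
    """
    own = np.asarray(own_pos, dtype=float)
    contributions = []
    for p in neighbor_positions:
        offset = np.asarray(p, dtype=float) - own
        d = np.linalg.norm(offset)
        contributions.append(lj_force(d, d_target) * offset / d)
    return np.mean(contributions, axis=0)

# Dispersion vs. aggregation is just a change of target distance:
# d_target = 2.0 spreads the group out, d_target = 0.75 pulls it together
# (distances in body lengths, BL, as in Fig. 4, D to G).
```
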
  • Fig. 5 Self-organized dynamic circle formation.

    (A) A school of barracudas milling. (B) Dynamic circle formation with binary sensors: A robot turns clockwise if no other robots are present within a predefined segment view (orange) and counterclockwise if at least one robot is present (blue). The emergent circle radius R is determined by the angle of view α and the number of robots N with approximately circular bodies of radius ρ (Eq. 3). (C) Dynamic circle formation on Blueswarm (arrows indicate robot headings). (D to F) Experimental data from 3D dynamic circle formation with seven Bluebots, where each Bluebot maintains a preferred depth, resulting in a cylindrical shape: (D) trajectories of all robots, (E) distances to centroid of all robots (colors) and their mean (black, dashed), and (F) depths of all robots. (G) Addition and removal of robots during a continuous 2D experiment, demonstrating robustness of the formation process and emergent adjustment of circle radius R to the number of robots N.

    Photo credit (A): iStock
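
The binary controller of panel (B) is simple enough to simulate in 2D. This sketch assumes a forward-facing sector of full angle ALPHA, a constant turn rate, and constant speed; none of these values are the authors', and counterclockwise is taken as the positive turning direction.

```python
import numpy as np

ALPHA = np.radians(25.0)      # assumed full angle of the segment sensor
TURN_RATE = np.radians(40.0)  # assumed constant turning rate, rad/s
SPEED = 0.10                  # assumed constant forward speed, BL/s

def sees_neighbor(pos, heading, others):
    """Binary sensor: is any robot inside the forward sector of angle ALPHA?"""
    for q in others:
        offset = q - pos
        rel = np.arctan2(offset[1], offset[0]) - heading
        rel = (rel + np.pi) % (2 * np.pi) - np.pi  # wrap to [-pi, pi]
        if abs(rel) <= ALPHA / 2:
            return True
    return False

def milling_step(positions, headings, dt=0.1):
    """One synchronous update of the binary controller in Fig. 5B (2D).

    Sensor triggered -> turn counterclockwise (positive sign); sensor
    empty -> turn clockwise. With constant forward speed, a circle whose
    radius is set by ALPHA and the number of robots emerges (Eq. 3).
    """
    for i in range(len(positions)):
        others = [positions[j] for j in range(len(positions)) if j != i]
        sign = 1.0 if sees_neighbor(positions[i], headings[i], others) else -1.0
        headings[i] += sign * TURN_RATE * dt
        positions[i] = positions[i] + SPEED * dt * np.array(
            [np.cos(headings[i]), np.sin(headings[i])])
    return positions, headings
```
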
  • Fig. 6 Search operation composed from multiple behaviors.

    (A to D) Experimental validation: (A) Seven Bluebots were deployed centrally and searched for a red-light source in the bottom left corner of the tank. Robots switched between three behaviors: search, gather, and alert, indicated by blue, green, and yellow, respectively, in the diagrams. (B) Initially, the robots dispersed to cooperatively locate the source. The first robot to detect the source switched to alert behavior, maintaining its position and flashing its LEDs (15 Hz). (C) Other robots close to the source also detected it and switched to alert. Farther away, robots that detected the flashing LEDs switched to gather, turning off their own LEDs and moving toward the flashing robots. (D) The experiment concluded when all robots had found the red-light source. (E) The timing of events shows the cascade of spreading information. The first robot detected the source after 20 s of controlled dispersion. Within 10 s, all other robots noticed its alert and started migrating toward the flashing LEDs. Incoming robots that detected the light source started flashing as well, reinforcing the alert signal. All robots had surrounded the source after 90 s. (F) During the search, all robots acted according to the same finite-state machine (nbr, neighbor).
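
The finite-state machine in panel (F) can be reconstructed, with caveats, from the caption alone. The transition rules below are a hedged reading: detecting the source always wins, a flashing neighbor pulls a searching robot into gather, and alert is treated as absorbing for the duration of the experiment; the authors' exact guards may differ.

```python
from enum import Enum, auto

class Behavior(Enum):
    SEARCH = auto()  # blue: disperse and scan for the red-light source
    GATHER = auto()  # green: LEDs off, move toward flashing neighbors
    ALERT = auto()   # yellow: hold position, flash LEDs at 15 Hz

def transition(state, sees_source, sees_flashing_nbr):
    """One FSM step per robot, reconstructed from the Fig. 6 caption.

    Assumptions: ALERT is absorbing until the experiment ends, and a
    robot in GATHER keeps gathering even if it momentarily loses sight
    of the flashing neighbors.
    """
    if state is Behavior.ALERT or sees_source:
        return Behavior.ALERT
    if sees_flashing_nbr or state is Behavior.GATHER:
        return Behavior.GATHER
    return Behavior.SEARCH
```
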

  • Fig. 7 Dynamic circle formation geometries.

    (A) A regular polygon configuration with field-of-view sensors. Only two robots are shown for simplicity; additional robots lie on each vertex of the polygon. (B) Geometry for calculating the milling radius R with field-of-view sensors (γ, interrobot distance). (C) Scaling intuition based on Eq. 3 for circle radii R with nominal parameters (blue) and doubling of robots N (red), robot size ρ (yellow), and viewing angle α (purple), respectively.
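
Because Eq. 3 is not reproduced in this excerpt, the scaling comparison of panel (C) can be expressed as a small harness that accepts any closed-form radius model; the nominal values for the number of robots, ρ, and α below are placeholders, not the authors' parameters.

```python
def milling_radius_scaling(radius_fn, n_robots=7, rho=0.05, alpha=0.44):
    """Tabulate the Fig. 7C comparison for a supplied radius model.

    radius_fn(n_robots, rho, alpha) should implement Eq. 3 from the paper
    (not reproduced here). The defaults (7 robots, rho in meters, alpha
    in radians) are placeholder nominal values for illustration only.
    """
    return {
        "nominal": radius_fn(n_robots, rho, alpha),
        "2x robots N": radius_fn(2 * n_robots, rho, alpha),
        "2x robot size rho": radius_fn(n_robots, 2 * rho, alpha),
        "2x viewing angle alpha": radius_fn(n_robots, rho, 2 * alpha),
    }
```
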

Supplementary Materials

  • robotics.sciencemag.org/cgi/content/full/6/50/eabd8668/DC1

    Section S1. The Bluebot 3D vision system.

    Section S2. Controlled dispersion.

    Section S3. Dynamic circle formation and milling.

    Section S4. Multibehavior search experiments.

    Section S5. 3D tracking.

    Fig. S1. Additional components of the Bluebot platform.

    Fig. S2. Example scenarios for the blob detection algorithm.

    Fig. S3. Time complexity of rapid LED blob detection at an image resolution of 256 × 192.

    Fig. S4. Camera and robot coordinate systems.

    Fig. S5. x and y error in the Bluebot’s position estimation for 24 cases.

    Fig. S6. Error in the z direction as a percentage of distance.

    Fig. S7. Overall (x,y,z) error as a function of distance.

    Fig. S8. Interrobot distances during single dispersion/aggregation.

    Fig. S9. Interrobot distances during repeated dispersion/aggregation.

    Fig. S10. Number of visible robots during controlled dispersion.

    Fig. S11. Target distance and number of robots influence interrobot distances during dispersion.

    Fig. S12. Seven robots arranged in a regular polygonal formation with line-of-sight sensors tangential to each other.

    Fig. S13. Expanded view of the red quadrilateral in fig. S12.

    Fig. S14. Example configuration of two robots to show that the regular polygon formation of Theorem 1 is the only stable formation.

    Fig. S15. A pathological initial configuration for the case r_0 > R.

    Fig. S16. A regular polygon configuration with field-of-view sensors (cf. fig. S12 for line-of-sight sensors).

    Fig. S17. Geometry for calculating the milling radius with field-of-view sensors.

    Fig. S18. Tank setup.

    Fig. S19. Schematic side view of the tank showing the overhead camera and the salient parameters.

    Fig. S20. Schematic of the tank view as seen from the overhead camera.

    Movie S1. Introduction to Blueswarm.

    Movie S2. Collective organization in time.

    Movie S3. Collective organization in space.

    Movie S4. Dynamic circle formation.

    Movie S5. Multibehavior search operation.

    Movie S6. Summary and highlights.
