Research Article | ANIMAL LOCOMOTION

Automatic tracking of free-flying insects using a cable-driven robot


Science Robotics  10 Jun 2020:
Vol. 5, Issue 43, eabb2890
DOI: 10.1126/scirobotics.abb2890

Abstract

Flying insects have evolved efficient strategies to navigate natural environments. Yet, studying them experimentally is difficult because of their small size and high speed of motion. Consequently, previous studies were limited to tethered flights, hovering flights, or restricted flights within confined laboratory chambers. Here, we report the development of a cable-driven parallel robot, named lab-on-cables, for tracking and interacting with a free-flying insect. In this approach, cameras are mounted on cables, so as to move automatically with the insect. We designed a reactive controller that minimizes the online tracking error between the position of the flying insect, provided by an embedded stereo-vision system, and the position of the moving lab, computed from the cable lengths. We validated the lab-on-cables with Agrotis ipsilon moths (ca. 2 centimeters long) flying freely at up to 3 meters per second. We further demonstrated, using prerecorded trajectories, the possibility of tracking other insects such as fruit flies or mosquitoes. The lab-on-cables is relevant to free-flight studies and may be used in combination with stimulus delivery to assess sensory modulation of flight behavior (e.g., pheromone-controlled anemotaxis in moths).

INTRODUCTION

Despite their miniaturized brains, insects are not simple reflex automata; instead, they exhibit a rich behavioral repertoire. Drosophila melanogaster, for example, a tiny insect weighing half of a milligram, has only 100,000 neurons (a million times fewer than the human brain), and this reduced number of neurons does not prevent the fly from achieving sensory processing and flight maneuvers that are unmatched by current technology (1, 2). Understanding how miniaturized insect brains control sensory processing and flight behavior could serve as a source of inspiration for future developments in robotics, e.g., micro-aerial vehicles mimicking flapping flight at the insect scale (3, 4) or olfactory robots inspired by odor tracking in moths (5).

For decades, researchers have developed laboratory experimental setups to study and understand the flight behavior of insects. There are two approaches: one consists of maintaining the insect in position so that flight kinematics can be analyzed in great detail with high-speed cameras, and the other considers more natural conditions, that is, free flight.

Restraining animal movements can be done in two ways: by taking advantage of the specificity of particular insects, e.g., the ability of hawkmoths to hover in front of artificial flowers while feeding on nectar (Fig. 1A) (6, 7), or by physically constraining the insect with a tether, e.g., (8) (Fig. 1C). Both techniques have critical limitations. The former is restricted to studying a steady condition, that is, hovering flight, whereas a rigid tether alters flight dynamics because it does not allow roll or pitch motions.

Fig. 1 Experimental setups for studies with flying insects.

(A) Hovering flight of a Manduca sexta (100-mm wingspan) in front of an artificial flower (6, 7). Photo credit: Kiley Riffell Photography. (B) Virtual reality flight arena (Drosophila melanogaster) (12). Photo credit: IMP/IMBA Graphics Department, https://strawlab.org/freemovr. (C) Tethered moth (Agrotis ipsilon) from our lab. Photo credit: Patrice Latron Photography. (D) Lab-on-cables (A. ipsilon moths) from our lab. All photos used with permission.

In free-flight approaches, a few studies have attempted to record muscle activities with miniature electrophysiology devices carried by the insect during flight (9). Yet, they have had limited success because the carried load affects flight performance. This approach is thus restricted to the largest insects (e.g., hawkmoths with body mass from 1 to 3 g), which can carry loads of up to 10% of their body weight. Wind tunnels (ca. 1 to 2 m long and 30 cm wide) equipped with motion capture are common tools for recording the trajectories of insects in free flight. They have been used to characterize the attraction of insects to turbulent odor plumes, such as the anemotactic response of male moths to sexual pheromone; see reviews (10, 11). Flight paths are typically zigzags along the wind axis with decreasing amplitude as the insect approaches the source. Yet, these are only average or indirect observations because wind tunnels do not allow the precise delivery of stimuli in space and time, and the use of external cameras positioned far away from the insect prevents any fine-grained analysis of flight behavior. Consequently, synchronous measurement of the airborne cues actually encountered by the animal at any point of its displacement and of the animal's orientation toward the source is difficult to achieve. Recently, Stowers et al. (12) used virtual reality to manipulate the optomotor response of the insect so that the flight is kept within a limited volume and can be tracked with motion capture (Fig. 1B).

Thus, the vast majority of experimental research on insect flight has been limited to the study of hovering flights (Fig. 1A), tethered flights (Fig. 1C), or restricted flights in confined environments (Fig. 1B). Here, we report a cable-driven robot, named lab-on-cables, that endeavors to overcome these limitations. In our approach, the lab equipment is mounted on cables so that it can move along with the animal (Fig. 1D). Cable robots belong to a special class of parallel robots in which rigid links are replaced by flexible cables (13). They offer several advantages, such as the possibility of moving objects with high precision in a large workspace. Well-known applications include the SpiderCam or SkyCam (14), a suspended camera moving over stadiums, and the RoboCrane (15), a robotic crane used on construction sites. Cable robots have also been used as motion simulators for humans (16) and motion generators in wind tunnels (17). The task of moving lab rigs along with a flying insect is challenging because the insect trajectory is not known in advance and the flight speed can be high (several meters per second) (18). Here, we describe the design of the lab-on-cables that fulfills this task and its experimental validation.

RESULTS

Lab-on-cables implementation

We built the lab-on-cables, a 6–degree of freedom (DOF) cable robot that tracks flying insects in a 6 m long by 4 m wide by 3 m high workspace (Fig. 2A and Movie 1). Here, we sketch the main ideas of the design and refer the reader to Materials and Methods for the details. The end effector, called the flying frame, is an open cube (edge length of 30 cm) in which the insect can fly freely. To follow the insect trajectory, we mounted the flying frame on cables operated by motorized winches (Fig. 2, B and C). An optical system, which computed online the three-dimensional (3D) position XT of the insect using infrared (IR) illumination and calibrated cameras, was integrated onto the flying frame. A control strategy that consists of chasing the current target position is risky and can lose track because insects may fly at relatively high speed. Instead, the control scheme takes into account the direction of motion of the insect to anticipate its future location and point toward it (Fig. 2D). This strategy can be seen as a type of deviated pursuit used in missile guidance (19). One major difference, however, is that a missile aiming at target interception flies at its maximum speed, whereas the lab-on-cables tracking an insect adjusts its speed continuously. The tracking speed can be written as

$$V = K_p (X_T - X) + K_d \dot{X}_T \quad (1)$$

where X is the position of the robot (center of the flying frame), $\dot{X}_T$ is the insect velocity (time derivatives are indicated with a dot throughout the paper), and Kp and Kd are proportional and derivative gains.
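For concreteness, Eq. 1 amounts to only a few lines of code. The sketch below is a minimal, translation-only version; the finite-difference velocity estimate and the function name are our assumptions (the paper does not specify the estimator), with the gains set to those later used for the prerecorded trajectories (Kp = 8.4, Kd = 1).

```python
import numpy as np

def tracking_speed(x_robot, x_target, x_target_prev, dt, kp=8.4, kd=1.0):
    """Deviated-pursuit tracking speed, Eq. 1.

    The insect velocity is estimated here by finite differences over
    one control cycle; this is an assumption, not the authors' code.
    """
    v_target = (x_target - x_target_prev) / dt      # estimated insect velocity
    return kp * (x_target - x_robot) + kd * v_target

# One 10-ms control cycle: robot at the frame center, insect slightly ahead
v = tracking_speed(np.array([0.00, 0.00, 1.50]),    # robot position X
                   np.array([0.05, 0.02, 1.50]),    # insect position XT (now)
                   np.array([0.03, 0.01, 1.50]),    # insect position (10 ms ago)
                   dt=0.01)
```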

Movie 1. Lab-on-cables at 1 m/s superimposed with long-exposure photo of robot movement.
Fig. 2 Lab-on-cables setup.

(A) Photo of the cable robot (6 m by 4 m by 3 m) and (B) schematic view of the setup. (C) Photo of the flying frame (30 cm by 30 cm by 30 cm). The flying frame supports the lab equipment, i.e., an IR source and a pair of calibrated cameras (Pixy cam 1 and 2), for online insect location. The flying frame moves automatically to keep the insect within the detection range of the cameras. (D) Robot control as deviated pursuit. Robot and insect locations are X and XT, respectively. The insect velocity is denoted $\dot{X}_T$. The tracking speed V is the sum of a pure pursuit term pointing along the line of sight (LOS) toward the current target location and a corrective term that takes into account the direction of motion of the target. In effect, the robot anticipates where the target will be, pointing ahead of it and covering less distance.

Experimental validation of control design

We first performed robotic experiments with insect trajectories that had been previously recorded in wind tunnels (20, 21). The trajectories corresponded to various insects flying in attractive odor plumes: fruit flies (Drosophila melanogaster, n = 169 insects) with ethanol plume (20), mosquitoes (Aedes aegypti, n = 65 insects) with CO2 plume (20), and moths (Agrotis ipsilon, n = 38 insects) with pheromone plume (21). The objective in replaying prerecorded trajectories was twofold: to conduct pretests with different insects and to bypass the embedded detection system to validate the control design of the robot.

Figure 3 (A to C, left) shows three examples of trajectories tracked by the robot (control gains Kp = 8.4 and Kd = 1). The robot was programmed to reach, every 10 ms, a new target position that was read sequentially from the data file of the prerecorded trajectory. To test the effectiveness of the robot, we computed the tracking error for all points in all trajectories as the Euclidean distance between the target (flying insect) and the end effector (center of the flying frame). Overall, the tracking error increased with the speed of the insect, so that small errors were made at low speed and large errors at high speed (Fig. 3, A to C, right). The connected points in Fig. 3 (A to C, right) indicate the evolution of the tracking error along the trajectories. We noted, in some trajectories, that large residual errors experienced at high speed persisted from one operating cycle to the next (i.e., 10 ms) when the insect slowed down. This observation suggests some limitations for the robot in terms of speed and acceleration. To assess these limits, we used speed-step control inputs and analyzed the transient response of the robot. We estimated the robot limits at 3.6 m/s and 17 m/s².
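A minimal sketch of how these per-sample quantities could be computed from synchronized position logs; the helper and its use of np.gradient for the speed are hypothetical, not the authors' code.

```python
import numpy as np

def error_and_speed(insect_xyz, robot_xyz, dt=0.01):
    """Euclidean tracking error and insect flight speed per sample.

    insect_xyz, robot_xyz: (N, 3) positions sampled every dt seconds
    (the robot's 10-ms operating cycle).
    """
    error = np.linalg.norm(insect_xyz - robot_xyz, axis=1)              # meters
    speed = np.linalg.norm(np.gradient(insect_xyz, dt, axis=0), axis=1) # m/s
    return error, speed
```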

Fig. 3 Tracking prerecorded insect trajectories.

(A to C) Left: Examples of insect trajectories (red curves) versus robot trajectories (black curves). Right: Tracking errors of the robot (in centimeters) versus flight speed of the insect (in meters per second). (D) Cumulative distributions of the tracking error for experimental data (plain curves) and theoretical gamma distribution (dashed curves). (E) Tracking errors of the simulated model (Euler integration of Eq. 1) versus flight speed of the insect (moth data).

For all species, the error distribution is well described by a gamma distribution with fitted shape and scale parameters. The cumulative distributions plotted in Fig. 3D for the various insects indicate that, in more than 90% of the cases, the tracking error was less than 1 cm, which is small compared with the length of the flying frame (30 cm). The experimental tracking errors could be reproduced in simulation by incorporating the speed and acceleration limits into a simple model of the robot (compare Fig. 3, E and C, right). The simulated model merely consisted of integrating Eq. 1 using the Euler method. The fact that Eq. 1 can account for the behavior of the robot validates the control design. Among the three species, the trajectories of A. ipsilon moths are the most difficult to follow; see the intermittent large errors (ca. 10 cm) due to high flight speeds (ca. 5 m/s) in Fig. 3C (right). In the next section, we use the lab-on-cables to track free-flying A. ipsilon moths.
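The simulated model can be sketched as follows, with the commanded motion saturated by the estimated limits (3.6 m/s and 17 m/s²). The paper does not specify how the limits were incorporated, so the clipping scheme below, like the gamma fit in the trailing comment, is our assumption.

```python
import numpy as np
from scipy import stats

V_MAX, A_MAX = 3.6, 17.0     # estimated robot limits (m/s, m/s^2)

def simulate_robot(insect_xyz, dt=0.01, kp=8.4, kd=1.0):
    """Euler integration of Eq. 1 with speed/acceleration saturation.

    insect_xyz: (N, 3) prerecorded insect positions sampled every dt.
    """
    x = insect_xyz[0].copy()                  # robot starts on the insect
    v = np.zeros(3)
    v_insect = np.gradient(insect_xyz, dt, axis=0)
    path = []
    for xt, vt in zip(insect_xyz, v_insect):
        v_cmd = kp * (xt - x) + kd * vt       # Eq. 1
        dv = v_cmd - v                        # saturate acceleration ...
        a = np.linalg.norm(dv) / dt
        if a > A_MAX:
            dv *= A_MAX / a
        v = v + dv
        s = np.linalg.norm(v)                 # ... then speed
        if s > V_MAX:
            v *= V_MAX / s
        x = x + v * dt
        path.append(x.copy())
    return np.asarray(path)

# Tracking error and a gamma fit to its distribution (cf. Fig. 3D):
# errors = np.linalg.norm(insect_xyz - simulate_robot(insect_xyz), axis=1)
# shape, loc, scale = stats.gamma.fit(errors, floc=0)
```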

Tracking free-flying A. ipsilon moths and analysis of flight kinematics

A prerequisite to tracking flying insects is the takeoff procedure depicted in Fig. 4A (see also movie S1). The insect is gently placed onto a takeoff platform (2 cm²) positioned near the center of the flying frame, and the robot starts tracking immediately. If spontaneous takeoff does not occur within 5 min, insect flight is initiated by thermal stimulation (Peltier element on the takeoff platform). After takeoff, an electromagnet releases the platform, which drops away so as not to hamper the tracking of the insect by the robot. We did not notice any difference between spontaneous takeoffs and those provoked by thermal stimulation. In either case, A. ipsilon moths jumped from the platform by using both their wings and their legs to power the takeoff. This flight initiation is similar to one of the jumping strategies identified for moths in (22).

Fig. 4 Tracking A. ipsilon moths.

(A) Takeoff procedure. (B) Tracking of free flights (n = 32 moths). Top: Flight length versus flight duration fitted by linear regression (Pearson correlation R² = 0.79). The colored points are for the six moths used in the analysis of the flight kinematics. Bottom: Cumulative density function (CDF) of the insect flight speed during tracking.

We performed tracking experiments with n = 32 moths. During flight, the insect is located online by the embedded optical system, and therefore, in contrast to the previous experiments, the target position is noisy. To improve stability in the presence of noise, we reduced the proportional and derivative gains to Kp = 3 and Kd = 0.9. The flight trajectories ranged from 0.5 to 3 m (Fig. 4B). The cumulative distribution in Fig. 4B indicates that the insect speed during flight is lower than 3 m/s in 99% of the cases.

For n = 6 moths (shown as colored points in Fig. 4B), we recorded images at 400 frames per second (fps) (i.e., 10 times the wingbeat frequency of A. ipsilon moths) with a high-speed camera mounted on the flying frame (Fig. 5A, left). To extract the body kinematics from the video sequences, we modeled the A. ipsilon moth in 3D using Blender, a computer graphics software (Fig. 5A, right). The model was rigged to be fully articulated; i.e., the position and rotation of body and wings could be set freely under the constraint of symmetry between the left and right wings. The position and rotation of body and wings were optimized semi-automatically, so as to match those of the real insect in each image of the recorded video sequences (see Materials and Methods). From the fitted model, we extracted the body kinematics represented by the body angle α and the stroke plane angle β with respect to the horizontal plane, as depicted in Fig. 5A (right). The stroke plane was identified by least-squares fitting from the positions of the wing base and tip during an entire wingbeat (i.e., approximately 10 consecutive 3D points).
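As an illustration, such a least-squares plane fit can be implemented with an SVD; this formulation is our choice, as the paper only states that a least-squares fit was used.

```python
import numpy as np

def plane_angle_to_horizontal(points):
    """Least-squares plane fit to 3D points (wing base and tip over one
    wingbeat, ~10 samples at 400 fps for a ~40-Hz wingbeat) and its
    inclination to the horizontal, in degrees.
    """
    p = np.asarray(points, dtype=float)
    centered = p - p.mean(axis=0)
    _, _, vt = np.linalg.svd(centered)
    normal = vt[-1]                   # direction of least variance = plane normal
    # plane-to-horizontal angle = angle between the normal and the vertical
    cos_tilt = abs(normal[2])         # rows of vt are unit vectors
    return np.degrees(np.arccos(np.clip(cos_tilt, 0.0, 1.0)))
```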

Fig. 5 Analysis of flight kinematics.

(A) A high-speed camera is mounted on the flying frame, and the pose of the 3D model of the moth is optimized semi-automatically to match that of the real insect in the recorded videos. The inclination of the body is represented by the body angle α, i.e., the angle between the longitudinal axis of the body and the horizontal plane. The angle β is between the horizontal and the stroke plane defined by the positions of the wing base and tip during an entire wingbeat. (B) Wingbeat frequency versus flight speed. (C) Body angle versus flight speed. (D) Stroke plane angle versus flight speed. In (B) to (D), the mean values ± SE represent averages within speed intervals (bin size, 0.5 m/s). (E) Body angle versus stroke plane angle.

We analyzed the kinematics of six moths representing a total of 154 wingbeats (Movie 2). The wingbeat frequency as well as the body and stroke plane angles change with flight speed. As flight speed increased from 0 to 3 m/s, the wingbeat frequency increased from 39 to 47 Hz [P < 0.05, one-way analysis of variance (ANOVA); Fig. 5B] while the insect body tended to be more horizontal, with α decreasing from 59° to 15° (P < 0.001, one-way ANOVA; Fig. 5C). The opposite is observed in Fig. 5D for the stroke plane angle, with β increasing from 25° to 57° (P < 0.001, one-way ANOVA). Body and stroke plane angles are not independent. We note in Fig. 5E that α = −β + 82° (Pearson correlation R² = 0.74), so that the angle α + β between the stroke plane and the longitudinal axis of the body is approximately constant. These results are consistent with those obtained in (6), albeit in different situations: hovering moths in the presence of a steady air flow versus moths flying freely with variable acceleration and repeated changes of direction in our study.

Movie 2. Tracking A. ipsilon moths with extraction of flight kinematics by 3D model–based matching (real moth in white and 3D model in pink).

DISCUSSION

We validated the lab-on-cables with A. ipsilon moths flying freely up to 3 m/s and measured the moth kinematics with an on-board high-speed camera. The data analysis indicates that, when flight speed increases, A. ipsilon moths pitch their body down while the stroke plane becomes more vertical, as body and stroke plane angles vary in accordance. This result is consistent with a helicopter model of insect flight (6, 23, 24), whereby flight speed is controlled by body pitch via changes in the stroke plane angle.

Measurements of insect flight with high-speed cameras are usually hampered by the trade-off between spatial resolution and field of view, which requires the insect to stay in the vicinity of the camera system. The insect is thus traditionally kept within the detection range (typically a few tens of centimeters) by imposing some sort of movement restriction, whether by means of a tether or by confining the flight space. With the lab-on-cables, the insect evolves freely in an open space, albeit with a detection range that is constrained by the size of the workspace. In its present form, the lab-on-cables was located indoors in a relatively small room, which limits the operational workspace; however, because it relies on a lightweight cable structure and mechanically simple winches, one can envisage rescaling the design to perform experiments at a larger scale and even in the wild.

When studying flying insects, we also need to bear in mind that experimental setups may lead to unnatural flight behavior. This is known for tethered insects, which tend to exaggerate their movements because they do not support their own body mass, and for insects flying in confined environments, which modulate their speed according to the distance from the wall (18). The lab-on-cables is not likely to alter the flight kinematics, because our results are consistent with previous studies in free flight. Yet, it could be argued that the flying frame might still be considered a threat by the insect. However, it is worth noting that the flying frame moves with the insect, so the situation is different from a looming stimulus produced by a rapidly approaching predator. If the flying frame had been perceived as a threat, evasive maneuvers would have been expected in the flight of the moths, but our experiments show no evidence of them. The typical escape behavior in moths, which consists in cessation of flight and dropping to the ground, was not observed in any of the 32 moths tested.

We are not aware of any apparatus or concept comparable to the lab-on-cables, which has the potential to become an important tool in the study of flying insects. The lab-on-cables exploits the advantages of cable robots to track flying insects, namely, very fast dynamics with little air disturbance and the possibility of deployment over a large workspace, thanks to the lightweight cable structure. Moreover, carrying additional equipment on board would allow stimulus delivery during flight, so that one could study sensory-driven behaviors in an unprecedented manner. A typical example would be to study pheromone-controlled anemotaxis in moths with precise control of the olfactory cues during flight. Despite the advantages of the lab-on-cables, there is still room for performance improvement in this technology. For example, the use of cameras with increased spatial and temporal resolution may prove beneficial to limit noise in the estimation of the target position. Similarly, the use of more powerful motors can push forward the maximum attainable limits in speed and acceleration (currently estimated at 3.6 m/s and 17 m/s², respectively). Such technological adaptations are feasible, because ultrafast cable robots have been reported [e.g., the Falcon robot with velocities and accelerations up to 13 m/s and 430 m/s², respectively (25)], and would be useful future additions.

MATERIALS AND METHODS

Hardware design

We built the cable robot partly from hardware provided by Haption (26). The base frame is a parallelepiped of dimensions 6 m by 4 m by 3 m (Fig. 2, A and B). With eight cables for 6 DOF, the robot is overconstrained. The cables are driven by motorized winches (Maxon DC motor RE65 with COMBIPERM P1-03 brake) located at the corners of the base frame. The cables have anchor points at the vertices of the end effector (flying frame) (Fig. 2C). As in previous designs [e.g., (27)], the cables are crossed in a way that increases the stiffness of the system. Because no cable goes through the flying frame, it represents a safe environment for the insect to fly in. The flying frame supports the laboratory rigs, that is, a stereo-vision system for online insect tracking.

The IR filter was removed from all cameras, and scene illumination was provided by an IR source. The IR source did not disturb the insect because its intensity spectrum (Fig. 6, inset), measured with a spectrometer (Ocean Optics STS-NIR, 650 to 1100 nm), is well beyond the detection range of insect photoreceptors (28). For online insect tracking, we embedded two calibrated Pixy cameras (CMUcam5) that captured images and extracted the 2D pixel coordinates (effective resolution of 320 × 200 pixels) of the insect at 50 fps. The 2D pixel coordinates from the two cameras were transmitted wirelessly to the control computer using an Arduino UNO and XBee. The 3D triangulation was performed by a least-squares fit of the intersection of the two lines derived from the 2D pixel coordinates and the 3D camera centers. The triangulation result was filtered with a constant-velocity Kalman filter (29) to provide the 3D position (xT, yT, zT) of the target (insect).
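A sketch of these two estimation steps, assuming the calibration step converting pixel coordinates to viewing rays is done elsewhere; the filter's noise parameters are placeholders to be tuned, not values from the paper.

```python
import numpy as np

def triangulate(c1, d1, c2, d2):
    """Least-squares intersection of two viewing rays.

    c1, c2: 3D camera centers; d1, d2: unit direction vectors derived
    from the 2D pixel coordinates through the camera calibration (not
    shown). Returns the 3D point minimizing the summed squared distance
    to both lines.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for c, d in ((c1, d1), (c2, d2)):
        P = np.eye(3) - np.outer(d, d)    # projector orthogonal to the ray
        A += P
        b += P @ c
    return np.linalg.solve(A, b)

class ConstantVelocityKF:
    """Constant-velocity Kalman filter on (position, velocity) in 3D,
    run at the 50-fps camera rate."""

    def __init__(self, dt=0.02, q=1.0, r=1e-4):
        self.x = np.zeros(6)                               # [pos, vel]
        self.P = np.eye(6)
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)                    # pos += vel * dt
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])  # observe position
        self.Q = q * np.eye(6)
        self.R = r * np.eye(3)

    def step(self, z):
        # predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # update with the triangulated position z
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x[:3]                                  # filtered (xT, yT, zT)
```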

Control design

The control design of the robot follows classical methods for cable robots, as described in (13). The control scheme is outlined in Fig. 6 and consists of four steps.

Fig. 6 Control scheme.

After insect detection by the IR optical system (normalized intensity spectrum provided in the inset), the control scheme of the robot consists of four steps: (i) estimation of the robot pose by using forward kinematics, (ii) computation of the tracking speed to minimize the tracking error, (iii) transformation into a winding speed vector by using the Jacobian and inverse kinematics, and (iv) conversion to a motor command with constraints on the cable tensions. The vectors $I^m = (I_1^m, \cdots, I_8^m)^T$, $\ell^m = (\ell_1^m, \cdots, \ell_8^m)^T$, and $v^m = (v_1^m, \cdots, v_8^m)^T$ are measurements of the motor currents, of the cable lengths, and of the winding/unwinding speeds, respectively.

First, the combination of position and orientation of the flying frame, referred to as the pose $X = (x, y, z, \alpha, \beta, \gamma)^T$ with (α, β, γ) representing yaw, pitch, and roll angles, is estimated from the cable lengths by solving the forward kinematic problem. This approach is preferable to a motion capture system because it does not rely on markers, i.e., retroreflective materials placed on the flying frame. Thus, it covers a larger workspace with increased robustness (no failures due to occluded markers, as in optical motion capture systems). To quantify the positioning error, we estimated the position of the flying frame from the cable lengths at each time step along random walk trajectories and compared it with ground truth data provided by a motion capture system (Qualisys Mocap, 6 Oqus 700+ cameras). The positioning error was 1.5 ± 0.6 cm, which is small compared with the length of the flying frame (i.e., 30 cm).
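For illustration, the forward kinematic problem can be solved numerically as a nonlinear least-squares fit of the measured cable lengths. The sketch below simplifies the geometry to translation only (consistent with the controller regulating orientation to zero) and pairs anchors and frame vertices in index order, ignoring the crossed routing; the authors' full 6-DOF solution follows (13) and their supplementary text.

```python
import numpy as np
from itertools import product
from scipy.optimize import least_squares

# Illustrative geometry: winch anchors at the base-frame corners
# (6 m x 4 m x 3 m) and attachment points at the flying-frame vertices
# (0.3-m cube, relative to its center). The real crossed-cable routing
# pairs anchors and vertices differently.
A = np.array(list(product((0.0, 6.0), (0.0, 4.0), (0.0, 3.0))))   # (8, 3)
B = 0.15 * np.array(list(product((-1, 1), (-1, 1), (-1, 1))))     # (8, 3)

def cable_lengths(p):
    """Cable lengths for frame-center position p, orientation kept at zero."""
    return np.linalg.norm(A - (p + B), axis=1)

def forward_kinematics(l_measured, p0=np.array([3.0, 2.0, 1.5])):
    """Position whose predicted cable lengths best match the measurements."""
    return least_squares(lambda p: cable_lengths(p) - l_measured, p0).x

# Round trip: lengths computed at a known position are inverted back
p_true = np.array([2.0, 1.0, 1.2])
p_est = forward_kinematics(cable_lengths(p_true))   # ~ p_true
```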

Second, the tracking speed is determined by a controller acting on the tracking error, defined as the distance between the target (flying insect) and the end effector (center of the flying frame). Here, we consider the target pose to be $X_T = (x_T, y_T, z_T, 0, 0, 0)^T$, because the aim of the controller is to track the insect position while maintaining a zero orientation (flying frame aligned with the base frame). The tracking speed is given by Eq. 1 with diagonal gain matrices $\mathrm{diag}(K_p, K_p, K_p, 1, 1, 1)$ and $\mathrm{diag}(K_d, K_d, K_d, 0, 0, 0)$. The proportional and derivative gains Kp and Kd were set by using a Ziegler-Nichols initial estimation (30) followed by manual fine-tuning.

Third, the inverse kinematic model allows us to convert the tracking speed into a (winding/unwinding) speed vector for the winches, which can be written as $\dot{\ell} = J\,V$, with J being the Jacobian matrix. The Jacobian matrix can be seen as a local linearization of the system at the current pose X. Its expression is given in the supplementary text.
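For taut, straight cables and pure translation, the rows of J are simply the unit vectors along the cables, as in the sketch below; the orientation columns are omitted, and the sign convention is our choice.

```python
import numpy as np

def cable_jacobian(p, A, B):
    """Translational part of the Jacobian J (8 x 3) for taut, straight
    cables: row i is the unit vector from the base anchor A[i] toward
    the attachment point p + B[i] on the frame, so that l_dot = J @ V.
    """
    d = (p + B) - A                                    # cable vectors
    return d / np.linalg.norm(d, axis=1, keepdims=True)

# winding/unwinding speed command for a desired frame velocity V:
# l_dot = cable_jacobian(p, A, B) @ V
```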

Fourth, the winding/unwinding speed vector is converted into a motor speed command with constraints on the cable tensions. It is necessary to ensure sufficient tension in the cables to prevent sagging (as the forward and inverse kinematics assume taut cables) while avoiding, at the same time, an excessive tension that could lead to cable breakage. This optimization problem is framed as a quadratic programming (QP) problem (see the supplementary text). The motor speed command is sent to the motorized winches every 10 ms.
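A sketch of such a tension-distribution QP: among tension vectors t that balance the external wrench (W t = w, with W the wrench matrix), choose the one closest to a mid-range tension, within bounds. The cost, bounds, and solver below are placeholder assumptions; the paper's exact formulation is in its supplementary text.

```python
import numpy as np
from scipy.optimize import minimize

def distribute_tensions(W, w, t_min=5.0, t_max=100.0):
    """Cable tensions t satisfying W @ t = w within [t_min, t_max],
    chosen closest to the mid-range tension."""
    m = W.shape[1]
    t_mid = np.full(m, 0.5 * (t_min + t_max))
    res = minimize(
        lambda t: np.sum((t - t_mid) ** 2),           # stay near mid-range
        t_mid,                                        # initial guess
        jac=lambda t: 2.0 * (t - t_mid),
        bounds=[(t_min, t_max)] * m,
        constraints={"type": "eq", "fun": lambda t: W @ t - w},
        method="SLSQP",
    )
    return res.x
```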

The methods used for solving the forward and inverse kinematics problems of the lab-on-cables and for determining the suitable tension in the cables are derived from a textbook (13). They are detailed in the supplementary text.

Extraction of flight kinematic variables by 3D-model-based matching

For n = 6 A. ipsilon flights, we recorded images at 400 fps with an embedded high-speed camera (FPS 4000 with a 22-mm lens). To extract the flight kinematics from the video sequences, we modeled the A. ipsilon moth in 3D using Blender (Fig. 5A). The model was rigged to be fully articulated; i.e., the position and rotation of body and wings could be set freely. We also modeled the flying frame and used the same parameters (position and focal length) for the Blender camera as those of the high-speed camera. The video sequences were also preprocessed by increasing brightness and contrast. All these adjustments ensured a relatively realistic render that closely matched the real video sequence. The next step was to achieve a semi-automatic shape matching between the 3D model and the real moth in the video sequence. To process the video sequence, we applied the 3D position of the insect provided by the Pixy cameras to the model every 10 frames. These positions were then manually corrected if needed, and the remaining frames were interpolated with Bézier curves. The rotation of the body and wings was set manually until the superimposed model matched the insect in the frame. We considered that the wings always remain symmetrical. The flying frame was also animated to reproduce the real robot movement. Once matching was achieved, the whole video sequence could be rendered and the model parameters could be extracted (positions of the head, tail, center of mass, and base and tip of both wings); see Movie 2.

SUPPLEMENTARY MATERIALS

robotics.sciencemag.org/cgi/content/full/5/43/eabb2890/DC1

Text S1. Details on the control of the robot.

Fig. S1. Geometry of the lab-on-cables.

Movie S1. Insect takeoff.

References (31, 32)

REFERENCES AND NOTES

Acknowledgments: We thank M. Renou for providing the prerecorded moth trajectories and S. Sivakumar for help on visual target detection. Funding: This research was supported by the French research program CPER Cyberentreprises (2015–2020), with participation from the Lorraine region and FEDER, and by a PEPS INS2I project from the CNRS. Author contributions: R.P., M.B., and D.M. designed the robot. R.P. and M.B. designed the control. M.J. designed the 3D model of the moth. R.P., P.L., and D.M. conducted the experiments. R.P., M.J., and D.M. analyzed and interpreted the data. R.P., M.J., and D.M. wrote the original draft. R.P., M.J., M.B., P.L., and D.M. revised the manuscript. M.B. and D.M. provided funding and supervised the project. Competing interests: The authors declare that they have no competing interests. Data and materials availability: All data needed to evaluate the conclusions in the paper are present in the paper or the Supplementary Materials.
