Research Article | SENSORS

Wireless steerable vision for live insects and insect-scale robots


Science Robotics  15 Jul 2020:
Vol. 5, Issue 44, eabb0839
DOI: 10.1126/scirobotics.abb0839

Abstract

Vision serves as an essential sensory input for insects but consumes substantial energy resources. The cost to support sensitive photoreceptors has led many insects to develop high visual acuity in only small retinal regions and evolve to move their visual systems independent of their bodies through head motion. By understanding the trade-offs made by insect vision systems in nature, we can design better vision systems for insect-scale robotics in a way that balances energy, computation, and mass. Here, we report a fully wireless, power-autonomous, mechanically steerable vision system that imitates head motion in a form factor small enough to mount on the back of a live beetle or a similarly sized terrestrial robot. Our electronics and actuator weigh 248 milligrams and can steer the camera over 60° based on commands from a smartphone. The camera streams “first person” 160 pixels–by–120 pixels monochrome video at 1 to 5 frames per second (fps) to a Bluetooth radio from up to 120 meters away. We mounted this vision system on two species of freely walking live beetles, demonstrating that triggering image capture using an onboard accelerometer achieves operational times of up to 6 hours with a 10–milliamp hour battery. We also built a small, terrestrial robot (1.6 centimeters by 2 centimeters) that can move at up to 3.5 centimeters per second, support vision, and operate for 63 to 260 minutes. Our results demonstrate that steerable vision can enable object tracking and wide-angle views using 26 to 84 times less energy than moving the whole robot.

INTRODUCTION

Vision provides a means to perceive the world at a distance; for animals, it gives crucial information used for navigation, communication, finding food, mating, and detecting threats. Animals of all sizes from insects to humans have evolved visual systems specific to their needs. In contrast, for robots, vision has been limited to larger systems, and integrating cameras onto small, insect-scale robots remains challenging (1). In this work, we sought to design a robotic vision system that is competitive with similarly sized natural systems, such as those found in insects.

A naïve approach to building a vision system for small robots would be to leverage advances in miniaturization made for smartphone cameras. This would seem an intuitive choice due to their small size and megapixel resolution (2); however, the processing and energy requirements needed to support these cameras necessitate powerful processors and prohibitively large batteries. We instead look to biology and explore the trade-offs that evolution has made in the visual systems of insects to inform the design of vision for insect-scale robotics.

The fraction of resources devoted to vision varies between animals. Visual systems range from simple eyespots composed of two cells found in zooplankton larvae (3) to complex eyes capable of seeing color and high resolution found in larger animals like humans. In an adult human, the eyes and the visual cortex of the brain account for around 0.6% of body mass (4, 5). In contrast, the visual systems of insects like flies have lower resolution than those of large animals like humans but can account for 2.5 to 13% of body mass (6). Furthermore, supporting these large organs represents a substantial energetic cost—the retina of a blowfly alone consumes 8% of its resting metabolism (7). The energy cost to support the sensitive photoreceptors has led many insects to develop high visual acuity in only small retinal regions (8–10).

To compensate for a smaller visual field, insects have evolved to move their visual system independent of the body through head motion. Blowflies (10), moths (9), mantids (11), locusts (12), and many others move their visual system to expand their field of view by dynamically scanning or to maintain focus on moving objects (e.g., prey or potential mates). In contrast to moving the whole body, this adaptation allows these animals to gather more visual information in an energy-efficient manner (13). In addition, this added degree of freedom is used by some insects to guide steering as the first part of a motor program that can precede body turns (14), infer depth or motion information (9–11), or orient their gaze in a direction independent of their movement direction (15).

Electronic image sensors follow a trend similar to these biological eyes: simpler, lower-resolution image sensors have lower mass, consume less energy, and require less computation. In many scenarios of interest for small mobile cameras, however, image resolution is a limiting factor. Rather than compromising resolution or field of view, we explored the approach used in nature and designed a mechanically steerable vision system that imitates head motion. We incorporated an ultraminiature piezoelectric cantilever actuator and a microfabricated lever arm to steer the camera. By incorporating steering, our system provides much higher image resolution than is possible with a wide-angle lens covering the same visual field. We further show that our actuator is substantially more efficient than moving the body of an insect or robot, minimizes impact on battery life, and allows for maintaining fixation on moving objects.

Using this approach, we developed a fully wireless mechanically steerable vision system in a form factor small enough to mount on the back of a live beetle or a similarly sized terrestrial robot (Movie 1). On live insects, a wireless first-person view represents a capability that has not been previously demonstrated. This could be used in studies of insect behavior such as how they perceive and interact with each other as well as with their environment, outside of controlled laboratory settings. For insect-scale robots, wireless vision provides a rich source of information about the shape and texture of the environment that is commonly used in larger robots but is challenging on smaller, resource-constrained platforms.

Movie 1. Overview of a mechanically steerable vision system for insects and small robots

Building steerable wireless vision at this scale is, however, challenging in practice due to the extreme size, weight, and power limitations. In addition to being small, insects and small robots have a severely constrained payload capacity. For small robots, a heavy payload requires more energy to maintain speed, reducing their operational time. Similarly, the addition of a heavy payload can limit an insect’s ability to move. For these reasons, insect-mounted sensing and control systems have not previously demonstrated wireless vision (16–19). Furthermore, no previous insect-scale terrestrial robots have included wireless cameras. Larger bioinspired aerial robots like the 10-cm DelFly Micro, which weighs 3 g, have demonstrated wireless cameras, but their camera subsystem weighs more than a gram and consumes 200 mW for camera operation alone (20). Similarly, the small vision systems developed for wireless capsule endoscopy robots require hundreds of milliwatts (21–23). Designing a lightweight, low-power vision system could therefore have broad applications by improving battery life across multiple domains.

The requirements of these previous vision systems are also prohibitive for payload-limited robots because their camera power consumption necessitates large, high-drain batteries and exceeds the power required for locomotion in small terrestrial robots (24, 25). Although recent advances in low-power wireless systems have shown that it is possible to reduce the power consumption for video streaming using backscatter (26, 27), these solutions currently weigh tens of grams and have limited communication ranges of around 5 m before degradation in image quality. Other recent works on flapping-wing insect robots have demonstrated small-form-factor lightweight cameras but required a wire tether for computing and communication (28, 29).

In this article, we demonstrate insect-scale steerable wireless vision (see Movie 2). Specifically, we make the following key contributions:

Movie 2. Accelerometer-triggered camera

A darkling beetle (E. nigrina) is held facing a metric ruler. The insect’s motion triggers the camera to turn on and stream images to a smartphone.

1) We designed an 84-mg wireless camera system that can stream 160-by-120 monochrome video at 1 to 5 frames per second (fps) to a Bluetooth radio (e.g., a smartphone) from distances of up to 120 m using 4.4 to 18 mW of power and can receive commands over Bluetooth to control the steerable head in real time (see fig. S1).

2) We mounted this camera on a 35-mg mechanically steerable “head” capable of panning the camera over a range of 60°. This mechanism includes a piezoelectric actuator driven by an onboard 96-mg boost converter circuit to provide the required high-voltage signals. Furthermore, to enable low-power operation, the actuator is designed to hold its angle for over 1 min after being powered off.

3) We demonstrated real-time video streaming from the back of live insects (see Fig. 1A). We performed field experiments with two species of freely walking live beetles, demonstrating that triggering image capture using an onboard accelerometer achieves operational times of up to 6 hours with a 10-mAh battery.

Fig. 1 A mechanically steerable wireless camera mounted on a darkling beetle and a small robot.

(A) Wirelessly steerable camera system attached to the abdomen of a live darkling beetle. (B) A wireless, power-autonomous terrestrial robot with a steerable vision system. The camera can stream video to a smartphone, which can also command the robot to move and pan its camera left or right. (C) Exploded view showing all of the components of the steerable camera system including the Bluetooth chip, camera and optics, robotic head, high-voltage electronics, and battery. (D) Diagram showing the components of the mechanism used to steer the camera. (E and F) Close-up diagrams showing hinge motion as the camera pans right and left.

4) We also used this vision system to demonstrate the smallest, power-autonomous terrestrial robot with a camera (see Fig. 1B). This robot consists of a 1.6 cm–by–2 cm lightweight frame and two vibration motors and reduces the power required for locomotion (9.25 to 33.3 mW) to the same order of magnitude as its Bluetooth communication (16 mW). Our results demonstrate that to capture images across a larger field of view from the robot, moving the mechanical head is 26 to 84 times more energy efficient than moving the whole robot.

We present a wireless steerable vision system that can be carried by darkling beetles and insect-scale robots. Making this compatible with smaller insects like bumblebees and flies—with payload limits of 100 to 200 mg and 10 to 50 mg, respectively (19)—requires another order of magnitude reduction in both power and weight.

RESULTS

Low-power mechanically steerable head

Insects move their heads using complex systems of muscles, allowing for multiple degrees of freedom adapted for specific tasks (30). Replicating this with synthetic parts raises challenges in terms of the added weight, power, control, and manufacturing complexity required to create a system with many actuators. We observed, however, that well-studied insects such as Manduca sexta (hawkmoths) and Calliphora erythrocephala (blowflies) primarily move their heads along one axis with greater speed and range of motion (9, 10). Therefore, we adopted a simplified model of one degree of freedom motion.

Because of their small size, insects can carry limited payloads—hawkmoths have difficulty carrying more than 1 g (16). We therefore sought to minimize the total system weight and limit our design space to subgram components. Furthermore, inspired by our model insects, hawkmoths and blowflies, which move their heads 30° to 90° (9, 10), we set this as our target angular range for camera steering. Last, we could optimize for power without compromising angular range by selecting an actuator that can hold its angle without continuous power input. This would allow the camera to capture visual data at an angle without draining the battery.

Previous work has shown that piezoelectric actuators are excellent candidates in terms of weight, size, and power efficiency (31, 32). Potential alternatives at this scale include shape memory alloy actuators (33) or small electromagnetic coils (34), but both are inefficient due to thermal dissipation. Piezo actuators act as capacitors with low leakage, providing a storage element for charge and allowing them to hold their state. Theoretically, the energy required to steer a piezo is simply the energy needed to charge this capacitor. Because the capacitance is on the order of several nanofarads and the required piezo voltages are on the order of hundreds of volts, the energy consumption is less than 1 mJ. In addition, because of their ultralow leakage, piezos can stay at the steered angle for several seconds to minutes.
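As a concrete instance of this estimate (taking a representative capacitance of 5 nF and a drive voltage of 200 V, values within the ranges stated above, as an illustrative assumption), the energy to charge the actuator once is

E = \frac{1}{2} C V^2 = \frac{1}{2}\,(5\ \mathrm{nF})(200\ \mathrm{V})^2 = 0.1\ \mathrm{mJ},

well under the 1-mJ bound above.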

The actuator consists of a three-layer piezoelectric bimorph structure composed of lead zirconate titanate (PZT) and carbon fiber. The head is mounted within a lightweight carbon fiber frame and coupled to a hinge mechanism that translates the small deflection of the piezo to a visible angle change. The design is inspired by the mechanisms used in flapping-wing microrobots (31, 32); however, because the camera is much wider than the wings used in those systems, we inserted two carbon fiber rods between the hinge and the camera (Figs. 1, C to F, and 2A). Doing so also let us position the actuator lengthwise along the insect. To maximize the angle over which the actuator can steer the camera, we minimized the weight it has to move by mounting only the camera and lens at the tip and keeping other electronics in a fixed position below.

Fig. 2 Evaluating the wireless camera and arm performance.

(A) Labeled diagrams showing piezo actuator operation. (B) Low-voltage input current versus high-voltage output generated by the boost converter. (C) Sample 160 × 120 images showing the performance of the camera using a 1.5-mm-diameter 1-mm-focal-length lens (Edmund Optics 43394). (D) Sample 160 × 120 images showing the performance of the camera using a 3.8-mm-diameter 2.33-mm-focal-length lens (Panasonic EYLGUDM128). (E) Frame rate versus line-of-sight range from the insect. Error bars indicate mean ± 1 SD (n = 10 frames); the rate remains constant until the sensitivity limit of the wireless link, and then begins to degrade. (F) Battery life when continuously streaming 160 × 120 images at different rates, with and without robotic head motion, and different batteries. (G) Weight breakdown of the Bluetooth, camera, and robotic head, as well as each component’s percentage of total weight.

To generate the high-voltage signals (>200 V) required to drive the piezo actuator, we designed and fabricated a custom lightweight boost converter circuit (see fig. S2). The circuit topology is based on our previously reported design, which produced the required voltage for a flapping-wing aerial robot (32) but is optimized for the application of moving the camera to fixed angles using minimal energy. Figure 2B shows the input current from the low-voltage supply required to produce a given high-voltage output. We note that the current, and therefore the power consumption, of this circuit increases exponentially with the output voltage. To steer the head, we charge up the piezo capacitance to the desired voltage using the boost converter and immediately turn it off. Because of the leakage of the transistors and diodes connected to the boost converter output, the piezo capacitance discharges over time. To increase the discharge time, we used ultralow-leakage diodes and transistors. This enabled the piezo to hold its position without being actively charged. We identified the ON resistance of the switching transistor in the boost converter circuit as the factor limiting efficiency, and we used a low-resistance gallium nitride field-effect transistor to reduce the energy consumption at high-voltage outputs. The resulting circuit is able to move the head over an angle of 60° (see movie S1), which is within the range observed for live insects such as hawkmoths and blowflies (9, 10). The actuator can also move at angular velocities of over 1000°/s without damage, similar to a blowfly, which can move its neck at 1000°/s (15).

The time it takes for the piezo to return to its initial angle depends on the leakage of the components used and the total capacitance composed of piezo capacitance and external capacitance. To increase this time constant, we placed a large external capacitor at the output. Because this element could store more charge, it could hold the angle for a longer time; a large capacitor that extends the fall time, however, also increases the rise time, requiring longer to charge it up to the maximum voltage and therefore consuming more energy. For our application, we chose a capacitance of 660 pF to minimize energy while still taking over 1 min to return to zero degrees after being powered off. This capacitance could be increased for applications that require holding the angle longer; for example, a 100-nF capacitor increases the time for the angle to return to zero after powering off to more than 5 min.
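A first-order model captures this trade-off. Treating the combined leakage of the transistors, diodes, and piezo as an effective resistance R_leak (an assumption for illustration), the output voltage decays as

V(t) = V_0\, e^{-t/\tau}, \qquad \tau = R_{\mathrm{leak}}\,(C_{\mathrm{piezo}} + C_{\mathrm{ext}}),

so a larger external capacitance C_ext lengthens the hold time \tau, whereas the energy to charge the output, \frac{1}{2}(C_{\mathrm{piezo}} + C_{\mathrm{ext}})V^2, grows in proportion.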

Low-power wireless camera

When designing the wireless camera, we again considered the weight constraints discussed above to fit within the payload capacity of insects. This is challenging because whereas commodity image sensors used in smartphones are as small as 5 mm in width (2), the batteries needed to support them are on the order of 5 cm or more and weigh over 100 g. Small batteries have much lower capacity, which also has a nonlinear dependence on the amount of current they supply. For example, a subgram LiPo battery lasts only 5 min with a load of 50 mA but can last for over an hour with a load of 10 mA (35).

Existing wireless camera solutions require substantial amounts of power that would limit battery life. For example, Wi-Fi–based streaming cameras consume hundreds of milliwatts (36). Similarly, the small vision systems developed for wireless capsule endoscopy robots (21–23) use small image sensors that are 3.4 mm in width but still draw 34 to 50 mA of current for the image sensor alone (see table S1). When combined with wireless communication, this leads to a total current of 65 to 115 mA. Because endoscopes do not have weight constraints, they can use heavier, higher-capacity batteries. For example, the most recent untethered capsule endoscopy systems use larger 50-mAh batteries (Varta CP1254) weighing 1.6 g (22, 23), with which they achieve a 26-min battery life. Similarly, the commercial PillCam uses two Energizer 399 batteries weighing a total of 1.6 g (37). Both would only last 5 min on lightweight batteries.

Recent work has also used low-power backscatter to reduce the power consumption of wireless streaming cameras (26, 27). These systems, however, can only achieve limited wireless ranges, which would limit their usability on freely moving insects or robots. To allow for robust operation outdoors in the field, we therefore targeted a line-of-sight range of 100 m. The power and range limitations of existing cameras highlight the need to design a custom wireless camera solution. In order for this custom design to easily scale and adapt to a variety of different robotic applications, we use commercially available chips rather than custom-designed application-specific integrated circuits.

We took a clean-slate design approach and determined the minimum necessary components for a streaming camera system. We then systematically identified the smallest size and lowest power commercial off-the-shelf components that meet these requirements. We began by selecting an ultralow-power 320 pixel–by–320 pixel image sensor (Himax HM01B0) that measures less than 2.3 mm wide and weighs 6.7 mg (see table S1).

Our vision system will ride aboard small insects and robots; this introduces another requirement not found in biological systems: a bidirectional wireless communication link to record data from this image sensor and control the actuator. The primary design considerations for this component are an uplink bit rate of 1 to 2 Mbps for video streaming, a small form factor, and long range. Instead of designing a custom radio and communication protocol, we leveraged the Nordic NRF52832 Bluetooth 5.0 chipset, which supports up to 2 Mbps bit rates and is available in small (3 mm by 3 mm), highly integrated packages, weighing only 6.8 mg.

Building this system, however, requires addressing the following three systems challenges. First, it requires interfacing the Bluetooth chip (Nordic NRF52832), which has limited memory and computing resources, with the image sensor in a manner that allows it to operate at the maximum frame rate. Second, the system requires a lightweight lens and connector to the image sensor. Last, it requires a small lightweight 2.4-GHz antenna for the Bluetooth chip.

To avoid using external components like field-programmable gate arrays that add weight, we connected the camera directly to the Bluetooth chip (see fig. S3). Allowing the Bluetooth chip to read the image sensor data at the highest frame rate requires a direct memory access (DMA) feature. The DMA feature, however, is only exposed for certain common communication protocols, none of which are supported by the camera, which only outputs data and clock signals. To read the camera data, we repurposed the serial peripheral interface (SPI), which does have access to DMA. Reading data over SPI requires a data signal, a clock signal, and a chip select (CS) signal. We could directly use the camera’s data and clock outputs for the first two. To provide the missing CS signal required by the protocol, however, we leveraged the line-valid output of the camera. This signal was configured to trigger an interrupt on the Bluetooth chip, which toggles an output pin to spoof the missing CS signal.
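A minimal sketch of this trick, written against the Nordic nRF5 SDK GPIOTE driver, is shown below. It is not the authors’ firmware: the pin assignments are illustrative assumptions, and the SPI peripheral configuration is only outlined in comments.

```c
// Sketch: spoof the SPI chip-select from the camera's line-valid output.
// An interrupt fires on each edge of line-valid; the handler toggles a
// GPIO wired back to the SPI peripheral's CS input, so its DMA engine
// clocks in exactly one row of pixels per transaction.
#include "nrf_drv_gpiote.h"
#include "nrf_gpio.h"

#define PIN_LINE_VALID 11  // camera line-valid output (assumed pin)
#define PIN_FAKE_CS    12  // GPIO looped back to the SPI CS input (assumed pin)

// Assert the spoofed chip-select (active low) while line-valid is high.
static void line_valid_handler(nrf_drv_gpiote_pin_t pin,
                               nrf_gpiote_polarity_t action)
{
    if (nrf_gpio_pin_read(PIN_LINE_VALID)) {
        nrf_gpio_pin_clear(PIN_FAKE_CS);  // row start: CS low
    } else {
        nrf_gpio_pin_set(PIN_FAKE_CS);    // row end: CS high
    }
}

void camera_cs_spoof_init(void)
{
    nrf_gpio_cfg_output(PIN_FAKE_CS);
    nrf_gpio_pin_set(PIN_FAKE_CS);        // idle: CS deasserted

    nrf_drv_gpiote_init();

    // Interrupt on both edges of line-valid.
    nrf_drv_gpiote_in_config_t cfg = NRF_DRV_GPIOTE_IN_CONFIG_TOGGLE(true);
    nrf_drv_gpiote_in_init(PIN_LINE_VALID, &cfg, line_valid_handler);
    nrf_drv_gpiote_in_event_enable(PIN_LINE_VALID, true);

    // Elsewhere (not shown): the DMA-capable SPI peripheral is configured
    // with its clock and data inputs wired to the camera's pixel clock and
    // data outputs, and its CS input wired to PIN_FAKE_CS.
}
```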

The additional components required to complete the camera assembly are a lens and connector. Commercial modules that integrate all of these components do exist (38); however, they weigh 158 mg, and we therefore built a custom camera assembly weighing only 24 mg. One approach to building a custom ultralightweight lens is to use a pinhole design; however, the small aperture required restricts the system to very bright light, on the order of 100,000 lux (28). We instead used a plano-convex lens placed over this bare image sensor. To ensure correct lens alignment and distance from the image sensor, we constructed a fixed-aperture carbon fiber enclosure to house them and secured the parts together using cyanoacrylate glue (see Fig. 1C). We captured images using two lightweight lenses, a 1.5-mm-diameter 1-mm-focal-length lens (Edmund Optics 43394) weighing 4.8 mg (sample raw images in Fig. 2C) and a larger 3.8-mm-diameter 2.33-mm-focal-length lens (Panasonic EYLGUDM128) weighing 20.3 mg (sample raw images in Fig. 2D). The smaller lens is ideal for capturing images up close to the camera. The larger lens allows this camera to zoom in on large objects farther away, such as buildings or humans. Because many insects live in dark environments, we tested the camera and found that it could capture discernible images down to illuminance levels of 5 lux or below (see fig. S4 and movie S2).

In addition to a lens, we also designed a custom antenna to reduce weight. Although small lightweight chip antennas are available for use at 2.4 GHz, they often require large ground planes of 3 cm or more and clearance from other components to achieve good performance. Considering that the required clearance and ground plane exceed our target form factor dimensions, we instead designed a custom antenna using an ≈5-mm-long segment of 43-AWG wire. To achieve resonance with this short structure, we incorporated a three-turn helix at the base held in shape by cyanoacrylate adhesive. Using this antenna, we evaluated wireless communication range versus video frame rate by placing our small-form-factor Bluetooth radio at a fixed location at ground level in a grassy field and moving a receiver to increasing distances. The system was able to transmit uncompressed images up to 120 m to a Bluetooth chipset (Nordic nRF52832) connected to an 8-dBi patch antenna (L-com RE09P; see Fig. 2E) and 45 m to a smartphone (Samsung Galaxy S9; see fig. S5A). We implemented JPEG compression on the microcontroller but found that the time and energy required to compress the video exceeded those needed to stream the raw data.

Fully wireless operation requires a power source in addition to a radio link. We used small LiPo batteries to power the system because they can provide sufficient current for both the Bluetooth radio and the boost converter, which dominate the system’s power consumption. When continuously streaming 160 × 120 images with the boost converter off (camera at a 0° angle), the system achieved battery lifetimes of 130 min and 307 min at 5 and 1 fps, respectively (Fig. 2F). Turning on the boost converter to sweep the arm over 60° in discrete 15° steps for each frame at 1 fps resulted in a battery life of 270 min (Fig. 2F).

Video streaming from live insects

We demonstrated that the form factor of this wireless vision system is compatible with live insects. Cameras have been used to study a variety of larger animals, giving biologists insights into their behavior from a first-person view. This miniaturized camera system extends that capability to smaller animals like insects. In addition to the form factor, the wireless range of 100 m enables experiments in natural habitats outside the laboratory. Although previous work (16–18) can control insect motion and our recent work (19) equips small insects with sensors, none of these systems demonstrated streaming video from small insects.

The natural motion patterns of insects present an opportunity for additional power savings. For scenarios in which an insect may be asleep and inactive for long periods of time, we can put the camera system into a low-power sleep mode. We therefore included a low-power accelerometer in our design that detects when the insect moves, wakes up the system, and begins capturing images.

To evaluate whether the accelerometer-triggered operation is a viable strategy for power saving and to demonstrate that the system can be used with live insects in the field, we attached our camera, accelerometer, Bluetooth radio, and battery (PGEB201212C) to the thorax of a death-feigning beetle (Asbolus laevis; Fig. 3A). We chose species of darkling beetles such as this and the Pinacate beetle (Eleodes nigrina; Movie 3) due to availability and because previous work has shown that they can carry the required payload (18). We note that other species like moths and spiders could also be used as long as they can carry the camera system payload. On an overcast day, beginning at dusk and extending into the night under low-light conditions, we allowed the beetle to walk freely in different outdoor environments, including an overgrown parking lot, a fallen log in a grove of trees, a dry stream bed, and a gravel road (Fig. 3B). The system was programmed to wake up from a low-power sleep mode and transmit five images to a smartphone receiver each time the accelerometer detected motion (Fig. 3C). After operating for about 1 hour in sleep mode during experimental setup, the system ran for an additional 363 min in its accelerometer-triggered mode. The number of accelerometer triggers recorded on the smartphone, which indicates the activity of the insect, is plotted versus time in Fig. 3D. In addition, the top 25 intervals between movements, showing periods of inactivity totaling 189 min, are plotted in Fig. 3E; these explain the battery life improvement compared with the continuous streaming results shown earlier in Fig. 2F.

Fig. 3 Field evaluation of accelerometer-triggered camera on a death-feigning beetle.

(A) Close-up image of the wireless camera without a microrobotic arm on the back of a death-feigning beetle. (B) Experiment site with a grove of trees, a dry stream bed, and gravel paths. The beetle walked freely in four different locations. (C) Images from the camera showing a person walking. (D) Beetle motions detected by the accelerometer per minute over the 363-min experiment. The inactivity explains the improvement in battery life over continuous streaming. (E) Top 25 intervals between accelerometer triggers when the system is in sleep mode.

Movie 3. Sample real-time video streams.

The smartphone sets the resolution and starts streaming video. The inset video shows a reference view of the scene (a person walking).

We similarly evaluated the full system, including a scenario where the robotic arm pans the camera horizontally over a roughly 60° range. We attached the full system with the battery (PGEB201212C) to the abdomen of a Pinacate beetle (E. nigrina, 1.13 g; Fig. 4A and movie S3) rather than its thorax because this is the largest flat region of the body for this species. We then placed the beetle on the ground and let it wander freely in a large gravel parking lot until it reached an edge, at which point it was placed back in the center (Fig. 4B). The system was programmed to activate upon an accelerometer trigger and capture a sequence of five images (see movie S4). The robotic arm was set to move in discrete steps of 15° per image. The actuator moved to the next angle and captured an image within 690 ms, achieving a panoramic view every 3.45 s. This delay between frames included the total time required to move the camera, read the image data, and transmit the image over Bluetooth, plus a conservative margin to account for the higher latency observed at longer ranges with low Bluetooth signal strength. Figure 4C shows a sample panoramic view; the individual images were combined offline using standard panorama stitching algorithms. Unlike the death-feigning beetle in the previous experiment, this beetle was active almost continuously, as illustrated in the plot of accelerometer activity in Fig. 4 (D and E).
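A minimal sketch of this capture sequence is shown below. The 15° step, five frames, and 690-ms per-frame budget come from the text; the helper functions are hypothetical names standing in for the actuator, camera, and radio routines, not the authors’ firmware.

```c
// Sketch: accelerometer-triggered panorama, five frames over a 60-degree pan.
extern void head_set_angle_deg(int degrees); // charge piezo via boost converter
extern void camera_capture_frame(void);      // read one 160x120 frame
extern void ble_transmit_frame(void);        // stream the frame over Bluetooth
extern void delay_ms(unsigned int ms);

#define N_FRAMES       5
#define STEP_DEG       15
#define STEP_BUDGET_MS 690  // move + capture + transmit, with link-latency margin

void capture_panorama(void)
{
    for (int i = 0; i < N_FRAMES; i++) {
        head_set_angle_deg(i * STEP_DEG); // 0, 15, 30, 45, 60 degrees
        camera_capture_frame();
        ble_transmit_frame();
        delay_ms(STEP_BUDGET_MS);         // simplification: real firmware would
                                          // wait only the remaining budget
    }                                     // 5 steps x 690 ms = 3.45 s panorama
    head_set_angle_deg(0);                // power off; head returns toward 0 deg
}
```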

Fig. 4 Field evaluation of microrobotic arm on darkling beetle capturing panorama images.

(A) Wireless camera system with the microrobotic arm attached to a darkling beetle. The camera was set to capture five images as it rotated in 15° steps to capture a panorama. (B) Aerial view of the experiment site in a gravel parking lot showing buses and trucks in the southwest corner. (C) Panorama showing the trucks and buses composed of five images captured by the insect-mounted camera while rotating 60°. (D) Beetle motions detected by the accelerometer per minute. (E) Top 25 intervals of beetle inactivity.

We observed in both of these experiments that the insect’s motion did not cause much image distortion. This can be seen in movie S5, in which the beetle walked on a flat desk surface and the images followed the gait of the insect. In addition, as seen in movie S6, even when the beetle was traversing uneven terrain with rocks approximately its own height, the objects in the individual frames were still identifiable. These evaluations also indicate that the attached payload did not substantially affect the insect’s ability to navigate complex terrain.

Video streaming from insect-scale robots

The ultralightweight form factor of our steerable wireless camera can enable numerous applications in robotics. By reducing the size of the camera system, we can enable the development of more complex centimeter-scale robots that support vision. In addition, minimizing the weight of our steerable vision system reduces the robot’s payload requirements, resulting in longer battery life or faster motion.

We designed a small, power-autonomous terrestrial robot to demonstrate the potential applications of our vision system. Table 1 shows previous power-autonomous insect-scale robots (1, 25, 39–42), none of which support wireless vision, and all of which are larger than our robot. To design our insect-scale robot, we began by examining the options for achieving locomotion at this scale. One option is to use piezo actuators similar to the one used to move our camera. Although piezos are attractive due to their low weight, operating our boost converter to produce a continuous high-voltage sinusoidal waveform consumes over 150 mW. This high power draw forces compromises on battery size (because many small batteries do not support high peak currents), battery lifetime (due to the limits of battery energy density), and speed (because the robot must carry a heavier battery payload). Alternative actuators, such as eccentric rotating mass vibration motors, have been used to achieve locomotion at this scale (40, 43, 44) and do not incur the inefficiencies of producing a high-voltage output. Although these motors weigh much more than piezos (≈1 g), we observed that, after our power optimizations, they can produce enough force for forward locomotion using tens of milliwatts, as opposed to the hundreds of milliwatts consumed by state-of-the-art piezo systems (41).

Table 1 Comparison with previous power-autonomous robots.

RF, radio frequency. “-X-” indicates “not specified.”


We designed an insect-scale robot using two vertically oriented eccentric rotating mass vibration motors (Seeed Technology 1027) as shown in Fig. 5 (A and B). The motors are held 13 mm apart in a chassis constructed using two parallel sheets of 254-μm-thick fiberglass-epoxy laminate (FR4) and three 7-mm-long metal legs. These dimensions were chosen such that they are the minimum required to accommodate the 12-mm-wide lithium polymer battery mounted between the FR4 layers. All of the electronics, including Bluetooth, the boost converter, and voltage regulators, were mounted on the top FR4 layer. The piezo actuator and camera assembly were mounted above the electronics using carbon fiber rods. The robot chassis measures 1.6 cm in length and 2 cm in width.

Fig. 5 Evaluation of the insect-scale wireless robot.

(A and B) Close-up images of the robot with vibration motors, three legs, a battery, Bluetooth chip, camera, and robotic head. (C) Effect of payload beyond the camera system and the battery on robot speed while running the vibration motors at constant power. Error bars indicate mean ± 1 SD (n = 5 trials). (D) Robot power consumption for different motion types and speeds. (E) Energy consumption for different angles when using the motors versus the head to turn the camera. (F) Battery life for different robot speeds without video streaming, with 1 fps video streaming, and with 1 fps video streaming while panning the camera.

The robot can move at speeds up to 3.5 cm/s (2.2 body lengths per second) with no additional payload (see movie S7). Figure 5C shows the effect of running the vibration motors at a constant power while carrying additional payload beyond the steerable wireless camera system and the battery. We note that previous vibration motor–based designs used heavier and larger coin cell batteries and achieved a speed of 1 cm/s (0.33 body lengths per second) (40). Thus, in comparison with previous work, our design achieves a 6.67× speedup when normalized to body length.

Our design also uses additional techniques to further decrease the robot’s power consumption (see the Supplementary Materials). Figure 5D shows the results of these strategies, such as the power required for motion after reducing the motor power and running only one motor at a time. This optimization in the robot power raises the question of whether the piezo-based head mechanism described above is still a more efficient mode of adjusting the camera’s field of view. Although the head mechanism presents the benefit of the ability to move the camera independently of the robot, Fig. 5E demonstrates that the piezo mechanism requires less total energy. Specifically, it takes between 26 and 84 times less energy to rotate the head than to rotate the robot body by the same angle. Because the piezo actuates in a few milliseconds and holds its angle without continuous power input, moving it requires far less energy than rotating the whole robot. The difference in the energy savings across angles is because of the nonlinear behavior of the boost converter, which takes longer to move the piezo mechanism to larger angles.
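A rough model makes the comparison concrete (here \eta, the boost-converter efficiency, and t_turn, the time the motors must run to turn the body through the same angle, are illustrative symbols rather than measured values):

E_{\mathrm{head}} \approx \frac{1}{2\eta}\,(C_{\mathrm{piezo}} + C_{\mathrm{ext}})\,V^2, \qquad E_{\mathrm{body}} \approx P_{\mathrm{motors}}\, t_{\mathrm{turn}};

sub-millijoule charging on one side versus tens of milliwatts sustained over a body turn on the other is consistent with the measured 26- to 84-fold gap.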

The battery life of the whole robotic system is shown in Fig. 5F. Components are added cumulatively to show the battery life reduction from each additional component. In these experiments, the robot moves continuously, the camera streams at 1 fps, and the arm moves 30° once per second. We observe that at the lower speed of 0.5 cm/s, the power draws of the robot and the camera are of the same order of magnitude, resulting in a noticeable difference in battery life. In contrast, at its maximum speed, the robot requires 10 times more power to move its body than to capture images at 1 fps. We also note that in all cases, operating the piezo actuator requires minimal energy and therefore has little impact on battery life.

Running the streaming camera while the robot is in motion, however, introduces the additional challenge of addressing vibration. We primarily observe oscillation about the horizontal axis of the image due to the hinge of the head mechanism. The hinge and connectors were designed to provide minimal resistance to the piezo to achieve a large angle. The result of this low stiffness, however, is that the vibration of the robot induces oscillation. Because the camera uses a rolling shutter, if the sampling time is not much faster than the period of the vibrations caused when the robot moves, images appear distorted (see fig. S6). One approach to minimizing distortion is to reduce the image resolution and thereby increase the frame rate. This, however, compromises image quality. A second approach would be to use the piezo mechanism to stabilize the image. This, however, would add complexity because it requires a feedback mechanism, additional computation, and potentially more power. We implemented an alternative solution of duty cycling the robot motion, effectively trading off speed to maintain image quality, as sketched below. To achieve an effective frame rate of 1 fps, we set a timer to run the robot for 640 ms and trigger image capture after a pause of a few hundred milliseconds to allow any remaining vibrations to settle. This results in a trade-off between speed and frame rate. At 1 fps, the average speed of the robot is reduced to 2.24 cm/s due to the intermittent pauses, resulting in a battery life of 93 min. In contrast, a lower frame rate of 0.5 fps allows for an average speed of 2.87 cm/s and a battery life of 81 min. Despite the reduction from the peak speed of 3.5 cm/s, these speeds are still faster than the 1 cm/s achieved by previous vibration motor designs (40).
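A minimal sketch of this duty-cycling scheme follows. The 640-ms motor burst is from the text; the settle time and helper function names are illustrative assumptions.

```c
// Sketch: drive-then-capture duty cycle for vibration-free rolling-shutter
// frames at an effective 1 fps.
extern void motors_on(void);
extern void motors_off(void);
extern void camera_capture_frame(void);
extern void ble_transmit_frame(void);
extern void delay_ms(unsigned int ms);

#define RUN_MS    640  // motor-on burst per cycle (from the text)
#define SETTLE_MS 250  // assumed "few hundred ms" pause for vibrations to die out

void drive_and_stream_cycle(void)  // one ~1 s cycle -> effective 1 fps
{
    motors_on();
    delay_ms(RUN_MS);
    motors_off();
    delay_ms(SETTLE_MS);           // let the low-stiffness hinge stop oscillating
    camera_capture_frame();        // expose while the platform is still
    ble_transmit_frame();
}
```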

Fig. 6 (A and B) and Movie 4 demonstrate the potential for using the robot’s onboard vision capability for navigation. The robot is placed on a desk with obstacles and teleoperated by a human who watches the live video stream on a smartphone and sends navigation commands back to the robot in real time. Figure 6A shows an overhead view illustrating the path taken by the robot, and Fig. 6B shows sample images from the robot’s camera as it travels.

Fig. 6 Navigation and focusing on another moving robot from the insect-scale robot.

(A) Overhead view of the robot’s path when a human operator uses the camera to navigate. (B) Sample images from the robot’s camera during navigation. (C) Side view showing that our wireless robot is stationary with the camera positioned to look at another wired robot marked with an arrow moving to the left. The background shows numbers incrementing from right to left to indicate camera motion. (D) A top view of the scene shows the moving robot going across the field of view of the camera. The camera is rotated in discrete steps to maintain focus on the moving robot. (E) Images captured by the robot at each angle.

Movie 4. Navigation of an insect-scale robot using wireless vision.

An insect-scale robot uses its camera to navigate around obstacles. The camera streams video to a smartphone, allowing a human to steer the robot.

Figure 6 (C to E) and Movie 5 demonstrate an additional application of using the piezo-driven head to focus on a moving object. In this video, the robot itself remains in a fixed position, but another similar-sized robot moves in a straight line in front of it. As the moving robot exits the field of view of the camera, a command is sent by a human operator to move the piezo to the left to track the moving robot. The background of the image includes a series of numbers that increment from right to left to provide an additional reference showing that the camera has moved. This shows the feasibility of using our steerable wireless camera to maintain focus on moving objects and robots, without the need to move the whole robot body.

Movie 5. Focusing on moving objects

A stationary robot streams video of another robot moving across its field of view. A human operator steers the camera to maintain focus on the moving robot.

DISCUSSION

Here, we describe a fully wireless mechanically steerable vision system in a form factor compatible with small robots and live insects. Our design draws parallels to biological insects in multiple ways; it uses a commercial image sensor with over 100,000 pixels in a 2 mm–by–2 mm area, compared with the 4000 to 5000 imaging elements in a blowfly (45). In addition, like many insects that have limited color perception (46), we used a monochrome camera to reduce communication bandwidth and optimize for power consumption. Many insect vision systems have a flicker fusion frequency of 200 Hz or less (47). Although our current frame rate is limited by the data rate of our wireless link, our microcontroller can read an 80 × 60 image from the camera at 200 Hz. This opens up the possibility of using the vision data on-device for control. Insects of the order Diptera (flies) use between 1 and 13% of their mass for vision, including the cornea, retina, and optic lobe of the brain (6). In a blowfly, for example, the visual system accounts for 2.5% of the animal’s total mass. For comparison, the image sensor and Bluetooth-enabled microcontroller account for 4.8% of the total mass of our robot. Our wireless video transmission could also provide precise position feedback, for example, by combining it with accelerometer data to perform vision-based simultaneous localization and mapping (SLAM). Although we demonstrate integration on a vibration motor–based robot, given its low mass, our steerable camera system could also be integrated on other insect-scale robots (41). Given the low weight and power requirements of our Bluetooth video streaming subsystem, it could also enable wireless video streaming for aerial microrobots, which currently lack this capability (31, 32). Furthermore, this work complements efforts to steer the gross muscle motions of insects through neural input (16–18): It adds the capability of controlled actuation along application-specific degrees of freedom. In addition to steering a camera for a wider field of view, incorporating actuators could enable a much richer interaction with the environment.

The primary bottlenecks to further scaling down our vision system are the energy density of batteries and power consumption. One approach to building a system small enough to fit on smaller insects like bumblebees would require an improved battery technology with higher energy density; however, this could also be achieved by reducing the power required for the vision system by an order of magnitude. The addition of a lightweight solar cell to supplement the battery could also make this feasible. Scaling down further to the weight that insects like blowflies could carry would require more substantial improvement of all system components. At this scale, even the camera and lens weight of 24 mg become important. The weight of the optics could potentially be reduced using emerging metasurface technologies (48). In addition, the size of the image sensor itself could be reduced by developing a custom image sensor with smaller pixel size or lower resolution. Reducing resolution would also be advantageous from a power perspective because this reduces the time the radio transmits. The steering mechanism can be scaled down by further optimizing the high-voltage piezo driver. This requires scaling down the weight of the inductor and its power consumption, which is challenging. Another approach is to explore improved actuation mechanisms that could operate at low voltage.

Vision serves as an important sensory input for both animals and robotic systems. Whereas vision has traditionally been reserved for physically larger robots or animal monitoring systems, we built a steerable wireless vision system that is integrated on live insects and insect-scale terrestrial robots. Furthermore, imitating nature, we demonstrated the ability to move the visual system of insect-scale robots independent of their body, allowing them to gather more visual information in an energy-efficient manner. Given the importance of vision for navigation and communication, this work represents an important milestone in realizing autonomy for insect-scale robotics.

MATERIALS AND METHODS

Powering the system

The camera and Bluetooth require an average of 6 mA when streaming at 5 fps and transmitting at 0 dBm. However, the vibration motors require an average current of 9 mA when the robot is moving at its maximum speed of 3.5 cm/s. This current is the primary constraint when selecting a small battery because many are designed for low-drain operation (100 μA). For example, a 1-mAh lithium manganese battery (Seiko Instruments MS412FE-FL26E, 70 mg) can only run the camera for 30 s and cannot start the motors on the robot. To operate the full system continuously for over an hour, we selected one of two rechargeable LiPo batteries (PowerStream GM300910 or PGEB201212) that supported our current requirements and had capacities of 10 to 12 mAh (see Fig. 2F for comparison).

Accelerometer-triggered operation

The system can operate in a low-power mode that maintains an active Bluetooth connection but turns off the camera and other peripherals, reducing power consumption by 89% compared with video streaming. We leveraged this mode to achieve longer battery life by using an accelerometer to trigger image capture only at times of interest, such as insect motion. The onboard three-axis accelerometer (mCube MC6470) consumes <300 μW and occupies only 2 mm by 2 mm. The accelerometer sends data over an inter-integrated circuit (I2C) bus and provides interrupt functionality with programmable thresholds to detect motion, which can be used to trigger the Bluetooth chip to capture images, as sketched below. This accelerometer also includes a magnetometer that could provide additional information about the orientation of the insect.
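The sketch below outlines the trigger flow on the nRF52, assuming the MC6470’s interrupt output is wired to a GPIO; the pin number, burst behavior, and helper functions are illustrative assumptions, not the authors’ firmware or datasheet values.

```c
// Sketch: accelerometer-triggered wake-up. A motion interrupt from the
// MC6470 wakes the Bluetooth chip, which powers the camera, streams a
// burst of five frames (as in the field experiments), and sleeps again.
#include <stdbool.h>
#include "nrf_drv_gpiote.h"

extern void camera_power_on(void);
extern void camera_power_off(void);
extern void camera_capture_frame(void);
extern void ble_transmit_frame(void);

#define PIN_ACCEL_INT 25  // accelerometer interrupt line (assumed pin)
#define BURST_FRAMES  5   // frames per motion trigger (from the text)

static volatile bool motion_detected = false;

static void accel_int_handler(nrf_drv_gpiote_pin_t pin,
                              nrf_gpiote_polarity_t action)
{
    motion_detected = true;  // defer the work out of interrupt context
}

void accel_trigger_loop(void)
{
    nrf_drv_gpiote_init();
    nrf_drv_gpiote_in_config_t cfg = NRF_DRV_GPIOTE_IN_CONFIG_LOTOHI(true);
    nrf_drv_gpiote_in_init(PIN_ACCEL_INT, &cfg, accel_int_handler);
    nrf_drv_gpiote_in_event_enable(PIN_ACCEL_INT, true);

    for (;;) {
        if (motion_detected) {
            motion_detected = false;
            camera_power_on();
            for (int i = 0; i < BURST_FRAMES; i++) {
                camera_capture_frame();
                ble_transmit_frame();
            }
            camera_power_off();  // back to the low-power sleep mode
        }
        __WFE();                 // sleep until the next event/interrupt
    }
}
```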

SUPPLEMENTARY MATERIALS

robotics.sciencemag.org/cgi/content/full/5/44/eabb0839/DC1

Text

Fig. S1. Smartphone interface.

Fig. S2. Boost converter circuit schematic and drive signal waveforms.

Fig. S3. Full system block diagram.

Fig. S4. Sample images comparing lenses.

Fig. S5. Frame rate versus distance and resolution using Bluetooth.

Fig. S6. Distortion during motion.

Table S1. Comparison table for image sensors.

Table S2. Frame rate versus range at 1 Mbps.

Table S3. Frame rate versus range at 2 Mbps.

Table S4. Robot speed versus increasing weight.

Movie S1. Camera motion angle measurement.

Movie S2. Automatic light level adjustment.

Movie S3. Smartphone-controlled camera steering.

Movie S4. Capturing a panorama.

Movie S5. Video from beetle walking on flat surface.

Movie S6. Free walking beetle in the field.

Movie S7. Robot speed when carrying payloads.

REFERENCES AND NOTES

Acknowledgments: We thank A. Straw, D. Fox, T. Daniel, M. Katanbaf, and S. Kaplan for their feedback. Funding: The researchers are funded by a Microsoft fellowship and NSF. Author contributions: V.I. fabricated and assembled the hardware; handled and cared for the insects; developed Android software; and designed, built, and characterized the robot; A.N. developed the embedded software; V.I. and A.N. conducted the experiments, characterized system performance, and optimized robot power consumption; A.N. and V.I. designed the circuit and drive waveforms with input from J.J.; A.N., V.I., and J.J. generated figures; V.I., A.N., S.G., and S.F. designed the experiments; V.I., A.N., and S.G. wrote the manuscript; S.G. and S.F. edited the manuscript. Conceptualization: S.G. and V.I. Competing interests: S.G. is a cofounder of Jeeva Wireless, Edus Health, and Sound Life Sciences. Data and material availability: All data needed to evaluate the conclusions of the paper are available in the paper or the Supplementary Materials. Source code is available at https://github.com/uw-x/insect-robot-cam.