Shazam4Bats (aka Echo Box)

Megan Zimroth

University College London, Centre for Advanced Spatial Analysis (CASA)

AWARD AMOUNT

2000 USD

The Connected Environments Lab is a research group within the Centre for Advanced Spatial Analysis (CASA) at University College London. The lab studies the science of cities and how data can help us understand our complex environment, focusing on the research challenges around the infrastructure required to connect built and natural environments from an end-to-end perspective.

The “Shazam4Bats” project was a technical collaboration between CASA and the Bat Conservation Trust intended to improve the technical capability of the first prototype of the Echo Box bat monitor, developed in 2017 by Intel and UCL (Gallacher, Sarah, et al. “Shazam for bats: Internet of Things for continuous real-time biodiversity monitoring.” IET Smart Cities 3.3 (2021): 171–183), and to establish it as an open-source hardware device.

Inside each original Echo Box was an Intel Edison with an Arduino breakout, plus a Dodotronic Ultramic 192K microphone. To capture, process, and identify bat calls, each Echo Box performs the following four steps (text adapted from the Echo Box project web page):

  • First – a microphone on each device, capable of handling ultrasonic frequencies, captures all audio from the environment up to 96kHz. Most bat calls occur at frequencies above 20kHz (the limit of human hearing), with some species going as high as 125kHz (although none of these species are found in the park).
  • Second – every 6 seconds, a 3 second sample of audio is recorded and stored as a sound file. This means that audio from the environment is captured as 3 second snapshots at a consistent sample rate across all smart bat monitors.
  • Third – the recorded audio is then turned into a spectrogram image using a method called Fast Fourier Transform. The spectrogram image shows the amplitude of sounds across the different frequencies over time. Bat calls can clearly be seen on the spectrogram as bright patterns (indicating a loud noise) at high frequencies.
  • Finally – image-processing techniques based on convolutional neural networks (CNNs) are applied to the spectrogram images to look for patterns that resemble bat calls. If any suspected bat calls are found in an image, the team is working towards applying the same CNN techniques again to each individual call to examine its shape in more detail and determine which species of bat it most likely came from.
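The middle of this pipeline (steps 2–4) can be sketched in a few lines of Python. The sketch below is illustrative only, not the Echo Box's actual code: it synthesises a 3-second clip with a fake 45 kHz "bat call" burst in place of a real microphone recording, builds a magnitude spectrogram with a short-time FFT, and flags frames with strong energy above 20 kHz as a crude stand-in for the CNN detector. All parameter values (FFT size, hop, thresholds) are assumptions for the example.

```python
import numpy as np

SAMPLE_RATE = 192_000   # Hz, matching the Ultramic 192K (Nyquist limit = 96 kHz)
CLIP_SECONDS = 3        # each Echo Box snapshot is 3 seconds long
FFT_SIZE = 1024         # STFT window length (assumed for this sketch)
HOP = 512               # hop between successive windows (assumed)

def make_test_clip(call_freq_hz=45_000):
    """Synthesise a 3 s noise clip with a 10 ms ultrasonic tone burst
    (an illustrative stand-in for a real microphone recording)."""
    n = SAMPLE_RATE * CLIP_SECONDS
    rng = np.random.default_rng(0)
    audio = 0.01 * rng.standard_normal(n)
    t = np.arange(SAMPLE_RATE // 100) / SAMPLE_RATE      # 10 ms burst
    burst = 0.5 * np.sin(2 * np.pi * call_freq_hz * t)
    audio[n // 2 : n // 2 + burst.size] += burst
    return audio

def spectrogram(audio):
    """Magnitude spectrogram via short-time FFT (step 3)."""
    window = np.hanning(FFT_SIZE)
    frames = [audio[i : i + FFT_SIZE] * window
              for i in range(0, audio.size - FFT_SIZE, HOP)]
    spec = np.abs(np.fft.rfft(np.stack(frames), axis=1))  # shape: (time, freq)
    freqs = np.fft.rfftfreq(FFT_SIZE, d=1 / SAMPLE_RATE)
    return spec, freqs

def detect_ultrasonic_activity(spec, freqs, threshold_db=20.0):
    """Flag frames whose peak energy above 20 kHz rises well over the
    average level (a crude stand-in for the CNN detector in step 4)."""
    band = spec[:, freqs > 20_000]
    level_db = 20 * np.log10(band.max(axis=1) / (band.mean() + 1e-12))
    return level_db > threshold_db

clip = make_test_clip()
spec, freqs = spectrogram(clip)
hits = detect_ultrasonic_activity(spec, freqs)
print(f"suspected call frames: {hits.sum()} of {hits.size}")
```

In the real device the bright spectrogram regions would be passed to the CNN for species classification rather than thresholded directly; the threshold here only illustrates why bat calls stand out as loud, high-frequency patterns against the background.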

This Phase 1 Collaborative Development Project originally aimed to:

  1. Redesign the bat monitor to use Raspberry Pi, which involved developing a software application to run a bat classification model on a single-board computer.
  2. Implement an AudioMoth USB to provide a lower-cost and open hardware device to capture the soundscape (in addition to the Dodotronic microphone).
  3. Extend data transmission capability to LoRa.
  4. Document the hardware components and firmware, and draft the operation manual.
  5. Improve the CNN detection algorithm so that it not only identifies bat calls but also bird species. Test the accuracy of the deep learning algorithm when deployed on the device. The expansion to bird species was later dropped as a goal.
  6. Deploy both the software and hardware outdoors to record, detect, and classify bat species.

The funds provided by GOSH have gone a long way toward bringing Shazam4Bats into the open-source ecosystem, although the project has changed direction since its inception. As the project progressed towards version 1, it became clear that developing a flexible, modular software application running on readily available hardware such as the Raspberry Pi was better suited to the project’s short-term goal of an open-source tool for bioacoustic monitoring than switching directly to an open hardware platform.

As a result, the Shazam4Bats project focused on developing a software application called Acoupi and testing its implementation on a Raspberry Pi to record and detect bat calls and classify bat species.

In Summer 2023, as part of an MSc project, the device was deployed for roughly 10 hours in total, approximately 2 hours at each of five locations in England and Wales, to detect as many bat species as possible. In total, 11 species were recorded during deployment, and the open-source model’s detections were concluded to be precise.

The report highlighted several improvements and recommendations, including improving the power supply, calibrating the sensors, connecting to a stable data-transmission network, and redesigning the enclosure box to be more robust and compliant with an International Protection (IP) rating.

In the future, the project team plans to improve the model through further collaboration with GOSH and other communities.

Reference Number: CDF-100
