Eavesdropping on Nature
DIY Bioacoustics is a project focused on the fruitful entanglement of design, science, sound and the public sphere. Our goals are to advance both design and science by “thinking about the future of science in the context of design–as well as design in the context of science” (Ito, 2016) and to prototype the process in accordance with open-source and DIY methodologies. We are developing an open-source, DIY sensor/service for biologists that uses sound recordings to identify and track different species of leafhoppers through their distinct calls, in order to monitor crop health. The sensor could also be utilised by citizen scientists, farmers and visual artists/computational designers.
Project Summary and Scientific Basis
We will develop an open-source, DIY sensor/service for biologists that uses sound recordings to identify and track different species of leafhoppers, allowing crop health to be monitored remotely. The sensor/service could also be utilised by citizen scientists, farmers and visual artists/computational designers.

Non-invasive bioacoustic monitoring has become an increasingly effective way of assessing ecosystem diversity and health. Bioacoustics paired with machine learning has been cited as an effective way of automatically identifying animals such as frogs (Xie, 2017), birds (Zhao et al., 2017) and fish (Sattar et al., 2016), among others. Bioacoustics is an area of scientific research which would benefit from (i) continued expansion of machine learning and automated identification of insect species and (ii) the creation of open-source hardware for conducting research. Our aim is to contribute to (i) by applying bioacoustics and machine learning to insect recognition, and to (ii) by creating an open-source, DIY and hackable acoustic sensor for identifying various insect species.

Recent work on insect recognition and intelligent traps appears in (Silva et al., 2014), and on methods of creating low-cost sensors for insect recognition in (Silva et al., 2015). Problems with ambient noise in traditional acoustic recording are identified in (Chen et al., 2014), which shows that low-cost optical sensors provide higher accuracy and greater data capacity than previous methods. We hope to build on this work by creating an innovative open-source model and associated hardware for conducting research into insect ecosystems. Insects are vectors of disease while also pollinating a large proportion of the world’s food production. Further to this, they constitute a growing food market expected to be worth $55 billion by 2023 (Global Market Insights, 2016).
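To make the identification idea concrete, the following is a minimal sketch of acoustic species classification, assuming (hypothetically) that each species' call has a characteristic dominant frequency. Real pipelines, including those cited above, use far richer features (e.g. MFCCs, spectrogram statistics) and trained classifiers; the species names, frequencies and sample rate here are illustrative assumptions, not measured leafhopper data.

```python
import numpy as np

SR = 8000  # sample rate in Hz; an illustrative choice, not a real recording setup


def dominant_freq(signal, sr=SR):
    """Return the dominant frequency (Hz) of a mono signal via an FFT."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sr)
    return freqs[np.argmax(spectrum)]


# Hypothetical per-species call templates: each "species" is represented
# by a single characteristic frequency (a stand-in for learned features).
SPECIES_FREQS = {"species_a": 440.0, "species_b": 880.0}


def classify_call(signal, sr=SR):
    """Assign a call to the species whose template frequency is nearest."""
    f = dominant_freq(signal, sr)
    return min(SPECIES_FREQS, key=lambda s: abs(SPECIES_FREQS[s] - f))


# Synthetic test call: a one-second tone near species_b's template frequency.
t = np.linspace(0, 1, SR, endpoint=False)
call = np.sin(2 * np.pi * 875.0 * t)
print(classify_call(call))  # → species_b
```

In a deployed sensor, the synthetic tone would be replaced by windowed audio from the microphone, and the nearest-template step by a model trained on labelled recordings.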
Our aim is to contribute to ongoing research into insect recognition and ecosystems by applying bioacoustics and machine learning to insect recognition, and to the democratisation of scientific research by creating an open-source, DIY and hackable acoustic sensor for identifying various insect species. The final sensor would be non-invasive, weatherproof and wireless, and would link to a dashboard visually displaying the patterns and statistics gathered, which could eventually be connected to an open data structure such as Wikidata for sharing information about environments throughout the world. In the first phase of development we will use kits already available on the market for rapid prototyping and proof of concept. The second step will be creating a custom PCB to fit all of the necessary components and circuitry. On the software side, we will rely on open-source projects such as SuperCollider. We hope that our research will contribute to the use of bioacoustics in the study of insect ecosystems while also furthering the democratisation of science.
Collaboration with Filippo Sanzeni, Davin Browner-Conaty, Alice Potts and the John Innes Centre
Project selected and funded by the Biomaker Challenge 2017, Department of Plant Sciences, University of Cambridge
Exhibition: Open Technology Week (Biomaker Fayre) @ Cambridge, October 2017
Fall 2017 @ Royal College of Art and University of Cambridge
Copyright © 2012-2018 by MINWOO KIM. All rights reserved.