Here's another look into what we've been up to recently.
It's been a busy week for the Cacophony Project, with good progress on many fronts.
As noted earlier, the project now has a new lead developer & project manager (me!) and I've spent much of the week getting up to speed. I've been reviewing the current state of our hardware and software efforts, meeting various advisors and contributors, and getting a grasp of our immediate and longer term goals. It's been fantastic to meet everyone and I'm really excited to be onboard. There's a real energy around the project.
My name is Menno Finlay-Smits and I start today as the Lead Developer and Project Manager for the Cacophony Project. We've got a real chance at solving the problem of invasive predators in New Zealand and I'm really excited to be part of the effort.
July update - now that's sounding like I might do regular updates...
What have I been doing?
The NEXT Foundation and ZIP have recently invested in the Cacophony Project, and this will enable us to accelerate our development plans.
As part of the NEXT investment they also put together a short video that very nicely explains what we are up to. They did a great job with this and it has proved very helpful in explaining concisely what we do.
The latest innovation we are testing is a camera that can pan and tilt to follow predators. This has two main benefits. First, we can see much more than with a static camera. Second, and more important in the long term, it will allow us to implement a kill mechanism that does not require the predator to enter a trap.
Video 1: First prototype of tracking camera
Below are a few observations from our experiments with live capture traps. As noted in our previous blog post, they seem to work better than other traps we have tried. We know they are not the long-term solution, but we need some devices that are more reliable at taking out predators while we trial different audio lures. We don't want predators to learn to ignore the audio lures that we test.
Below is a video showing a few interesting things about the live capture. We explain these in more detail below.
Some traps seem 10 times more effective than others (and other interesting things we have learned from our new camera tool)
The main purpose of our camera and trapping work over the last four weeks has been to test the new camera setup and capture as many videos of predators as possible. We can then tag these videos so that artificial intelligence can learn to distinguish between predator species.
We have made some improvements to the video capture tool we have been developing to make it better able to identify all predators in the New Zealand environment. We have been testing our device alongside a standard commercial trail camera. These are typically designed for hunting deer, pigs, and similar animals, so they are not tuned for smaller mammals (unless very close). The device we are using has the following advantages, described in more detail below:
I once heard that when you think an IT project is 99% done, you are probably halfway there. A lot of work has been done on the Cacophonometer over the last few months and it is now ready for you to try out.
The Cacophonometer is an Android app that makes regular sound recordings. It can be used to gather 'baseline' and ongoing data on the bird song near the device.
One of the problems for the Cacophony Project to resolve is stopping the device from recording human voices (to avoid any issues with privacy). I've been performing some backyard experiments with some promising results!
NOTE: This position has been filled.
We are hiring so would love to hear from you if you would like to join the team to help make New Zealand predator free. Job description below.
Lead Developer and Project Manager
The short video below, presented at TEDx, is a summary of the Cacophony Project. Feel free to share it with anyone you think may be interested, and sign up to our newsletter to hear about the latest developments (see bottom of page for email sign up).
Video 1: TEDx talk summarising The Cacophony Project
My name is Tim Hunt and I am a lecturer in Information Technology at Waikato Institute of Technology (Wintec) in Hamilton, New Zealand. I first heard about The Cacophony Project, and the work that Grant Ryan and others are doing, at this year's ITx conference in Wellington. Grant's presentation outlined the 'big picture' aims of the project - to return New Zealand's bird life to its former glory, by using technology to help eradicate the pests responsible for killing so many birds.
A team is looking to set up a company to sell and distribute devices and services that use the research The Cacophony Project has been working on. The device records background bird song for a few minutes every hour and then uploads the data to the cloud. This way a reliable time series can be collected, and increasingly sophisticated analyses will be able to give you objective data on the trends in bird life in your area.
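The hourly record-and-upload cycle described above can be sketched as follows. This is a minimal illustration only; the function names and the recording length are assumptions, not the project's actual API.

```python
# Sketch of an hourly bird-song record-and-upload cycle.
# All names and values here are illustrative assumptions.
import datetime

RECORD_MINUTES = 2  # "a few minutes" each hour (assumed value)


def make_recording(minutes: int) -> bytes:
    """Placeholder for capturing audio from the device microphone."""
    return b"audio-sample"


def upload(sample: bytes, timestamp: str) -> None:
    """Placeholder for posting the sample to a cloud endpoint."""
    pass


def hourly_cycle() -> str:
    """Record one sample, upload it, and return its timestamp tag.

    Tagging each sample with a timestamp is what makes the uploaded
    data usable as a reliable time series for later analysis.
    """
    timestamp = datetime.datetime.utcnow().isoformat()
    upload(make_recording(RECORD_MINUTES), timestamp)
    return timestamp
```

In a real deployment this cycle would be driven by a scheduler (for example, an hourly alarm on the device) rather than called directly.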
The Cacophony Project was fortunate to win the science category of the New Zealand Open Source Awards this year.
“Genetic engineering”. The very mention of it is enough to provoke a furious waving of hands followed by a litany of rhetoric that wouldn’t be out of place in the Old Testament.
The Cacophony Project is not a science project in the same way it is not a business project (link). However there are lots of ways the tools created by our project will be able to be used by scientists. Below are a few examples that we think are particularly interesting.
The Cacophony Project is completely open source. The hardware and software components can be used free of charge in accordance with their licenses. Given this, how can you possibly create a business from it? The short answer is the same way people build businesses selling water.
A new device with both heat and infra-red cameras connected to the cloud looks to be a more automated and sensitive tool for monitoring predators
Just a quick shout out to Spark. Some folks at Spark heard about our project and immediately offered to help. From the first meeting we walked out with some SIM cards with free data to play around with. No paperwork, no delay, just a quick "yeah that sounds good – go for it". We can see lots of ways their team and services can help accelerate this project over time.
Many thanks to Andrew Pirie, Paul Deavoll and Andrew Leckie from Spark.
Our design philosophy is to create a device that can:
- Identify 100% of the predators out there – unless you can do this there is no way to know how well you are progressing (looks like we are very close to a device that can do this - link)
- Lure 100% of the predators – there are lots of possibilities with using digital lures (link)
- Kill 100% of the predators
This project started with the goal of creating a tool to dramatically improve monitoring of how the environment responds to predator eradication. In New Zealand we are lucky that bird song is, in effect, the canary in the coal mine:
Firstly, we love the government’s new announcement of the target to be predator free by 2050. Bold, ballsy and just the sort of thing New Zealand should do. What we would like to do here is a little analysis of why we think it will happen sooner than that.
Moore's law is the observation that, over the history of computing hardware, the number of transistors in a dense integrated circuit doubles approximately every two years. This has morphed into a remarkably consistent observation that information technology either doubles in performance or halves in cost about every two years.
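The compounding implied by this observation is worth making concrete. The sketch below computes the improvement factor for a given doubling period; the start year and target year are illustrative assumptions, not figures from the project.

```python
# Sketch of the compounding described by Moore's law: if a capability
# doubles (or its cost halves) every two years, the improvement factor
# between two years is 2 ** ((target - start) / doubling_period).

def moores_law_factor(start_year: int, target_year: int,
                      doubling_period: float = 2.0) -> float:
    """Return the multiplicative improvement between two years."""
    return 2 ** ((target_year - start_year) / doubling_period)


# Illustrative: from 2016 to the 2050 predator-free target is 34 years,
# or 17 doublings, i.e. a factor of 2**17 = 131072.
print(moores_law_factor(2016, 2050))
```

Even if the real doubling period is slower, the exponential shape is what matters: tools available near the end of such a period are vastly more capable than those available at the start.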
Trapping, poison, hunting, and fences have all been turned against pests to help prevent the ongoing slaughter of New Zealand birds. However, we know that current trapping methods aren't cutting it. As part of our experimenting we have had video trail cameras pointing at existing (traditional) traps to measure how well different digital lures work. What has completely surprised us is that a trap was activated in as little as 1% of the occasions when a possum turned up on camera.
The goal of this project is to develop tools that eliminate 100% of predators. To do this, the device must be able to detect 100% of predators. Chew cards and tracking tunnels can miss over 60% of predators. Standard camera traps are thought to miss as few as 5% of predators, due to not starting fast enough or the light and sound scaring animals away. There are also issues with false positives, making it difficult and time-consuming to filter the videos.
By Brent Martin, adjunct senior research fellow, University of Canterbury
The aim of this project was to see if the latest machine learning (artificial intelligence) tools could correctly distinguish between rats, stoats, possums, and other species in videos collected from the field. This project was given to a group of 28 final-year honours students at the University of Canterbury.