You have to love the way journalists make a headline. They get most of the details of the actual project pretty right, though.
Below is a talk from the recent New Zealand AI conference. It gives an up-to-date summary of the project with particular reference to how we are using Machine Learning.
As with all projects, once the fun development parts are proven, refinement is required to make the devices more robust and usable. Below are a few photos of the next iteration of our hardware, which makes the product more reliable and flexible.
We have been invited to present our work at New Zealand's AI conference. There is currently huge interest in exploring applications for AI around the world and in New Zealand. The Cacophony Project is a great example of how this field can be used to tackle a very New Zealand specific problem.
To book your tickets to the event, click here.
The first step toward becoming 100% predator-free is to know for sure what types of predators are out there. The most common methods used to do this are tracking tunnels and chew cards. These tools require significant manual work and miss a lot of predators. Trail cameras do a much better job, detecting 2-10 times more predators than tracking tunnels.
The latest version of the Cacophonometer can be downloaded from the link below. In this version:
With the generous help of Willowbank Wildlife Reserve we have been collecting thermal video footage of kiwis to help train our machine learning based animal classifier.
Here's a sample of the recordings we've been collecting...
The Machine Learning (artificial intelligence) identification has two main parts. A tracker locates any objects of interest and separates them from the background. These tracks are then passed to the classifier which analyses them to identify what sort of animal they contain.
Previously we used a static camera, so the tracking part of the problem was quite easy: only the animals moved, whereas background objects (such as trees) stayed relatively still. This is obviously harder when the camera is moving. Below are some examples of the new tracker working with the moving camera.
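For the static-camera case, the idea of separating a warm, moving animal from a still background can be sketched with simple background subtraction. This is only an illustrative sketch, not the project's actual tracker; the thresholds and frame sizes here are made up for the example.

```python
import numpy as np

def track_frame(frame, background, threshold=40):
    """Flag pixels noticeably warmer than the static background.

    Returns a boolean mask of candidate animal pixels.
    Illustrative only; the real tracker is more sophisticated.
    """
    diff = frame.astype(np.int32) - background.astype(np.int32)
    return diff > threshold

def bounding_box(mask):
    """Smallest (top, left, bottom, right) box around the masked pixels."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None  # nothing warmer than background: no track
    return (int(ys.min()), int(xs.min()), int(ys.max()) + 1, int(xs.max()) + 1)

# Toy example: an 8x8 "thermal frame" with a warm 2x2 blob.
background = np.full((8, 8), 20, dtype=np.uint8)
frame = background.copy()
frame[3:5, 2:4] = 90  # the "animal"

print(bounding_box(track_frame(frame, background)))  # -> (3, 2, 5, 4)
```

With a moving camera this breaks down, because every background pixel changes from frame to frame; the new tracker has to compensate for the camera's own motion before the animal stands out.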
In a previous blog post we talked about the advantages of having a camera that can track animals. It will obviously allow better identification, but also the ability to eliminate predators with something like a poison paintball.
The video below shows the latest experiments. We are just using a radio-controlled car for testing, but you will get the idea. It is a little amusing to see the artificial intelligence try to guess what the radio-controlled car is.
We now have many thermal cameras deployed at various locations collecting recordings every night. This means we have a lot of thermal video footage to tag manually every day so that it can be used to improve our machine learning classifier. As I write this, we have collected almost 30,000 thermal video recordings.
Our recordings include many false positives - where a non-animal object triggered our camera's motion detector. We also have many, many recordings of birds - particularly at dawn and dusk. Filtering through all the false positives and bird footage is very time consuming.
This is just a quick peek at some very encouraging results showing that Artificial Intelligence can successfully work out what it is seeing from the video. We will put up more discussion and details next year.
The raw video is on the left. On the right, the animal is identified, with the cumulative classification of the animal at the top. The instantaneous guess from the Artificial Intelligence changes in real time at the bottom.
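One simple way a cumulative classification can be built from the instantaneous per-frame guesses is by averaging the per-frame class probabilities over the track so far. This is a hedged sketch of that idea; the label set and the averaging scheme here are assumptions for illustration, not the project's exact method.

```python
import numpy as np

# Hypothetical label set; the real classifier's classes may differ.
LABELS = ["possum", "rat", "stoat", "cat", "bird", "false-positive"]

def cumulative_classification(frame_probs):
    """Average per-frame class probabilities over a track.

    frame_probs: one probability vector per frame (each sums to 1).
    Returns the running best guess and its averaged confidence.
    """
    mean = np.mean(frame_probs, axis=0)
    best = int(np.argmax(mean))
    return LABELS[best], float(mean[best])

# Three noisy per-frame guesses: individually uncertain, together clearer.
frames = [
    [0.5, 0.3, 0.1, 0.05, 0.03, 0.02],  # leans possum
    [0.2, 0.6, 0.1, 0.05, 0.03, 0.02],  # leans rat
    [0.6, 0.2, 0.1, 0.05, 0.03, 0.02],  # leans possum
]
print(cumulative_classification(frames))  # best guess: possum
```

Averaging over many frames is why the cumulative label at the top of the video is steadier than the flickering instantaneous guess at the bottom.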
Video 1: Example of classification of animals using AI
Hi everyone, my name is Matthew, I have been brought on to help with the machine learning side of things. I’m very excited to be part of this project.
My job here is to take all the thermal footage we have been recording and identify the animals in it.
It has been shown that trail cameras are much better at detecting predators than tracking tunnels, which are the default detection tool. Trail cameras are between 2 and 10 times more sensitive at detecting predators than tracking tunnels, depending on the species. This is obviously very important, as we need to be able to measure all the predators that are out there.
Hi everyone, I’m Finn. I’ve been involved with Cacophony for a year now. I started out on work based around the Cacophonometers from a hardware and business model angle. Most recently I’ve been working on software and the development of a method to analyse all of this birdsong we’re getting! This post will describe the work I’ve done and basically what The Cacophony Index 1.0 is.
In previous work we thought we had the ultimate predator detection camera. But our goal is to detect all predators, so we have chosen to develop a higher-resolution thermal camera. Here's a reminder of why we want to detect all predators:
It's been a little longer than usual since our last update. We've been busy!
We achieved some fantastic milestones last week!
Here's another look into what we've been up to recently.
It's been a busy week for the Cacophony Project, with good progress on many fronts.
As noted earlier, the project now has a new lead developer & project manager (me!) and I've spent much of the week getting up to speed. I've been reviewing the current state of our hardware and software efforts, meeting various advisors and contributors, and getting a grasp of our immediate and longer term goals. It's been fantastic to meet everyone and I'm really excited to be onboard. There's a real energy around the project.
My name is Menno Finlay-Smits and I start today as the Lead Developer and Project Manager for the Cacophony Project. We've got a real chance at solving the problem of invasive predators in New Zealand and I'm really excited to be part of the effort.
July update - now that's sounding like I might do regular updates...
What have I been doing?
The NEXT Foundation and ZIP have recently invested in the Cacophony Project, and this will enable us to accelerate our development plans.
As part of the NEXT investment they also put together a short video that very nicely explains what we are up to. They did a great job with this and it has proved very helpful in explaining concisely what we do.
The latest innovation we are testing is a camera that can pan and tilt to follow predators. There are two main benefits. First, we can see much more than with a static camera. Second, and the main long-term reason, it will allow us to implement a kill mechanism that does not require the predator to enter a trap.
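Following an animal with a pan/tilt camera comes down to a simple control idea: measure how far the tracked animal sits from the centre of the frame, and nudge the camera a fraction of that offset each step. The sketch below is a hypothetical proportional controller for illustration; the post does not describe the prototype's actual control loop, and the gain and frame size here are invented.

```python
def pan_tilt_step(box_center, frame_size, gain=0.1):
    """Proportional step to keep a tracked animal centred in frame.

    box_center: (x, y) centre of the animal's bounding box, in pixels.
    frame_size: (width, height) of the thermal frame.
    Returns (pan_delta, tilt_delta) as fractions of the field of view.
    Hypothetical control scheme; not the project's actual rig code.
    """
    cx, cy = box_center
    w, h = frame_size
    # Offset of the animal from the frame centre, normalised to [-0.5, 0.5].
    x_err = (cx - w / 2) / w
    y_err = (cy - h / 2) / h
    # Move only a fraction of the error each step so the motion stays smooth.
    return gain * x_err, gain * y_err

# Animal right of centre and slightly high in a 160x120 frame:
# the camera pans right and tilts up a little.
print(pan_tilt_step((120, 40), (160, 120)))
```

Moving only a fraction of the error each frame keeps the camera from overshooting a fast-moving animal, at the cost of reacting a little more slowly.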
Video 1: First prototype of tracking camera
Below are a few observations from our experiments with live capture traps. As noted in our previous blog post, they seem to work better than other traps we have tried. We know they are not the long-term solution, but we need devices that take out predators more reliably while we trial different audio lures. We don't want predators to learn to ignore the audio lures that we test.
Below is a video showing a few interesting things about the live capture. We explain these in more detail below.
Some traps seem 10 times more effective than others (and other interesting things we have learned from our new camera tool)
The main purpose of our camera and trapping work over the last four weeks has been to test the new camera setup and get as many videos of predators as possible. We can then tag them so that the artificial intelligence can learn the differences between predator species.