The machine learning identification has two main parts. A tracker locates objects of interest and separates them from the background. These tracks are then passed to a classifier, which analyses them to identify what sort of animal they contain.
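The two-stage pipeline can be sketched as follows. This is a minimal illustration only, assuming simple frame differencing for the tracker and a size heuristic for the classifier; the function names are hypothetical, and the real system uses a trained model rather than a heuristic.

```python
def track_objects(frames, background, threshold=10):
    """Tracker stage (sketch): find pixels that differ from the
    background in each frame and collect them as a track.
    Frames are 2D lists of pixel intensities (e.g. thermal values)."""
    track = []
    for frame in frames:
        hot = [
            (r, c)
            for r, row in enumerate(frame)
            for c, value in enumerate(row)
            if abs(value - background[r][c]) > threshold
        ]
        if hot:
            track.append(hot)
    return track


def classify_track(track):
    """Classifier stage (sketch): assign a label to a track.
    A stand-in heuristic based on average blob size; the real
    classifier analyses the track with a trained model."""
    if not track:
        return "false-positive"
    avg_size = sum(len(blob) for blob in track) / len(track)
    return "rat" if avg_size < 4 else "possum"
```

The key design point is the separation of concerns: the tracker only decides *where* something is moving, so the classifier can focus on *what* it is without re-scanning whole frames.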
In a previous blog post we talked about the advantages of a camera that can track animals. Tracking obviously allows better identification, but it also opens up the ability to eliminate a predator with something like a poison paintball.
This is just a quick peek at some very encouraging results showing that artificial intelligence can successfully work out what it is seeing in the video. We will post more discussion and details next year.
It has been shown that trail cameras are much better at detecting predators than tracking tunnels, which are the default detection tool. Depending on the species, trail cameras are between two and ten times more sensitive at detecting predators than tracking tunnels.
In previous work we thought we had the ultimate predator detection camera. But our goal is to detect all predators, so we have chosen to develop a higher-resolution thermal camera. Here's a reminder of why we want to detect all predators:
The NEXT Foundation and ZIP have recently invested in the Cacophony Project, and this will enable us to accelerate our development plans.
As part of the NEXT investment, they also put together a short video that explains very nicely what we are up to. They did a great job with it, and it has proved very helpful in explaining concisely what we do.
The latest innovation we are testing is a camera that can pan and tilt to follow predators. There are two main benefits. First, we can see much more than with a static camera. Second, and the main long-term reason, it will allow us to implement a kill mechanism that does not require the predator to enter a trap.
Some traps seem 10 times more effective than others (and other interesting things we have learned from our new camera tool)