Deep Drone: A Flying Object Detector with caffe, dronekit, and zeromq

About a year and a half ago I saw Jetpac’s video of their Spotter app and I remember thinking at the time that it would be so cool to get this flying on a drone. I didn’t have the bandwidth to work on it then, but ended up poking at it with Markus Aschinger at the ASI and with two A level students (Jawad / Isaac) from the Nuffield Foundation. While they did good work and it got me a step closer, it still hadn’t quite come together. So I sat down this past week to do a full rewrite, integrate it with a quad I had lying around, and put together a little demo. The result can be seen in the video below.

Unfortunately, it being autumn and this being the UK, there was pretty much no window of decent weather for a nice flight. Instead I had to make do with muddy boots, blustery winds, and frantic umbrella breaks during rain showers. It was hard to keep the quad under control in the wind, so the video is not as smooth as I’d like. Once the weather improves and I add a gimbal, things should be much better.


Waiting for the rain to pass

The quad itself was an H500 Pixhawk/APM-based system carrying an Odroid XU4 and camera. The classification model was the one Krizhevsky et al. famously used to win ImageNet 2012. While it was a great achievement at the time, that model has since been superseded by better ones. However, it’s a good starting point, and swapping in a more accurate model is straightforward (if slower).
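To give an idea of what the classification side looks like, here is a minimal pycaffe sketch. It assumes the stock BVLC reference CaffeNet files that ship with Caffe (roughly the Krizhevsky model), a hypothetical install location, and a placeholder test image; paths will differ on your system.

```python
# Minimal pycaffe classification sketch. Model/prototxt paths assume the
# stock BVLC reference CaffeNet that ships with Caffe; adjust to taste.
import numpy as np
import caffe

CAFFE_ROOT = '/opt/caffe'  # hypothetical install location
MODEL_DEF = CAFFE_ROOT + '/models/bvlc_reference_caffenet/deploy.prototxt'
MODEL_WEIGHTS = CAFFE_ROOT + '/models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel'
MEAN_FILE = CAFFE_ROOT + '/python/caffe/imagenet/ilsvrc_2012_mean.npy'
LABELS_FILE = CAFFE_ROOT + '/data/ilsvrc12/synset_words.txt'

caffe.set_mode_cpu()  # no GPU in my case

net = caffe.Classifier(MODEL_DEF, MODEL_WEIGHTS,
                       mean=np.load(MEAN_FILE).mean(1).mean(1),
                       channel_swap=(2, 1, 0),   # RGB -> BGR
                       raw_scale=255,
                       image_dims=(256, 256))

labels = [line.strip() for line in open(LABELS_FILE)]

def classify(image):
    """Return the top three (label, probability) pairs for an RGB float image."""
    probs = net.predict([image], oversample=False)[0]
    top3 = probs.argsort()[::-1][:3]
    return [(labels[i], float(probs[i])) for i in top3]

if __name__ == '__main__':
    img = caffe.io.load_image('test.jpg')  # placeholder image
    print(classify(img))
```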

I don’t yet (?) have an Nvidia Jetson board and my Percepto has been delayed, so I settled on an Odroid as the embedded computer. To my knowledge it is pretty much the fastest system out there at its price point. However, while I could run everything on the Odroid, it was too slow, so I split the (Python) code into three parts and had them talk to each other over wifi using ZeroMQ. The resulting workflow was as follows:

  • A video grabber runs on-board, grabs frames and sends every nth frame off to the classifier.
  • A classifier runs on a local laptop (my macbook), classifies each frame it receives and returns the top three results (a sketch of this exchange follows after the list).
  • The video grabber merges the returned results with the output stream and publishes everything over http through a flask webserver (also running on board).
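As an illustration, here is roughly what the laptop-side classifier could look like. The socket pattern (PUB/SUB in both directions), the port numbers, DRONE_IP and the classify() helper are assumptions made for this sketch, not necessarily the exact wiring in the real code.

```python
# Laptop-side sketch: subscribe to JPEG frames from the drone over ZeroMQ
# and publish the top three labels back.
import json

import cv2
import numpy as np
import zmq


def classify(rgb_image):
    # Placeholder: wire in the classify() from the Caffe sketch above.
    return [('placeholder', 0.0)] * 3


DRONE_IP = '192.168.1.50'   # hypothetical Odroid address on the quad's wifi

ctx = zmq.Context()

frames = ctx.socket(zmq.SUB)             # frames published by the grabber
frames.connect('tcp://%s:5555' % DRONE_IP)
frames.setsockopt(zmq.SUBSCRIBE, b'')

results = ctx.socket(zmq.PUB)            # classification results go back
results.bind('tcp://*:5556')

while True:
    frame_id, jpeg = frames.recv_multipart()
    bgr = cv2.imdecode(np.frombuffer(jpeg, dtype=np.uint8), cv2.IMREAD_COLOR)
    # Caffe expects RGB floats in [0, 1]; OpenCV decodes to BGR uint8.
    rgb = cv2.cvtColor(bgr, cv2.COLOR_BGR2RGB).astype(np.float32) / 255.0
    top3 = classify(rgb)
    results.send_multipart([frame_id, json.dumps(top3).encode('utf-8')])
```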

The nice thing is that each component (video grabber, classifier, web app) can be started and stopped independently and connection losses don’t cause things to crash.
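For example, the on-board grabber can use a ZeroMQ poller with a short timeout, so a stopped or restarted classifier just means stale labels rather than a crash. Again, the ports, camera index, frame interval and LAPTOP_IP below are assumptions for the sketch.

```python
# On-board sketch: grab frames, send every nth one to the classifier, and
# poll (non-blocking) for any results that come back. PUB/SUB plus a poll
# timeout means the grabber keeps running even when the laptop-side
# classifier is down or the wifi link drops.
import json

import cv2
import zmq

LAPTOP_IP = '192.168.1.10'   # hypothetical laptop address
SEND_EVERY_N = 10            # classify every 10th frame (assumption)

ctx = zmq.Context()

frames = ctx.socket(zmq.PUB)
frames.bind('tcp://*:5555')

results = ctx.socket(zmq.SUB)
results.connect('tcp://%s:5556' % LAPTOP_IP)
results.setsockopt(zmq.SUBSCRIBE, b'')

poller = zmq.Poller()
poller.register(results, zmq.POLLIN)

cap = cv2.VideoCapture(0)    # the Odroid camera (device index is an assumption)
latest_labels = []           # last known top three, merged into the output stream
frame_no = 0

while True:
    ok, frame = cap.read()
    if not ok:
        continue
    frame_no += 1

    if frame_no % SEND_EVERY_N == 0:
        ok, jpeg = cv2.imencode('.jpg', frame)
        if ok:
            frames.send_multipart([str(frame_no).encode(), jpeg.tobytes()])

    # Non-blocking check for classifier results; if the classifier is down
    # we simply keep the previous labels and carry on.
    if poller.poll(timeout=10):
        frame_id, payload = results.recv_multipart()
        latest_labels = json.loads(payload.decode('utf-8'))

    # ... here the annotated frame + latest_labels would be handed to the
    # flask webserver for streaming over http.
```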

With the Odroid connected to the Pixhawk, dronekit makes it trivial to log the flight controller state along with the classification results. The output is a nice csv which can be easily plotted to produce a rough map of where the objects are.
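A minimal sketch of that logger, assuming a serial MAVLink connection to the Pixhawk; the device path, baud rate, CSV columns and the source of the classification results are all placeholders.

```python
# Sketch of a dronekit-based logger: read the flight controller state over
# MAVLink and write it out together with the latest classification as CSV.
import csv
import time

from dronekit import connect

vehicle = connect('/dev/ttyACM0', baud=57600, wait_ready=True)

def latest_classification():
    # Placeholder: in the real setup this would be the latest top-three
    # result received from the classifier (see the grabber sketch above).
    return [('placeholder', 0.0)]

with open('flight_log.csv', 'w') as f:
    writer = csv.writer(f)
    writer.writerow(['time', 'lat', 'lon', 'alt_m', 'heading_deg',
                     'label', 'probability'])
    while True:
        loc = vehicle.location.global_frame
        label, prob = latest_classification()[0]
        writer.writerow([time.time(), loc.lat, loc.lon, loc.alt,
                         vehicle.heading, label, prob])
        f.flush()
        time.sleep(1)
```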

A rough map of where the objects are

Note we only know the location of the quad itself and the direction it is looking, which can be quite different from the location of the object itself. Getting the exact location of the object requires jumping through a few extra hoops and is what I hope to do next. I would also like to generate the map in real time; I just need to sit down and get some websockets and d3 plots going.
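As a rough illustration of one possible first step (not necessarily what the final pipeline will do): assume flat ground and offset the quad’s GPS fix along its heading by the ground distance implied by the altitude and the camera’s downward tilt. This ignores terrain, gimbal roll and lens geometry entirely.

```python
# Flat-ground approximation of where the object on the optical axis sits:
# offset the quad's position along its heading by alt / tan(camera pitch).
import math

EARTH_RADIUS_M = 6371000.0

def project_object(lat, lon, alt_m, heading_deg, camera_pitch_deg):
    """Estimate the object's lat/lon, assuming flat ground below the quad.

    camera_pitch_deg is the camera's tilt below the horizon (90 = straight down).
    """
    # Horizontal distance from the quad to where the optical axis hits the ground.
    ground_dist = alt_m / math.tan(math.radians(camera_pitch_deg))

    # Move ground_dist metres along the heading (small-distance approximation).
    d_lat = (ground_dist * math.cos(math.radians(heading_deg))) / EARTH_RADIUS_M
    d_lon = (ground_dist * math.sin(math.radians(heading_deg))) / (
        EARTH_RADIUS_M * math.cos(math.radians(lat)))
    return lat + math.degrees(d_lat), lon + math.degrees(d_lon)

# e.g. 20 m up, heading due east, camera tilted 45 degrees below the horizon:
print(project_object(52.0, -1.0, 20.0, 90.0, 45.0))
```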

Beyond that, I have some ideas about using the object detector to determine the flight path of the drone itself, and about using a customised model instead of a general-purpose object detector.

I plan to put all the code on github once I have tidied it up a bit.

All feedback and comments welcome.

–Dirk

13 thoughts on “Deep Drone: A Flying Object Detector with caffe, dronekit, and zeromq”


  1. The problem with both the TK1 and TX1 is the weight and power requirements. The TX1 in particular has a massive aluminium heatsink and fan on it. I think you’ll need a big quadcopter and a big battery to get any kind of results.

  2. I am very interested in your project. Can you give me some advice on how to use ZeroMQ to transfer video captured by the Odroid so that it can be processed on the laptop?
