It has taken 8 months to complete this writeup, but better late than never. It’s been a mixture of life and things going (too) well at Oxbotica.
Anyway, the TL;DR is that this is a summary of my trip to Borneo in May this year with v2.0 of my drone-based orangutan tracking system, developed in partnership with International Animal Rescue (IAR). The story of v1.0, why one would need to track orangutans in the jungle at all, and why a drone is useful for the job can be found in this post.
As an interesting aside, the v1.0 system got lost in a tree somewhere last time. But a few months after my return, somebody randomly stumbled across the drone bits in the jungle (see the top-right picture in the collage below).
I am very fortunate to have worked in many stimulating environments, on many very different applications, and with some very awesome people. For about three months now, I’m happy to see that trend continue.
I recently joined Oxbotica, a startup out of the Oxford University Mobile Robotics Group (MRG), specialising in mobile autonomy. For various reasons it wasn’t a straightforward decision, but so far it has panned out well. Within the company we are developing Selenium, a cross-vehicle, cross-platform autonomy stack that third parties can license in whole or in part. So note: this is about building and licensing software, not cars.
Update: read part II here
TL;DR: You are faced with a few thousand hectares of rainforest that you know harbours one or more orangutans that you need to track down. Where, how, and why do you start looking?
About a year ago I was doing a lot of drone-related work and was presented with the following problem: would it be possible to use a drone to fly above the Bornean jungle and search for tagged orangutans?
To understand the motivation behind the question we need some background.
In my last post I discussed the little flying object detector project that I’ve been doing for fun. While it worked, it relied on communication with an external laptop (the on-board Odroid was not powerful enough to run the convnet model).
Hence, what better excuse to buy an NVIDIA Jetson TK1 dev board, which should have more than enough juice to run everything on board the drone itself. As an added benefit, it should come in useful for my litter robot. I then added websocket support to the Flask web app, and you can now see the detections appear in real time on a map (see the sketch below).
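For the curious, the websocket plumbing boils down to something like the minimal sketch below. I’m using Flask-SocketIO here for illustration, and the event name and payload fields (`detection`, `lat`, `lon`, etc.) are my own placeholders rather than the actual project code; the client side is just a socket.io handler that drops a marker on the map whenever an event arrives.

```python
# Minimal sketch of pushing detections to a browser map in real time.
# Assumes Flask-SocketIO on the server; the event name and payload
# fields are illustrative placeholders, not the actual project code.
from flask import Flask, render_template
from flask_socketio import SocketIO

app = Flask(__name__)
socketio = SocketIO(app)

@app.route("/")
def index():
    # Serves the page holding the map plus a socket.io client that
    # drops a marker for every "detection" event it receives.
    return render_template("map.html")

def publish_detection(label, lat, lon, confidence):
    # Called whenever the on-board detector reports a hit; emitting
    # outside a request context broadcasts to all connected browsers.
    socketio.emit("detection", {
        "label": label,
        "lat": lat,
        "lon": lon,
        "confidence": confidence,
    })

if __name__ == "__main__":
    socketio.run(app, host="0.0.0.0", port=5000)
```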
It took some fiddling to get the wifi to work with the Jetson, and I’m still surprised at how quickly the wifi degrades (even though I’m using the 5GHz band to avoid collisions with the Tx/Rx). But it did work in the end:
It wasn’t the best place to test it out, but I was short on time and it should be good enough to illustrate the concept.
Still lots to do and too little time, but we keep chipping away. In particular, the awesome folk from ErleBotics were kind enough to send me a Brain2; I’m looking forward to trying that out as well.
PS: on a related note, I love what the guys from vertical.ai are working on.
About a year and a half ago I saw Jetpac’s video of their Spotter app, and I remember thinking at the time that it would be so cool to get this flying on a drone. I didn’t have the bandwidth to work on it then, but I ended up poking at it with Markus Aschinger at the ASI and with two A-level students (Jawad / Isaac) from the Nuffield Foundation. While they did good work and got me a step closer, it still hadn’t quite come together. Hence I sat down this past week to do a full rewrite, integrate it with a quad I had lying around, and put together a little demo. The result can be seen in the video below.
The last in a series of three posts (one, two) on the UAV-based GPR system I have been working on.
There are many applications for GPR, but the one I have been focussing on the most is landmine and UXO detection. GPR is a popular sensor for this, but by no means the only one: from big, complex neutron sources to small thermal cameras and magnetometers, a wide variety of sensors have been developed to detect landmines.
While we were focussing on GPR, I wanted to explore what other sensors we could mount or fly alongside it that could provide a complementary picture. This then crystallised into a summer project for Skycap intern Alex Davey. After some in-depth research into the spectrum of relevant sensors, the decision was made to experiment with an active EMI sensor (i.e., a metal detector).
Besides the machine learning angle discussed in the previous post, the UAV-based GPR system I have been working on has involved an interesting foray into robot swarming technology, because the sensor places a number of restrictions on the operation of the aerial robot. In particular, the swath that can be covered in one pass is rather limited. There are different routes to ameliorating that, and one of them is the use of multiple drones to increase coverage rates, as in the sketch below.
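To make the coverage arithmetic concrete, here is a toy sketch of the idea: slice a rectangular survey area into swath-wide lawnmower lanes and deal the lanes out across several drones, so each one only sweeps roughly 1/N of the area. The geometry, swath width, and speed below are illustrative assumptions on my part, not the real system parameters.

```python
# Toy illustration of why more drones help when the GPR swath is
# narrow: slice the survey area into swath-wide lanes and share the
# lanes out, so each drone sweeps roughly 1/N of the total.
# All numbers below are illustrative assumptions, not real parameters.

def lawnmower_lanes(width_m, height_m, swath_m):
    """Split a width x height rectangle into parallel sweep lanes,
    each one swath wide, returned as (x, y_start, y_end) segments."""
    lanes, x = [], swath_m / 2.0
    while x < width_m:
        lanes.append((x, 0.0, height_m))
        x += swath_m
    return lanes

def assign_round_robin(lanes, n_drones):
    """Deal the lanes out to the drones in interleaved fashion."""
    return [lanes[i::n_drones] for i in range(n_drones)]

if __name__ == "__main__":
    lanes = lawnmower_lanes(width_m=100.0, height_m=200.0, swath_m=0.5)
    speed_mps = 2.0  # assumed survey speed
    for i, plan in enumerate(assign_round_robin(lanes, n_drones=3)):
        # Sweep time only; transit between lanes is ignored here.
        minutes = sum(y1 - y0 for _, y0, y1 in plan) / speed_mps / 60.0
        print(f"drone {i}: {len(plan)} lanes, ~{minutes:.0f} min sweeping")
```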