September 2011 saw the launch of the AR.Drone FreeFlight app for Android. For those of you who have not heard of Parrot’s AR.Drone (and I don’t think there are many of you out there), it is an augmented reality quadcopter with two cameras, proclaimed Gadget of the Week by TechCrunch in May 2012 and now well into its second version. As TechCrunch put it, the AR.Drone first took to the sky back in 2010 at CES, and since then, iOS users have been the only ones lucky enough to get in on the fun. Lemberg, in turn, was lucky enough to create the drone piloting application for Android users. Technical details aside, this is the story of how FreeFlight for Android came to be.
Back in March ’11, Lemberg got an email from what TechCrunch would later call “the Company Behind The ‘Flying Smartphone’” – Parrot. The email said they were looking for someone who could help them with the Android NDK. Now, the Android NDK is very different from the Android SDK: the NDK is for native C/C++ code, as opposed to the much more widespread Java coding of the SDK. And Lemberg just happened to have the right guy for the job. All the fuss around signing the contract aside, two of Parrot’s choppers arrived in our office a couple of weeks later.
Then came the hard part - development. Below is a detailed account from Lemberg’s NDK shaman about the first takeoff of the Android-controlled AR.Drone and everything that followed.
FreeFlight 1.0: it flies!
Development of FreeFlight 1.0 started with a challenge. One of the two devices we had to support had serious performance problems displaying the video stream. As video streaming is an essential part of the application, we had to find a solution, and spent a few days looking for one. The strange thing was that between these two identical devices with the same hardware, one showed a huge drop in video frame rate (down to about 5 fps). As it turned out, it was a bug in the firmware of one of the devices, and there was no hope of fixing it with a firmware update or anything similar. But luckily we tried a different way of rendering the video (the OpenGL solution was the problematic one) – using the Android Canvas API. As a result we ended up with two implementations of the video display: the OpenGL method and the Canvas API one, used on different devices for the best user experience.
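The two-renderer fallback can be sketched as a simple per-device selection. This is a minimal illustration, not the shipped code: the post does not name the affected device, so the model string below is hypothetical.

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

// Sketch: picking a video renderer per device. Devices whose firmware makes
// the GL path drop to ~5 fps fall back to the Canvas API implementation.
// "EXAMPLE-MODEL-X" is a placeholder -- the real model is not named in the post.
public class VideoRendererSelector {
    public enum Renderer { OPENGL, CANVAS }

    private static final Set<String> CANVAS_FALLBACK_MODELS =
            new HashSet<>(Arrays.asList("EXAMPLE-MODEL-X"));

    /** On Android the model string would come from android.os.Build.MODEL. */
    public static Renderer choose(String deviceModel) {
        return CANVAS_FALLBACK_MODELS.contains(deviceModel)
                ? Renderer.CANVAS : Renderer.OPENGL;
    }
}
```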
When we had live video working, we started on another interesting feature of the app – controlling the drone. I must say it was the best part of the whole development process. I was the happiest programmer in the world when that thing flew for the first time! Of course it crashed eventually, but for an instant it was up in the air all by itself – amazing!
The next challenge was to use the maximum number of sensors to control the drone – accelerometer, magnetometer and gyroscope. Here device fragmentation comes into play. As you probably know, there are a lot of Android devices on the market. Some have no motion sensors at all, some have an accelerometer only, some have a magnetometer in addition, and some have all three. To minimise the delay between a movement of the device and the drone’s response, we had to use different algorithms depending on the number of available sensors. We also used filtering techniques (high-pass filter, low-pass filter, etc.) to filter out some of the noise coming from the sensors. The result is smooth control of the drone in “accelero mode”. If a device didn’t have any sensors, we fell back to the on-screen joypads.
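The kind of low-pass filtering mentioned above can be sketched in a few lines. This is a generic exponential smoothing filter of the sort commonly applied to accelerometer data, not the exact filter from FreeFlight; the smoothing factor is illustrative.

```java
// Sketch: exponential low-pass filter for smoothing noisy sensor samples
// (e.g. accelerometer x/y/z) before mapping them to drone tilt commands.
public class LowPassFilter {
    private final float alpha;  // 0..1; smaller alpha = heavier smoothing
    private float[] state;      // last filtered output, lazily initialised

    public LowPassFilter(float alpha) {
        this.alpha = alpha;
    }

    /** Blend a new sensor sample into the filtered state and return it. */
    public float[] apply(float[] sample) {
        if (state == null) {
            state = sample.clone();  // seed the filter with the first sample
        }
        for (int i = 0; i < sample.length; i++) {
            state[i] += alpha * (sample[i] - state[i]);
        }
        return state;
    }
}
```

A high-pass filter, used to isolate quick movements, can be built from the same piece: subtract the low-pass output from the raw sample.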
Another control-related issue: some devices (like the HTC Nexus One) have touch screen problems – they do not support independent finger tracking. So we had to invent another control mode, where the user needs only one finger.
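A one-finger mode boils down to mapping a single touch point inside an on-screen joypad to two control axes at once. The sketch below shows one way to do that; the class and method names are illustrative, not taken from the FreeFlight source.

```java
// Sketch: mapping one touch point inside a circular on-screen joypad to
// roll/pitch commands, each normalised and clamped to [-1, 1].
public class SingleFingerJoypad {
    private final float centerX, centerY, radius;

    public SingleFingerJoypad(float centerX, float centerY, float radius) {
        this.centerX = centerX;
        this.centerY = centerY;
        this.radius = radius;
    }

    /** Returns {roll, pitch} for the given touch coordinates. */
    public float[] toCommand(float touchX, float touchY) {
        float roll  = clamp((touchX - centerX) / radius);
        float pitch = clamp((touchY - centerY) / radius);
        return new float[]{roll, pitch};
    }

    private static float clamp(float v) {
        return Math.max(-1f, Math.min(1f, v));
    }
}
```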
Implementation of the rest of the features was just a matter of time. Firmware upgrade, drone configuration and some other UI controls were implemented without any major issues.
FreeFlight 2.0: brand new UI
Development of FreeFlight 2.0 started with new challenges and new drones. We received two brand new AR.Drone 2.0s, with an HD camera, new sensors and a new look. I believe we were among the first lucky few who had a chance to fly that thing before it got on the market. But there was still a lot of work to be done to get everything working properly. And from then on, both drone versions had to be supported.
FreeFlight 2.0 has a brand new user experience and UI. Many new devices had also been released, and a new class of devices had become available – 7-inch tablets. Of course we had to add support for those, as well as for devices without Google services – the Amazon Kindle Fire and the Barnes & Noble Nook Color. The range of Android versions we need to support starts at Android 2.2 and goes up to the most current one – 4.2 Jelly Bean.
With all that in mind we started the work. And there were some new challenges to be solved here.
The first one – a custom font for the whole application. If you are an Android developer, you know that there is no straightforward way of applying a font to the whole application. We solved this problem by creating a special base activity class that applies the font automatically, without the developer needing to worry about it. Just extend that base activity and you're done.
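The core of such a base activity is a recursive walk over the view tree after setContentView(), setting the typeface on every text view it finds. Since the Android framework classes (View, ViewGroup, TextView, Typeface) can't run outside the platform, the sketch below uses minimal stand-in classes to show the traversal itself; in the real base activity the same walk starts from the activity's content view.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: recursive font application over a view tree. The nested View/
// ViewGroup/TextView classes are stubs standing in for the Android ones.
public class FontApplier {
    static class View {}
    static class TextView extends View {
        String typeface = "default";
    }
    static class ViewGroup extends View {
        final List<View> children = new ArrayList<>();
    }

    /** Apply the custom typeface to every TextView under the given root. */
    public static void applyFont(View root, String typeface) {
        if (root instanceof TextView) {
            ((TextView) root).typeface = typeface;
        } else if (root instanceof ViewGroup) {
            for (View child : ((ViewGroup) root).children) {
                applyFont(child, typeface);  // recurse into containers
            }
        }
    }
}
```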
The second one – performance. Once more we faced a performance issue on the screen with live video from the camera. The UI now had many more controls and elements, and as a result the Android rendering system had trouble merging content from the GLSurfaceView (the video, in our case) with the layout on top of it in real time. As a case in point, the battery on my Galaxy S2 was running down even when plugged into the power adapter, and the frame rate of the entire UI dropped significantly. We solved this by re-implementing the UI for that screen in OpenGL only. As a result we had just a single GLSurfaceView in the view hierarchy, and the issue was tackled.
The third one – localisation. As the application has to support around 8 languages, some screens had problems with long words – especially the advertisement screens and the screens with hints for the user. The solution was to make a copy of each screen for each localisation and adjust the font size so that the words fit in perfectly.
The fourth one – video recording. Most of the video recording functionality is implemented in ARDroneLib – the API for the drone. But we faced a problem supporting video recording on the Drone 1, which uses a video codec that is not supported by any Android device. That means video recorded by a Drone 1 is not playable as-is. The solution was to use a video decoding/encoding library (FFmpeg in our case) to re-encode the video into MP4. This is done right after you stop recording. The re-encoding runs in native code, so it doesn't take too long, especially on high-end smartphones.
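In the app this re-encoding runs through the FFmpeg libraries in native code, but conceptually it is equivalent to an ffmpeg command line like the one built below. The file names are illustrative, and this is a simplification of the actual native pipeline.

```java
import java.util.Arrays;
import java.util.List;

// Sketch: the Drone 1 re-encode step expressed as an equivalent ffmpeg
// invocation -- transcode the unplayable recording into an H.264/MP4 file.
public class ReencodeCommand {
    public static List<String> build(String input, String outputMp4) {
        return Arrays.asList(
                "ffmpeg",
                "-i", input,        // recording pulled from the drone
                "-c:v", "libx264",  // re-encode the video stream to H.264
                outputMp4);         // .mp4 extension selects the MP4 container
    }
}
```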
As for the rest of the features, the implementation was pretty straightforward. Photo gallery, new dashboard, new settings (two variants for different drone versions), fresh UI, advertisements, hints – all that you can find in FreeFlight 2.0.
FreeFlight gets bigger
You probably noticed that version 2.1 was skipped. This is because we are trying to catch up with the iOS version and are implementing all the features from FreeFlight 2.2 for iOS.
This release of the app is all about the AR.Drone Academy – a social feature that lets users share flights, videos and photos with each other. It can show flights on a map, display other users’ photos and videos, plot flight data as a graph and map your photos and videos onto that graph, and download photos and videos from a USB drive. Yes, some drones have a USB port and can store videos on an external flash drive. The automatic upload feature can now upload all your flights right to Picasa and YouTube. And the most interesting feature of all is the ability to watch your recorded video with the flight graph overlaid on it in real time.
Like before, development started with receiving two drones, this time with USB connectors. Applying all the experience from the previous releases, we did some refactoring, so the source code became less dependent on the native code.
In order to implement the screens with graphs and indicators, we had to build a few custom controls, such as a graph and a height ruler. We use them on different screens to display flight parameters such as height, speed and battery level. We can also put thumbnails of photos or videos on top of the graph, so you know exactly at which point the photo or video was taken.
As for new device support, we now have to support almost every screen size out there, from low-end 320x240 screens to Full HD and Retina-like displays such as the one in the Nexus 10.
But the most challenging part was synchronising the video with the graph data. We had to use special data stored inside the video file: before the video is played, we extract that data and use it for synchronisation.
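Once the flight data is extracted, keeping the graph in step with playback reduces to finding, for the current playback position, the flight-data sample with the closest timestamp. A minimal sketch, assuming sorted millisecond timestamps (names and data layout are illustrative, not the actual format stored in the video file):

```java
// Sketch: given flight-data timestamps extracted from the video file
// (sorted ascending, in ms), find the sample nearest to the current
// playback position so the graph cursor can follow the video.
public class FlightDataSync {
    public static int nearestSampleIndex(long[] timestampsMs, long playbackMs) {
        int lo = 0, hi = timestampsMs.length - 1;
        while (lo < hi) {  // binary search for the first index >= playbackMs
            int mid = (lo + hi) / 2;
            if (timestampsMs[mid] < playbackMs) {
                lo = mid + 1;
            } else {
                hi = mid;
            }
        }
        // The previous sample may be closer than the one found.
        if (lo > 0 && playbackMs - timestampsMs[lo - 1]
                   <= timestampsMs[lo] - playbackMs) {
            return lo - 1;
        }
        return lo;
    }
}
```

Calling this on every playback-position update gives the graph point (and any photo/video thumbnail) to highlight for the frame currently on screen.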
At the time this post was being written, development of FreeFlight 2.2 was still in progress – and by now it has already been released!
Adding new devices
Oh, and I almost forgot about some of the more exotic devices we now support. Parrot wanted to explore the potential of porting FreeFlight to other Android-based devices. First, Android tablets were added to the list. Later came other Android-powered devices, such as:
- Sony Google TV
The interesting thing about the Google TV from Sony is that it has an accelerometer inside the remote. To support it we had to use an external source for accelerometer data and make some adaptations to the UI.
- Nook tablet
- Epson Moverio Glasses
Another interesting device is the Epson Moverio glasses – an Android 2.2 device with the display built into the glasses. The problem with this device is that it doesn't support multi-touch, has no motion sensors, and has very few buttons. The tricky part was to come up with a way to control the drone despite all these limitations. In the end we settled on a single on-screen joystick mode used to control forward/backward/left/right movement, combined with the hardware joypad to control left/right turns and flight altitude.
For more information on developing applications for connected devices like Parrot’s AR.Drone, check out our experience section, Apps for Connected Devices.
Lemberg is a UK mobile and web development company with a strong client base in the UK, Europe, and the USA.
Since 2007, Lemberg has been helping leading design and marketing agencies, start-ups, and innovative businesses deliver brilliant digital solutions for a number of the world’s biggest brands.
Our goal is to go beyond clients’ expectations: as a technology partner, we take responsibility for implementing the most ambitious, creative and innovative ideas.