In the previous post I outlined the hardware build of a “Robocar”: a simple autonomous car platform using monocular vision and Deep Learning, built on a small RC car with a few modifications. That post focused exclusively on the hardware. If you followed its directions, you should be able to customize your RC car with a simple wooden or plastic platform, a Raspberry Pi, a camera and a PWM HAT that can control a motor and a servo. For my build I also added an RC receiver, since my NAVIO2 HAT supports decoding of SBUS and PPM signals out of the box. This is optional, however; there are many ways to control your car, depending on what you have available (WiFi, for instance).
Even though the hardware is essential to a functioning autonomous robocar, at its heart it is the software and the algorithms that enable autonomy. In this post we will focus on building a simple software stack on the Raspberry Pi that can control the steering of an autonomous vehicle using a Convolutional Neural Network (CNN).
Background and Aims
Let us elaborate a bit on the background and our goals. As mentioned earlier, the aim of this project is to build a car that can navigate itself around a course using vision, and vision alone. Not only that, but decision making regarding steering and throttle all happens within a single CNN, which takes the camera image as input and outputs values corresponding to steering and throttle. This type of decision making is known in machine learning as an “end-to-end” approach: raw information comes in at the input, and the desired value is presented at the output. The neural network needs to infer suitable decision-making procedures as part of its training. End-to-end training is just one of a number of different approaches to autonomous vehicles. Another popular one is the so-called “robotics” approach, where a suite of different sensors (vision included) are “fused” together algorithmically to produce a map of the vehicle’s surroundings and localize the vehicle within it. Decision making then takes place as a separate step, and sometimes consists of hand-coded conditions and actions.
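To make the end-to-end idea a little more concrete: Donkey-style models typically discretize the steering angle into a small number of bins and have the network output a probability distribution over them, which is then decoded back into an angle. The sketch below illustrates just that encoding/decoding step in plain Python; the bin count and function names are illustrative, not Burro’s actual code.

```python
N_BINS = 15  # illustrative bin count for the categorical steering output


def steering_to_bin(angle):
    """Encode a steering angle in [-1, 1] as a one-hot vector over N_BINS bins."""
    idx = int(round((angle + 1.0) / 2.0 * (N_BINS - 1)))
    return [1.0 if i == idx else 0.0 for i in range(N_BINS)]


def bin_to_steering(probs):
    """Decode the network's per-bin output back into a steering angle in [-1, 1]."""
    idx = max(range(len(probs)), key=lambda i: probs[i])
    return idx * 2.0 / (len(probs) - 1) - 1.0
```

During training, recorded steering values are encoded with `steering_to_bin`; at drive time, the network’s output is decoded with `bin_to_steering` and sent to the servo.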
This blog post is not the place to debate the merits of one approach versus the other; for all we know, the truth may lie in a compositional approach. Given the simplicity of this project and its DIY roots, however, as well as the recent leaps in self-driving vehicles achieved by end-to-end neural network approaches, I feel it’s worth a try. And so did quite a few people, including the Donkey team, whose default CNN model and pieces of code we’ll be using in this build.
This car build uses the Burro autonomous RC car software, freely available on Github. Burro is an adaptation of Donkey for the NAVIO2 HAT. While it borrows a lot of features from Donkey, Burro has a number of significant differences:
There is no separate server instance; all telemetry is served by an onboard web socket server
RC (SBUS) or gamepad (Logitech F710) input is used to control the car
It is adapted for use with (and requires) the NAVIO2 board’s RC decoder, PWM generator and IMU (gyroscope)
Currently Burro requires a Raspberry 2 or 3 board with the NAVIO2 HAT. Before proceeding with the installation of Burro, you will need to have a working EMLID image installation. The latest version is strongly recommended. Please make sure you follow the instructions available in the relevant EMLID docs.
Once this is complete, ssh to your RPi, which should be reachable at navio.local by default if you are using the EMLID image.
wget the Burro install script, then change its permissions and run it:

chmod +x install-burro.sh
./install-burro.sh
This will install all required libraries, create a virtual environment, clone the Burro repo and set it up, and create symlinks for you. After successful completion, you end up with a fully working installation.
A warning: some steps of this script can take a significant amount of time, especially the numpy pip install step, which is needed due to a library incompatibility with the apt-get versions. Total installation time should be around 30 minutes. To ensure that your installation is not interrupted midway, make sure that you run your Pi from either a power supply that can deliver at least 5V/2A, or a fully charged power bank or LiPo of sufficient capacity.
I am using the software with a Turnigy mini-trooper 1/16 RC car. If you have the same car, you need only change your RC channels if necessary. The RC input channels are as follows: 0 – Yaw (i.e. steering), 2 – Throttle, 4 – Arm. Yaw and throttle are configurable via config.py, but Arm is hardwired to channel 4. Each time the RC controller is armed, a neutral point calibration is performed, so you only need to make sure that your sticks are centered before arming the car.
By default Burro outputs throttle on channel 2 of the NAVIO2 rail, and steering on channel 0. You may wish to change this.
You may also wish to configure the throttle threshold value above which images are recorded.
See the Readme in the Burro repo for more instructions on how to edit your configuration.
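Pulled together, the settings discussed above amount to a config fragment roughly like the one below. The variable names here are hypothetical placeholders, not necessarily Burro’s actual config.py identifiers, and the threshold value is purely illustrative; the channel numbers are the defaults described above.

```python
# Hypothetical config fragment -- names are illustrative, not Burro's actual ones.

# RC input channels (Arm is hardwired to channel 4 and is not configurable)
STEERING_INPUT_CHANNEL = 0   # yaw / steering
THROTTLE_INPUT_CHANNEL = 2   # throttle

# NAVIO2 servo rail output channels (the Burro defaults)
STEERING_OUTPUT_CHANNEL = 0
THROTTLE_OUTPUT_CHANNEL = 2

# Record training images only while throttle exceeds this (illustrative value)
RECORD_THROTTLE_THRESHOLD = 0.2
```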
After installation and configuration is complete, you should be able to drive your car around, either using the manual controls, or using a mix of CNN for steering and manual controls for throttle. Automatic throttle control is not yet available, but it will be in a future version.
To start a Burro instance, first ssh to your RPi, if you haven’t done so already:
Then type the following, from the directory where your install-burro.sh script is located:
Point your browser to your RPi address (http://navio.local by default for the EMLID image), the telemetry interface will come up. Choose your driving mode based on your controller. The default is using the F710 gamepad for steering and throttle. There are options for RC, gamepad, and mixed RC+CNN and gamepad+CNN driving, where the CNN controls the steering and you control the throttle. Autonomous throttle control is not yet implemented in Burro.
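Conceptually, the mixed modes combine the two input sources per control axis: the CNN supplies the steering value and the pilot supplies the throttle. A minimal sketch of this mixing (an illustrative function, not Burro’s actual code) could look like this:

```python
def mix_controls(cnn_steering, pilot_throttle):
    """Mixed driving mode sketch: CNN provides steering, the human provides throttle.

    Both values are clamped to the [-1, 1] range before being sent to the outputs.
    """
    def clamp(v):
        return max(-1.0, min(1.0, v))

    return clamp(cnn_steering), clamp(pilot_throttle)
```

Clamping matters because a network can emit out-of-range values, while the servo and ESC expect a bounded command.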
I like to think of the Burro project as part of the lively Donkey community, since it was spun out of Donkey after all. As such, it is worth taking a look at the many resources created by the Donkey developers, namely:
If you’re interested in the development of autonomous small scale vehicles, you may wish to be part of the Slack community of Donkey, by requesting an invite.
This is the second post in a series discussing the software aspects of a small scale autonomous vehicle, using vision alone and end-to-end machine learning for control and navigation. The Burro software was briefly presented, together with installation instructions.
Autonomous vehicles are a very young and promising field of AI, and we will certainly see very interesting competition in the near future.