Circles of Concern: A Robotics Spin


By David Linn

Developing a 1/7-scale Autonomous Race Car

In this article, I will discuss the engineering process behind developing a 1/7-scale autonomous race car from a NetBurner microcontroller, a multitude of sensors, and a hobby-grade chassis. I explain how I overcame challenges faced while interfacing low-level hardware, and then discuss my thought process while working on mapping the car’s environment for high-speed autonomous navigation and visualizing the results. Finally, I present my novel obstacle avoidance algorithm, dubbed “Circles of Concern.” This algorithm minimizes computation, which is useful for emergency swerving or racing applications. All code is available on GitHub for reference.

Part 1: Intro

Thanks to the wonderful internship program at NetBurner, I had the opportunity to continue Paul Breed's autonomous vehicle project this past summer. Last year, the car suffered a soul-crushing defeat at the 2017 SparkFun Autonomous Vehicle Challenge when, on its way to victory, it suddenly stopped inches from the finish line. I sought to research the practical implementation of autonomous navigation algorithms that could be used on a NetBurner module.

I started by taking Paul’s vehicle platform and hardware, featuring a NetBurner NANO54415 system on module, with some minor alterations, and began to rewrite much of the code. Last year’s article took you through the “trials and tribulations” of the experience and focused on the end product. This article will focus more on the engineering process for hobbyists who want to get started with autonomous navigation. Because of the scope of this project, which involves both low-level microcontroller programming and high-level algorithms, this article may also be helpful for hardware engineers who want to take advantage of computer science research on autonomous navigation or software engineers who want to learn more about low-level interfacing.

Race car chassis controlled by a NetBurner System on Module. Sensors include LiDAR and IMU.

Part 2: Interfacing Sensors and Other Components

The first thing I learned was that large engineering projects can only be tackled by breaking them down both horizontally and vertically. At first, I attempted to alter Paul’s code (written for a different version of NetBurner’s development tools) and integrate the RC signal, steering, and throttle to drive the car around. When it didn’t work, I felt overwhelmed. Upon a suggestion from a senior engineer, I established several smaller goals: spin the wheels for one second without input, alternate the steering between left and right without input, and print out values received from the controller. This was much more manageable.

The RC receiver transmits raw DSM2 data over serial, which dsm2.cpp decodes. The servo and Mamba X Electronic Speed Controller (ESC) take pulse-width modulation (PWM) signals consisting of pulses between 1 and 2 ms long. The code in servodrive.cpp produces these two PWM signals using the output function on two of the NANO's four direct memory access (DMA) timers. The ESC has a safety mechanism: it arms itself only after seeing several pulses representing a zero throttle value (in this case, pulses 1.5 ms long). One bug I ran into was the result of not transmitting a zero throttle as the ESC turned on. After some debugging, however, I was left with a working RC car ready for hacking.
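
The mapping from a normalized control command to a servo/ESC pulse width can be sketched as below. The function name and the clamping behavior are my own illustration, not the actual servodrive.cpp code; only the 1.0–2.0 ms range with 1.5 ms as neutral comes from the article.

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical helper (the name is mine, not from servodrive.cpp): map a
// normalized command in [-1.0, 1.0] to a servo pulse width in microseconds.
// 1.5 ms is neutral (zero throttle); 1.0 ms and 2.0 ms are the extremes.
uint32_t CommandToPulseUs(double cmd) {
    if (cmd > 1.0)  cmd = 1.0;   // clamp out-of-range commands
    if (cmd < -1.0) cmd = -1.0;
    return static_cast<uint32_t>(1500.0 + cmd * 500.0);
}
```

On the real car, the resulting pulse widths would be emitted through the NANO's DMA timer output function; a startup routine would hold the output at `CommandToPulseUs(0.0)` (1.5 ms) for a while so the ESC's arming check passes before any throttle is applied.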

At this point, I set up serial communication with an LCD breakout board (LCD.cpp), used the NANO's Analog-to-Digital Converter to read switch values (main.cpp), and set up an interrupt pin to read ticks on the motor's encoder (Odometer.cpp). Each of these tasks took only a few lines of code, thanks to NetBurner's excellent runtime library. Getting information on the car's orientation was more difficult, however. This information was critical for the odometer to track the car's (X,Y) position. Of course, we could have used GPS for this position data, but the Autonomous Vehicle Challenge docks points for GPS usage.
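
The core arithmetic behind the encoder-based odometer is simple: each tick corresponds to a fixed fraction of a wheel revolution. The sketch below shows the idea; the circumference and ticks-per-revolution values are made-up placeholders, not the actual numbers from Odometer.cpp.

```cpp
#include <cassert>
#include <cmath>

// Illustrative constants (these values are assumptions, not the car's specs).
constexpr double kWheelCircumferenceM = 0.33;  // meters per wheel revolution
constexpr int    kTicksPerRev         = 20;    // encoder ticks per revolution

// Distance traveled for a given number of new encoder ticks. On the real car
// the tick count would be incremented inside the interrupt pin's handler.
double TicksToMeters(int ticks) {
    return ticks * kWheelCircumferenceM / kTicksPerRev;
}
```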

I attempted to interface the MPU9250 Inertial Measurement Unit (IMU) to get absolute heading (i.e., the direction the car is pointing relative to the North Pole). I soon realized this required a complex procedure involving I2C communication, testing, calibration, raw data transfer, and filter updates. Rather than taking two weeks to fully understand the complexities of the procedure and write something from scratch, I ported some open-source Arduino code to NetBurner's platform. This mainly required changing the necessary I2C functions to NetBurner equivalents and the delays to OSTimeDly() or a HiResTimer delay. I added a Programmable Interrupt Timer (PIT) to update the filter at a specified rate, a zeroHeading() function, and a boot settings function to save calibrations in flash memory.
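
To see why the filter update must run at a steady rate (the job the PIT does here), consider a toy complementary filter that fuses an integrated gyro rate with a magnetometer heading. This is only an illustration of the concept; the ported Arduino library uses its own, more sophisticated sensor-fusion filter.

```cpp
#include <cassert>
#include <cmath>

// Toy complementary filter for heading, for illustration only. The gyro
// integration term depends directly on dt, which is why the real code drives
// the filter update from a PIT at a fixed, known rate.
double FuseHeading(double headingDeg, double gyroRateDps, double magHeadingDeg,
                   double dt, double alpha = 0.98) {
    double predicted = headingDeg + gyroRateDps * dt;       // integrate gyro
    return alpha * predicted + (1.0 - alpha) * magHeadingDeg;  // correct drift
}
```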

I defined "heading" as the angle counterclockwise from the X-axis, which points due east (so a heading of zero means the car faces east). Defining this global coordinate system early on minimized future confusion when I had to juggle several coordinate reference frames. For more details on the IMU code, see my article "Interfacing the MPU9250 IMU for Absolute Orientation Data", which links to a cleaner, standalone repository that is helpful for starting your own IMU project on a NetBurner module.
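
With this convention, a dead-reckoning step in the global frame is a direct application of trigonometry: the heading picks the direction of travel, and the odometer supplies the distance. The struct and function names below are my own sketch of that step, not the project's actual code.

```cpp
#include <cassert>
#include <cmath>

constexpr double kPi = 3.14159265358979323846;

struct Pose { double x, y; };

// One dead-reckoning step in the global frame described in the text:
// heading is measured counterclockwise from the +X axis, which points east.
Pose Advance(Pose p, double headingDeg, double distanceM) {
    double rad = headingDeg * kPi / 180.0;
    return { p.x + distanceM * std::cos(rad),
             p.y + distanceM * std::sin(rad) };
}
```

For example, driving 1 m at heading 0 moves the car east along +X, while heading 90 moves it north along +Y.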

Lastly, I needed vision. The RPLIDAR A2 mounted on top of the chassis claimed to be very fast and precise, with 4,000 samples per second (4 kHz) and up to ±0.5 mm precision. Getting data from it, though, was not trivial. Although the underlying serial link was simple, the RPLIDAR wraps its measurements in a complex data packet protocol. I ran into a serious bug where the quantity and speed of incoming data was overflowing the NANO's serial buffer. This was hard to pinpoint, but easy to solve by reading the data at a predictable rate using PIT timers. At each interval, all of the available data in the buffer was read and sent to a processing function (SpinningLidar.cpp).

The RPLIDAR was effective in gathering an appropriate amount of vision data for this non-life-threatening autonomous navigation application. However, the RPLIDAR only had about 4 m of range outdoors. The LIDAR-Lite rangefinders mounted on the sides produced a PWM signal that was measured using the input function of the DMA timers (LidarPWM.cpp). Although their precision wasn't as good, the 40 m range of the LIDAR-Lites proved crucial for following the parallel walls spaced 16 m apart in the Autonomous Vehicle Challenge.
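
Decoding the LIDAR-Lite's PWM output is a one-line conversion: per its datasheet, the pulse width scales at roughly 10 µs per centimeter of range. The function below is my own sketch of what a measurement handler like LidarPWM.cpp computes from a pulse width captured by the DMA timer input function.

```cpp
#include <cassert>

// LIDAR-Lite encodes distance in its PWM output at roughly 10 us per
// centimeter (per the datasheet), so decoding is a single division.
double PulseUsToMeters(double pulseUs) {
    return pulseUs / 10.0 / 100.0;  // 10 us per cm, 100 cm per m
}
```

A 16 m wall, for instance, shows up as a pulse roughly 16,000 µs wide.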

As the project grew more complex, different tasks needed to be performed at various time intervals and priorities. A single control loop in a single process was no longer the best solution for task management. Fortunately, NetBurner’s Real-Time Operating System (RTOS) provided easy-to-use task management features, from priority task-switching to semaphores to hardware interrupts.
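
NetBurner's RTOS has its own task and semaphore API, so the sketch below uses standard C++ primitives purely to illustrate the pattern those features enable: a high-priority sensor task signals work to a lower-priority processing task instead of everything sharing one control loop.

```cpp
#include <cassert>
#include <condition_variable>
#include <mutex>

// Standard-C++ analogy for the RTOS pattern (not the NetBurner API): a sensor
// task posts work and signals; a processing task pends until work is available.
std::mutex m;
std::condition_variable cv;
int pendingSamples = 0;
int processedSamples = 0;

void SensorTask(int samples) {
    for (int i = 0; i < samples; ++i) {
        { std::lock_guard<std::mutex> lk(m); ++pendingSamples; }
        cv.notify_one();  // analogous to posting a semaphore
    }
}

void ProcessingTask(int samples) {
    while (processedSamples < samples) {
        std::unique_lock<std::mutex> lk(m);
        cv.wait(lk, [] { return pendingSamples > 0; });  // analogous to pending
        --pendingSamples;
        ++processedSamples;  // real code would process the sensor sample here
    }
}
```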

To be continued…

Come back next month for part II of the article. We'll cover the ins and outs of the navigation techniques, including the "Circles of Concern" algorithm, and how the various approaches fared.

At NetBurner, we love to see all of the amazing applications and products one can make with our embedded platforms. Our ongoing project with the Autonomous Vehicle Challenge really highlights the capabilities of NetBurner SOMs in complex, real-time, low-power applications. As always, comment below with questions or reports on your work in this area. If you use any of the code, please tell us how it works out for you.
