Epic Race at the Sparkfun Autonomous Vehicle Challenge 2017 – Part I

herbieNB

It’s amazing what can be done with a NetBurner Embedded Core Module, creativity, and some ingenuity. The digital and analog world can ALL be yours… or at least you can do something super cool! At NetBurner we feel the annual SparkFun Autonomous Vehicle Challenge (AVC) is a perfect opportunity to do something we as a team love – making robotic vehicles and putting NetBurner products through some punishing field testing! A big shout out to SparkFun for making this their 9th annual event – it’s concentrated awesome on many levels.

Part I of the “Trilogy”

NetBurner competed in the Colorado-based SparkFun AVC in 2012 and 2013, and after a hiatus, we were back at it for the 2017 Denver Maker Faire. Our lead “pit chief” and race engineer, Paul Breed (@unrocket), has been our standard bearer at these amazing events. His arduous work and unlimited caffeine supply, combined with the rest of our support team in San Diego, have produced some extraordinary accomplishments that have contributed to our company culture and helped improve our products! We’ve even had the honor of bringing home the Gold at the AVC 2013, and this year we came so close with the Silver – not too shabby.

We’re going to share this year’s NetBurner AVC 2017 story through Paul’s eyes (no, they are not cybernetic eyes… yet) and give you a sense of the dramatic ups and downs, as well as the technical approaches, glitches, last-minute hacks, and lessons learned. You’ll also get a sense of the capabilities of our NANO54415 Embedded Core Module, which served as the brains of our autonomous beast. Our hope is to provide a bit of high-stakes, yet geeky, entertainment mixed with enough technical information to get you rolling on your own platform. See you at the races!

Planning and Preparation

It’s always good to have a plan… and as good engineers we had one, but as the story unfolds, you’ll see that reality unraveled differently. The SparkFun AVC has quite a lot of rules (besides “two cars enter, one car leaves”) which affected the build requirements as well as the win strategy. To make things even more challenging, the winding race track also contained obstacles like barrels, hay bales, ramps, and even hoops to drive through for extra points. You can check out the AVC 2017 rules and track specs here if you’re curious. One of the most interesting rules was a substantial point bonus for entries that did not use GPS – this clearly stepped up the technical challenge as well as the stakes… I couldn’t resist.

3D model of the AVC 2017 track: pedestrian crosswalk (left), drive-through hoop (center, green), ramp (center, blue), barrel obstacles (right, red).

The plan was to build two identical cars that drive without GPS and race each of them over the two days of the AVC. A single car could only race on a single day, so having two cars racing over two days gave us some elbow room. Each car would feature the NetBurner NANO54415 and leverage its embedded CPU and RTOS as well as its ample digital and analog I/O. Instead of GPS, I found a way to integrate off-the-shelf LIDAR modules and 9 Degrees of Freedom (DOF) IMUs. Using my custom software and the published race track surveys, I could devise a reasonable multi-modal navigation capability. For the chassis of this “Frankenstein” AV I chose the super-fast 1/7 scale Traxxas XO-1 Radio Controlled (RC) race car. With an advertised 0-100 mph in under 5 seconds, it seemed like a smart choice… at the time.

Two identical cars built on the Traxxas XO-1 1/7 scale chassis…

High-level system diagram and under-the-hood diagram of the NetBurner AVC 2017 entry.

CPU / SBC

The star of our story is the NetBurner NANO54415 Core Module, which served as the brains and nervous system for this build. The NANO is a really practical board that serves as a Single Board Computer (SBC) with a great balance of processing capability and I/O. It’s low-power and low-cost with a small footprint. To name just a few of its many great features, it offers lots of analog and digital I/O, ADCs, DACs, serial ports, 64 MB of RAM, flash memory, Ethernet, a great embedded 32-bit processor, an RTOS, and an industrial temperature range. You should check out all of the specs and pricing here.

Rotating LIDAR

The rotating LIDAR is the primary obstacle detection sensor. It sweeps around in a 360-degree circle and maps things out. It has an effective range of about 16 meters indoors and 3 to 5 meters outdoors in full sun. We used an RPLIDAR A2 for roughly $310 from www.robotshop.com. My concern was that with such a short effective range in sunny outdoor settings, it might not be sufficient, or might even saturate completely.
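To give a flavor of how a rotating scan gets used for obstacle detection, here’s a minimal C++ sketch (not our production code; the scan source, frame convention, and thresholds are assumptions) that converts polar returns into points in the car’s frame and flags anything sitting in a box directly ahead of the car:

```cpp
// Illustrative use of a rotating scan: convert (angle, range) returns into
// points in the car frame and flag anything inside a simple "stop box"
// ahead of the car. Thresholds and the scan source are assumptions.
#include <cmath>
#include <vector>

static const float kPi = 3.14159265f;

struct LidarReturn {
    float angleDeg;   // 0 = straight ahead, increasing clockwise
    float rangeM;     // measured distance in meters (0 = no return)
};

// Returns true if any return falls inside a box boxLengthM long and
// 2 * boxHalfWidthM wide, directly in front of the car.
bool ObstacleAhead(const std::vector<LidarReturn>& scan,
                   float boxLengthM = 3.0f, float boxHalfWidthM = 0.5f)
{
    for (const LidarReturn& r : scan) {
        if (r.rangeM <= 0.05f) continue;             // ignore dropouts
        float a = r.angleDeg * kPi / 180.0f;
        float x = r.rangeM * std::cos(a);            // forward, meters
        float y = -r.rangeM * std::sin(a);           // left (clockwise angles)
        if (x > 0.0f && x < boxLengthM && std::fabs(y) < boxHalfWidthM) {
            return true;                             // something in our lane
        }
    }
    return false;
}
```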

Side-Facing LIDAR

A side-facing LIDAR was also integrated as a fallback in case the rotating LIDAR was saturated by the sun. It ranges to objects in one direction out to about 40 m. The one on our buggies is a Garmin product called LIDAR-Lite v3.

IMU

No bias towards the race officials here at all (cough, cough) – we decided to use the SparkFun MPU9250 9 DOF sensor. It’s truly a formidable and reasonably priced IMU with 3 gyros, 3 accelerometers, and 3 magnetic sensors. Compared to the quality of the gyro outputs available 10 years ago when I was working on hovering rockets, these gyros are WAY better.
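Since we’re running without GPS, heading comes from integrating the gyro’s yaw rate over time, and that integrated heading is exactly the estimate that drifts and later gets corrected against walls (more on that below). A minimal sketch, assuming a hypothetical ReadYawRateRadPerSec() that returns a bias-corrected rate from the MPU9250:

```cpp
// Illustrative gyro heading integration. ReadYawRateRadPerSec() stands in
// for a bias-corrected read of the MPU9250's Z-axis gyro; it is hypothetical.
double ReadYawRateRadPerSec();   // provided elsewhere (hypothetical driver call)

static double g_headingRad = 0.0;

// Call at a fixed rate; dtSeconds is the time since the last call.
void UpdateHeading(double dtSeconds)
{
    g_headingRad += ReadYawRateRadPerSec() * dtSeconds;  // drifts slowly over time
}
```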

RC Receiver

During development one needs to first drive the car manually and take sensor data to see what’s happening. It’s also essential to manually drive the vehicle around the course to map it with its own sensors. I accomplished this by using an RC receiver. Traditional RC receivers put out servo pulses and are a bit hard to read. However, the DSM series of RC receivers have an internal, not formally documented, serial protocol which I used to my advantage. You can take one of the DSM satellite receivers, hook it up to a serial port, and see serial data rather than 3 or 4 individual pulse streams, which is exactly what I did. The NetBurner NANO is a perfect fit for this since it, like all NetBurner core modules, readily interfaces with serial inputs.

The RC receiver is a Spektrum DSM2 satellite receiver. I bound it to a normal Spektrum receiver, then used it as a TTL serial device paired to a normal RC transmitter. I have not found a ground (surface) transmitter that has usable satellites, so my RC TX was an aircraft unit.
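For reference, here’s a hedged sketch of how one of these DSM satellite frames can be decoded. The protocol isn’t formally documented, so this follows the frame layout commonly reported by hobbyists: 16-byte frames at 115200 baud, two header bytes, then seven big-endian 16-bit words, each packing a channel ID and a 10-bit value (DSM2). Treat the masks and shifts as assumptions, not a spec:

```cpp
// Hedged sketch of a DSM2 satellite frame decoder. Assumes the layout
// commonly reported by hobbyists (NOT an official spec): 16-byte frames,
// 2 header bytes, then 7 big-endian 16-bit words, each packing a channel
// id and a 10-bit servo value. Verify against your own receiver's output.
#include <cstdint>

static const int kFrameBytes  = 16;
static const int kNumWords    = 7;
static const int kNumChannels = 8;

// Decodes one frame into channels[] (raw 10-bit values, 0..1023, indexed by
// channel id). Returns how many words decoded to a valid channel id.
int DecodeDsm2Frame(const uint8_t frame[kFrameBytes],
                    uint16_t channels[kNumChannels])
{
    int valid = 0;
    for (int w = 0; w < kNumWords; ++w) {
        uint16_t word = (uint16_t(frame[2 + 2 * w]) << 8) | frame[3 + 2 * w];
        int      chan = (word >> 10) & 0x0F;   // channel id field (assumed)
        uint16_t val  = word & 0x03FF;         // 10-bit value field (assumed)
        if (chan < kNumChannels) {
            channels[chan] = val;
            ++valid;
        }
    }
    return valid;
}
```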

Odometer

The odometer is a magnetic sensor buried in the rear gear case of the XO-1 chassis (not shown in the diagram). It basically gives me a pulse for every 4.58 inches of travel.
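As a quick illustration of how those pulses turn into numbers the navigation code can use, here’s a small sketch that accumulates distance and estimates speed from the raw pulse count. The 4.58-inch figure is from above; the update function and its call rate are illustrative:

```cpp
// Illustrative odometry from a pulse-per-distance sensor.
// 4.58 inches of travel per pulse (from the gear-case sensor above).
#include <cstdint>

static const double kMetersPerPulse = 4.58 * 0.0254;  // 4.58 in -> meters

struct Odometer {
    uint32_t lastCount   = 0;
    double   totalMeters = 0.0;
    double   speedMps    = 0.0;

    // Call at a fixed rate with the latest hardware pulse count and the
    // elapsed time since the previous call.
    void Update(uint32_t pulseCount, double dtSeconds)
    {
        uint32_t delta = pulseCount - lastCount;   // unsigned math handles wrap
        lastCount = pulseCount;
        double meters = delta * kMetersPerPulse;
        totalMeters += meters;
        if (dtSeconds > 0.0) speedMps = meters / dtSeconds;
    }
};
```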

One of NetBurner’s AVC 2017 entries showing the major components.

Simulation

In order to test this monster in the lab, I wrote a simulator. It took in the SparkFun track survey and simulated the sensors used on the rig. This simulator allowed me to start developing the car code. It allowed me to log and display data in a browser as well as map the course and the proposed path. A short video with more on this simulator and how it uses the LIDAR inputs can be found below. Over the next few newsletters, I’ll pick a section of this code and do an article about it.
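The heart of a sensor simulator like this is just ray casting against the surveyed track geometry. Here’s a minimal sketch (not the actual simulator code; types and names are illustrative, and there’s no noise model) of simulating a single LIDAR beam against track walls stored as 2-D segments:

```cpp
// Illustrative simulated LIDAR beam: cast a ray from the car against the
// track walls (stored as 2-D segments from the survey) and return the range
// to the nearest hit. Geometry only; names are illustrative.
#include <cmath>
#include <limits>
#include <vector>

struct Vec2 { double x, y; };
struct Segment { Vec2 a, b; };  // one wall piece from the track survey

static double Cross(const Vec2& u, const Vec2& v) { return u.x * v.y - u.y * v.x; }

// Returns range to the nearest wall along the beam, or +inf if nothing is hit.
double SimulateBeam(const Vec2& origin, double beamAngleRad,
                    const std::vector<Segment>& walls)
{
    Vec2 dir{ std::cos(beamAngleRad), std::sin(beamAngleRad) };
    double best = std::numeric_limits<double>::infinity();
    for (const Segment& w : walls) {
        Vec2 s{ w.b.x - w.a.x, w.b.y - w.a.y };
        Vec2 qp{ w.a.x - origin.x, w.a.y - origin.y };
        double denom = Cross(dir, s);
        if (std::fabs(denom) < 1e-9) continue;   // beam parallel to this wall
        double t = Cross(qp, s) / denom;         // distance along the beam
        double u = Cross(qp, dir) / denom;       // position along the wall (0..1)
        if (t >= 0.0 && u >= 0.0 && u <= 1.0 && t < best) best = t;
    }
    return best;
}
```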

I thought I was close to ready, as the car was following complex arced paths well. The picture below is from real-world testing the Wednesday before the first race on Saturday. The black and pink arcs (actual course vs. navigated course, respectively) are superimposed. This was great, as it showed the autonomous vehicle was in fact navigating where commanded.

When I initially ran the car, the navigation algorithms weren’t working very well; there was a lot of overshoot. There’s a video below of when I first worked on this during development. Ultimately, the change that enabled the precision shown above was to add 200 msec of position prediction to the turn algorithm. Specifically, the solution was to initiate turning 200 msec early, as this seems to be the latency experienced when we command the servo to actually steer to a new position.
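In code, that prediction is straightforward: before deciding whether to start a turn, project the current position forward along the current heading by the steering latency and make the decision from the predicted position. A minimal sketch follows; the 200 msec figure is from the testing above, while the state variables and helper names are illustrative:

```cpp
// Illustrative 200 ms position prediction to compensate for steering latency.
// State values (position, heading, speed) would come from the navigation
// filter; names here are hypothetical.
#include <cmath>

struct State {
    double x, y;        // position, meters
    double headingRad;  // 0 = +x axis
    double speedMps;    // from the odometer
};

// Project the state forward by the measured steering latency so turn
// decisions are made from where the car WILL be when the servo responds.
State PredictForward(const State& s, double latencySeconds = 0.2)
{
    State p = s;
    p.x += s.speedMps * latencySeconds * std::cos(s.headingRad);
    p.y += s.speedMps * latencySeconds * std::sin(s.headingRad);
    return p;
}

// Usage (sketch, with hypothetical helpers):
//   State predicted = PredictForward(current);
//   if (DistanceToTurnPoint(predicted) < turnMargin) StartTurn();
```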

In addition, I was doing all the real-world testing between some steel-frame buildings with nice straight walls. The rotating laser scanner was correctly finding the walls and correcting both the horizontal offset and accumulated heading errors against them. It would sense the wall and, knowing its path was supposed to be 3 ft from the wall, when it measured 4 ft it would move its estimated position 1 ft to accept the measured wall as truth. It would also note the angle of the wall vs. what it expected the angle of the wall to be, and correct any gyro heading drift against it. This worked really well on regular straight walls… not as well when the walls were composed of rough hay bales.
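The correction described above boils down to two adjustments: shift the lateral position estimate by the difference between the measured and expected wall distance, and nudge the heading estimate by the difference between the measured and expected wall angle. A minimal sketch, with all names and sign conventions assumed:

```cpp
// Illustrative wall-based correction of lateral offset and heading drift.
// Inputs are assumed to come from a line fit to the LIDAR wall returns;
// names, frames, and sign conventions are hypothetical.
struct NavEstimate {
    double lateralOffsetM;  // signed distance from the planned path
    double headingRad;      // current heading estimate
};

void CorrectAgainstWall(NavEstimate& nav,
                        double measuredWallDistM,   // e.g. wall reads 4 ft (1.22 m)
                        double expectedWallDistM,   // path says 3 ft (0.91 m)
                        double measuredWallAngleRad,
                        double expectedWallAngleRad)
{
    // Accept the wall as truth: shift the position estimate by the
    // measured-minus-expected distance (sign depends on your frame choice).
    nav.lateralOffsetM += measuredWallDistM - expectedWallDistM;

    // The wall's apparent angle error is accumulated gyro heading drift;
    // remove it from the heading estimate.
    nav.headingRad -= measuredWallAngleRad - expectedWallAngleRad;
}
```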

After a couple of months of pre-race development and preparation at NetBurner HQ in San Diego, I did some final testing, tied up loose ends, and packed up to take to the skies. Ready or not, I was headed for the SparkFun AVC 2017 at the Denver Maker Faire to take home the Gold!

Read Part II Next
