*This is the second half of our article “**Circles of Concern: A Robotics Spin**.” If you haven’t yet read about how to get started designing autonomous navigation with a NetBurner embedded System on Module, be sure to take it from the top.*

### Part 3: Mapping, Profiling, Simulation

Now that I had sufficient sensor data, I needed to consolidate the unwieldy amount of LiDAR data into a more navigable format. The simplest and most well-studied navigation algorithms in the literature rely on simplifying the environment around a robot into either line segments or a discretized grid of occupancy probability (i.e., how likely it is that a cell has an object in it). I attempted to map the environment in both of these fashions in a manner that consumed as little processing power as possible.

For line segments, I took an array of length 360, representing what the car sees at each degree, and found segments by “cutting” the array at zeros. I recursively used a simple line-fitting algorithm to find the segments, splitting each one at the data point farthest from the fitted line whenever the point-to-line distance exceeded a specified threshold. Rather than performing least-squares fitting during this segmentation process, I simply used the line formed by the endpoints to save processing power. Segments of a single point were turned into “circles.”
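That recursive splitting step can be sketched as follows, assuming the polar LiDAR readings have already been converted to Cartesian points. The names and the threshold value are illustrative, not the project’s actual code:

```cpp
#include <cmath>
#include <cstddef>
#include <utility>
#include <vector>

struct Point { double x, y; };

// Perpendicular distance from p to the line through a and b.
// Falls back to point-to-point distance if a and b coincide.
static double PointLineDist(const Point& p, const Point& a, const Point& b) {
    double dx = b.x - a.x, dy = b.y - a.y;
    double len = std::hypot(dx, dy);
    if (len == 0.0) return std::hypot(p.x - a.x, p.y - a.y);
    return std::fabs(dy * (p.x - a.x) - dx * (p.y - a.y)) / len;
}

// Recursively split pts[first..last] at the point farthest from the
// endpoint-to-endpoint line whenever that distance exceeds threshold.
// Emits (first, last) index pairs, one per resulting segment.
static void SplitSegments(const std::vector<Point>& pts, size_t first, size_t last,
                          double threshold,
                          std::vector<std::pair<size_t, size_t>>& out) {
    if (last <= first + 1) { out.push_back({first, last}); return; }
    size_t worst = first;
    double worstDist = 0.0;
    for (size_t i = first + 1; i < last; ++i) {
        double d = PointLineDist(pts[i], pts[first], pts[last]);
        if (d > worstDist) { worstDist = d; worst = i; }
    }
    if (worstDist > threshold) {
        SplitSegments(pts, first, worst, threshold, out);
        SplitSegments(pts, worst, last, threshold, out);
    } else {
        out.push_back({first, last});
    }
}
```

Using the endpoint line instead of a least-squares fit keeps each recursion level at a single pass over the points, which is the processing-power trade-off described above.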

For the occupancy grid, I constructed a 2D array of size 128×128, where each cell represents the probability of occupancy of a 0.25 m² area. Grids like these are generally updated using Bayes’ theorem, which defines the new probability of X given the previous probability of X and new sensor data. This becomes simpler when we think in terms of the “odds” of an event, that is, the probability of *X* divided by the probability of *NOT X*.

For example, we might initialize the grid with 1:2 odds. That is, it is half as likely that there’s something in a cell as not. If we have a LiDAR reading of 3 m straight in front of the car, we can update the respective cell by multiplying it by 3:1. By doing that, we are claiming that it’s now 3 times as likely that there’s something in that cell. We say that our “Bayes factor” is 3. The new odds that there’s something in that cell are 3:2.
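That odds update takes only a few lines; the `Odds` struct and helper names here are hypothetical, not from the project:

```cpp
// Odds stored as numerator:denominator. Multiplying by a Bayes factor
// of k scales the numerator (the "occupied" side of the odds).
struct Odds { long num, den; };

// Apply a Bayes factor to the prior odds: 1:2 with factor 3 -> 3:2.
Odds ApplyBayesFactor(Odds prior, long factor) {
    return { prior.num * factor, prior.den };
}

// Convert odds back to a probability, e.g. for display: 3:2 -> 0.6.
double OddsToProb(Odds o) {
    return static_cast<double>(o.num) / (o.num + o.den);
}
```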

For my implementation, I needed to consider that my position accumulated error over time and that the environment could change rapidly. If I went to coordinate (2,0) initially, drove around, and came back to the “same coordinate,” the environment would likely look different. Because of this, I wanted to ignore old information after a point. To implement this, along with reducing computation, I restricted the cells to the set of integers 0 to 10, which represented the relative probability that a given cell was occupied. I added to or subtracted from cells as an approximation of Bayes’ theorem rather than using multiplication, which would have been more computationally demanding. This results in a finite state machine (FSM) that guesses whether a given cell is occupied.
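A minimal sketch of that saturating integer update follows. The bounds, increments, and occupancy threshold are illustrative assumptions, not the project’s exact constants:

```cpp
#include <cstdint>

// 128x128 grid of integer "occupancy scores" clamped to [0, 10].
constexpr int kGridSize = 128;
constexpr int8_t kMinScore = 0;
constexpr int8_t kMaxScore = 10;

int8_t grid[kGridSize][kGridSize];  // zero-initialized at startup

// A hit (LiDAR return in the cell) nudges the score up; a miss (beam
// passed through the cell) nudges it down. Saturating at the bounds
// means stale evidence is overwritten quickly once new readings
// contradict it, which handles the drifting-position problem above.
void UpdateCell(int row, int col, bool hit) {
    int8_t& s = grid[row][col];
    if (hit) { if (s < kMaxScore) ++s; }
    else     { if (s > kMinScore) --s; }
}

// The FSM's "occupied" guess: score above the midpoint.
bool CellOccupied(int row, int col) { return grid[row][col] > 5; }
```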

I observed that all mapping would fail after a certain amount of time and guessed that I was reaching the limit of the NANO’s processing power. Thorough profiling and optimization was key to solving this problem. Read more about this in my article, “Intro to Profiling and Optimization.”

As the project grew more complex, I soon realized that printing out the car’s state would no longer be sufficient. In order to see the results of mapping algorithms or do any sort of debugging on navigation algorithms, I would need to evaluate the car’s decisions in the context of the car’s position, rapidly changing raw sensor readings, and processed maps. The easiest way to do this was to log variables in real-time and later visualize the car’s motion and vision.

In 2017, Paul created a modular and robust logging system that logs any C data type in binary (for speed) in flash memory. This log is designed to be transferred to a computer for post-processing via FTP. A command line tool then converts the binary data into a CSV file. Read more about how you can use this system in your own project in the article, “Real-time Data Logging For Embedded Systems and IoT”.

I wrote a MATLAB script, LidarVisualization.m, that took the logged data as vectors and created an animation of the car’s motion, plotting the car’s position and orientation on a 2D coordinate system. It draws green lines to visualize the LiDAR readings and additional lines to visualize the map created by segmenting and fitting those readings. Another script, OccGridVisualization.m, visualized the occupancy grid as a grayscale pixel grid, where a black pixel indicates maximum cell occupancy probability (there’s something there!) and a white pixel indicates the minimum (there’s nothing there).

These visualization tools helped me debug various issues with the mapping algorithms, such as a failure to appropriately convert between frames of reference and a segmentation failure after 15 seconds of continuous mapping.

All in all, my mapping algorithms were moderately effective, but needed a significant amount of fine-tuning and testing. I ended up deciding that simultaneous mapping and navigation would take up too much of the NANO’s processing power, and that this approach was not necessary for the NetBurner AVC. I ended my work on mapping and attempted to navigate by using raw LiDAR readings instead of consolidating LiDAR readings into a list of segments or an occupancy grid.

### Part 4: Navigation

I moved to developing a toolbox of various navigation methods that could later be used interchangeably. First, I implemented basic proportional control waypoint navigation, which involved multiple reference frames, some trigonometry, and proportionality constant tuning. This worked perfectly in the absence of external magnetic fields that interfered with the IMU. Next, I implemented a wall following algorithm that also relied on proportional control, but where the offset error used in the algorithm was the change in the measured wall distance divided by the distance traveled. This worked perfectly for maintaining distance to a wall, but it was difficult to tune the parameters to gradually steer toward or away from a wall.
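The proportional waypoint steering described above might look something like this sketch. The gain, the frame conventions, and the command clipping are assumptions, not the project’s actual implementation:

```cpp
#include <cmath>

constexpr double kPi = 3.14159265358979323846;

// Wrap an angle to (-pi, pi] so the car always turns the short way.
double WrapAngle(double a) {
    while (a > kPi)   a -= 2 * kPi;
    while (a <= -kPi) a += 2 * kPi;
    return a;
}

// Proportional control: steering effort proportional to the heading
// error between the car's orientation and the bearing to the goal.
// Returns a steering command clipped to [-1, 1].
double SteerToWaypoint(double x, double y, double heading,
                       double goalX, double goalY, double kp) {
    double bearing = std::atan2(goalY - y, goalX - x);
    double error = WrapAngle(bearing - heading);
    double cmd = kp * error;
    if (cmd > 1.0)  cmd = 1.0;
    if (cmd < -1.0) cmd = -1.0;
    return cmd;
}
```

The angle wrapping is where the "multiple reference frames and some trigonometry" bite: without it, a goal slightly behind the car produces a near-full-circle error instead of a short turn.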

Next, I worked on editing the waypoint navigation to avoid obstacles. I attempted to write an artificial potential field algorithm. Artificial potential field algorithms are inspired by physics: a particle feels attractive or repulsive forces that are inversely proportional to the square of its distance from other particles. In effect, the particle follows a path that keeps lowering its potential energy. For example, imagine a very spherical boulder tumbling down the Grand Canyon. Unless the boulder gets stuck in a huge hole (a “potential well”), it will likely continue rolling downward until it reaches the lowest point of the canyon.

I wrote an algorithm to simulate this phenomenon. It summed 360 “forces” on the car, each of which had magnitude 1/(LiDAR reading^{2}) and direction opposite that of the LiDAR reading. It then added a constant force in the direction of the goal point. Unfortunately, I could not get it to work well and saw erratic and unpredictable behavior in testing. It relied on a very fast update rate to adjust direction as it approached obstacles, so I guessed that it might be having performance issues, as I saw with the mapping approach detailed above.
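A sketch of that force summation, using the stated 1/d² repulsion plus a constant attractive pull toward the goal (the goal gain and unit conventions are my own assumptions):

```cpp
#include <cmath>

constexpr double kPi = 3.14159265358979323846;

struct Vec2 { double x, y; };

// readings[i] is the distance in cm at i degrees; 0 means "no return".
// Each ray contributes a repulsive force of magnitude 1/d^2 pointing
// opposite the ray; a constant force of magnitude goalGain points
// toward the goal.
Vec2 PotentialFieldForce(const int readings[360], double goalAngleRad,
                         double goalGain) {
    Vec2 f = {0.0, 0.0};
    for (int deg = 0; deg < 360; ++deg) {
        int d = readings[deg];
        if (d <= 0) continue;  // skip rays with no return
        double mag = 1.0 / (static_cast<double>(d) * d);
        double rad = deg * kPi / 180.0;
        f.x -= mag * std::cos(rad);  // repulsion: opposite the ray
        f.y -= mag * std::sin(rad);
    }
    f.x += goalGain * std::cos(goalAngleRad);  // constant pull to goal
    f.y += goalGain * std::sin(goalAngleRad);
    return f;
}
```

The resulting vector’s direction becomes the desired heading each cycle, which is why the method depends so heavily on a fast update rate.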

The algorithm gave the NANO an extra 1000 multiplications, 1000 int-to-float conversions, 1000 divisions, and 2000 additions every second. It was easy to blame problems on things out of my control, but I later realized that these computations were nothing compared to the NANO’s ample headroom of 385 MIPS, that is, 385 million assembly instructions per second (MCF5441x ColdFire Microprocessor Data Sheet, pg. 4). I don’t really know what caused the problems, because I no longer had time for significant debugging, and I needed to be OK with that.

I went back to the basics. The task was to navigate from point A to point B given the car’s position and orientation and an array of 360 LiDAR readings (integers giving the distance in cm to a reflective surface), where each element in the array held the most recent reading at the respective angle in degrees. First, I had the car brake and stop when the LiDAR reading directly in front of the car was below a threshold. This worked surprisingly well, so I expanded it to look at multiple LiDAR readings around this reading, which also worked well.
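The frontal-arc braking check can be sketched as follows; the arc half-width and distance threshold are placeholders to tune, not the values used on the car:

```cpp
// Brake if any LiDAR reading within ±halfArcDeg of straight ahead
// (0 degrees) is closer than thresholdCm. readings[i] is the distance
// in cm at i degrees; 0 means "no return".
bool ShouldBrake(const int readings[360], int halfArcDeg, int thresholdCm) {
    for (int offset = -halfArcDeg; offset <= halfArcDeg; ++offset) {
        int deg = ((offset % 360) + 360) % 360;  // wrap into [0, 360)
        int d = readings[deg];
        if (d > 0 && d < thresholdCm) return true;
    }
    return false;
}
```

Starting with `halfArcDeg = 0` gives the single-reading version described above; widening the arc is the "multiple readings around this reading" expansion.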

To move on to actively avoiding obstacles, I started with a greedy algorithm that adjusted the previously calculated desired heading to the nearest free heading, that is, the nearest direction in which no obstacle was seen. This also worked surprisingly well, although in tests the car would often graze the corners of boxes. However, this approach left the car with a limited field of vision. I needed the car to plan a successful route even through a complex web of obstacles. To do this, the car needed to find a wide range of “free path” LiDAR measurements at closer distances and a narrow range of “free path” LiDAR measurements at farther distances. Solving this problem was best accomplished by making “closer distances” and “farther distances” discrete distance blocks, or “Circles of Concern.”
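A minimal sketch of that greedy search, assuming the same 360-element array of centimeter readings; the clearance threshold and the `-1` "all blocked" convention are my own:

```cpp
// Starting from the desired heading, scan outward (+1°, -1°, +2°, …)
// for the nearest direction with no obstacle inside clearanceCm.
// readings[i] is the distance in cm at i degrees; 0 means "no return".
// Returns the adjusted heading in degrees, or -1 if all are blocked.
int NearestFreeHeading(const int readings[360], int desiredDeg,
                       int clearanceCm) {
    for (int offset = 0; offset <= 180; ++offset) {
        int plus = ((desiredDeg + offset) % 360 + 360) % 360;
        if (readings[plus] == 0 || readings[plus] >= clearanceCm) return plus;
        int minus = ((desiredDeg - offset) % 360 + 360) % 360;
        if (readings[minus] == 0 || readings[minus] >= clearanceCm) return minus;
    }
    return -1;
}
```

Because the search stops at the first free single-degree ray, the chosen heading can skim the edge of an obstacle, which is consistent with the box-grazing seen in testing.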

In “The 7 Habits of Highly Effective People,” which has nothing to do with autonomous navigation but is a great read for interpersonal growth, Stephen Covey discusses the concepts of “First Things First” and “Circles of Concern.” This served as the inspiration for my obstacle avoidance algorithm, which first assesses the “concern level” of obstacles. We construct several circles of concern, varying from very close to very far. If there’s an object obstructing the desired path very close by, we consider that and nothing else; it’s our immediate concern. Otherwise, we’ll consider obstructions farther out. Then, we take the desired direction and find the nearest free arc large enough to fit the car.

The car’s speed is decreased and its steering sensitivity increased for more immediate levels of concern. For the most immediate levels of concern, or upon failure to find a free arc, the car will reverse to compensate for its limited turning radius or to allow it to try a different path. This ended up being the winning algorithm. It wasn’t perfect: switching between forward and reverse made position tracking less accurate, and it was not guaranteed to choose the best path. However, the car quickly reached the end goal the vast majority of the time with little computation, relying only on raw LiDAR measurements. This makes the algorithm useful for racing applications, emergency swerving maneuvers, and other real-time “sense and navigate” applications.
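The concern-level assessment and the speed scaling described above might be sketched like this; the band radii, the ±45° search window, and the speed table are illustrative assumptions rather than the competition constants:

```cpp
// Concentric "Circles of Concern": distance bands in cm, innermost first.
constexpr int kBands[] = {50, 100, 200, 400};
constexpr int kNumBands = 4;

// Concern level = index of the innermost band containing an obstacle
// within ±45° of the desired heading; kNumBands means "no concern".
// readings[i] is the distance in cm at i degrees; 0 means "no return".
int ConcernLevel(const int readings[360], int desiredDeg) {
    for (int band = 0; band < kNumBands; ++band) {
        for (int offset = -45; offset <= 45; ++offset) {
            int deg = ((desiredDeg + offset) % 360 + 360) % 360;
            int d = readings[deg];
            if (d > 0 && d <= kBands[band]) return band;
        }
    }
    return kNumBands;
}

// More immediate concern -> slower command; the most immediate level
// reverses (negative speed) to recover turning room.
double SpeedForConcern(int level) {
    static const double speeds[] = {-0.25, 0.25, 0.5, 0.75, 1.0};
    return speeds[level];
}
```

Once the concern level is fixed, only that band’s readings need to be scanned for a free arc wide enough for the car, which keeps the per-cycle cost low.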

*At NetBurner, we love to see all of the amazing applications and products one can make with our embedded platforms. Our ongoing project with the Autonomous Vehicle Challenge really highlights the capabilities of **NetBurner SOMs** in complex, real-time, low-power applications. As always, comment below with questions or reports on your work in this area. If you use any of the code, please tell us how it works out for you.*

Cover photo credit: Long Beach Comic Expo 2012 – K.I.T.T. from Knight Rider, by The Conmunity – Pop Culture Geek from Los Angeles, CA, USA