Intelligent Robotic Arm with Networking: Part 2

robotarm
The Duck Shepherd robotic arm.

This is the second part of a two-part blog post; it covers the more technical details of the robotic arm. If you have not read Part 1, you can find that post here. Part 1 gives a general overview of the project and contains a video of the arm in action.

The Duck Shepherd robotic arm is designed to pick up objects. It can do this in an autonomous mode and a manual mode. In the autonomous mode, the robotic arm scans its surroundings looking for specific objects. Once it finds one, the arm will pick it up, place it to the side, and continue looking for other objects.

If a user connects to the Wifi access point and uses an Android phone app, they can override the autonomous mode and control the robotic arm manually. In this manual mode, the user controls the position of the arm with the smartphone's gyroscope and the app's on-screen joystick.

The following sections describe how the robotic arm is controlled.

Robotic Arm

Dynamixel Robot Servos

The robotic arm was assembled with Dynamixel AX-12 robot servos. These servos are controlled through half-duplex UART communication. While regular analog or digital servos only let you set a target position, these robot servos give you much more control, allowing you to change the speed of rotation, limit the power draw, and monitor the load on each servo. In addition, a single line of communication is enough to control up to 254 robot servos, while regular servos require a separate PWM signal for each one. Five robot servos were connected to the same communication bus for this project, and each has a unique ID. ID zero corresponds to the servo that rotates the base of the robot. The servos with IDs one, two, and three control the shape of the arm, and the servo with ID four opens and closes the gripper.

The NetBurner module controls the robot servos by first sending an instruction packet containing a command. The module then receives a status packet from the servo with information about its state. For example, we could send the "write data" command to tell a single servo to set its position and rotational velocity. After a short delay, the servo would return a status packet reporting the success or failure of the command. The command used for the vast majority of the project was "sync write", which allows the microprocessor to control many servos with a single message, setting each of the four arm servos to a unique position with each rotating at a different speed.
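To make the packet format concrete, here is a minimal sketch in C++ of building a "sync write" packet, following the layout in the AX-12 manual. The ServoTarget struct and function name are illustrative, not the project's actual code:

```cpp
#include <cstdint>
#include <vector>

// Hypothetical per-servo target used by the sketch below.
struct ServoTarget {
    uint8_t id;        // Dynamixel ID (0-253)
    uint16_t position; // goal position, 0-1023 (~0.29 deg/unit on the AX-12)
    uint16_t speed;    // moving speed, 0-1023
};

// Build a Protocol 1.0 SYNC WRITE packet that sets goal position and
// moving speed (registers 0x1E-0x21) for several AX-12 servos at once.
std::vector<uint8_t> BuildSyncWrite(const std::vector<ServoTarget>& targets) {
    const uint8_t kBroadcastId = 0xFE;
    const uint8_t kSyncWrite = 0x83;
    const uint8_t kGoalPosAddr = 0x1E;  // position L, H, then speed L, H
    const uint8_t kBytesPerServo = 4;

    std::vector<uint8_t> pkt = {0xFF, 0xFF, kBroadcastId};
    // Length = (bytes-per-servo + 1) * N + 4, per the AX-12 manual.
    pkt.push_back(static_cast<uint8_t>((kBytesPerServo + 1) * targets.size() + 4));
    pkt.push_back(kSyncWrite);
    pkt.push_back(kGoalPosAddr);
    pkt.push_back(kBytesPerServo);
    for (const auto& t : targets) {
        pkt.push_back(t.id);
        pkt.push_back(t.position & 0xFF);         // position low byte
        pkt.push_back((t.position >> 8) & 0xFF);  // position high byte
        pkt.push_back(t.speed & 0xFF);            // speed low byte
        pkt.push_back((t.speed >> 8) & 0xFF);     // speed high byte
    }
    // Checksum: inverted low byte of the sum of everything after the header.
    uint8_t sum = 0;
    for (size_t i = 2; i < pkt.size(); ++i) sum += pkt[i];
    pkt.push_back(static_cast<uint8_t>(~sum));
    return pkt;
}
```

Because the packet is broadcast, every servo on the bus acts on it at once and no status packet is returned.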

If you are interested in reading more details about the servos, here is a link to the manual for the AX-12 servos.

You can find the source for the servos here.

Computer Vision

Pixy color recognition and IR Distance Sensing

To locate and grab specific objects, the robotic arm needed some form of computer vision. The arm uses both a Pixy color recognition camera and an IR distance sensor.

sensors
The gripper end of the robotic arm with the Pixy camera and IR distance sensor attached.

signal voltage
A graph of the sensor's signal voltage (y-axis) versus the distance to the reflective object (x-axis).

The IR distance sensor outputs an analog signal whose voltage has a power-law relationship with distance. To detect nearby objects, a distance sensor with an effective range of 4 cm to 30 cm was selected.
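As a rough sketch, the conversion from an ADC reading to a distance in centimeters might look like the following. The reference voltage, ADC resolution, and power-law coefficients here are assumptions; in practice they would come from fitting the datasheet curve or calibrating against known distances:

```cpp
#include <cmath>
#include <cstdint>

// Convert an ADC reading from the IR distance sensor to centimeters.
// The coefficients in distance = a * V^b are illustrative only.
float IrDistanceCm(uint16_t adcCounts, float vref = 3.3f, int adcBits = 12) {
    float volts = adcCounts * vref / ((1 << adcBits) - 1);
    if (volts < 0.05f) return 30.0f;             // beyond the usable range
    float cm = 12.0f * std::pow(volts, -1.1f);   // power-law fit
    if (cm < 4.0f) cm = 4.0f;    // sensor is unreliable closer than 4 cm
    if (cm > 30.0f) cm = 30.0f;  // clamp to the sensor's far limit
    return cm;
}
```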

The Pixy camera is slightly more complicated to use. The Pixy is a camera with an onboard processor that uses pixel information to detect the size and shape of objects of specific colors. You can teach the Pixy to detect different hues; in this case, it was trained on the red color of some rubber ducks. Once the camera detects a registered color, it reports the object's location in the camera's view along with its width and height in pixels. This information is communicated to the NetBurner module over SPI.

color recognition
An example of the color recognition. The “s=1” means that the red duck is registered and recognized as signature 1.
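A simplified sketch of reading one object block over SPI is shown below. The word-level framing follows the published Pixy protocol, but SpiReadWord() is an assumed helper:

```cpp
#include <cstdint>

// Assumed helper: clocks one 16-bit word out of the SPI peripheral.
uint16_t SpiReadWord();

// One detected object as reported by the Pixy.
struct PixyBlock {
    uint16_t signature;     // trained color signature (1 = the red duck)
    uint16_t x, y;          // center of the object in the camera frame
    uint16_t width, height; // object size in pixels
};

// Read a single object block. The Pixy marks each block with the sync
// word 0xAA55 and prefixes the five data words with their checksum.
// (Simplified: the real stream also doubles the sync word to mark the
// start of a new image frame.)
bool ReadPixyBlock(PixyBlock& out) {
    if (SpiReadWord() != 0xAA55) return false; // not at a block boundary
    uint16_t checksum = SpiReadWord();
    out.signature = SpiReadWord();
    out.x = SpiReadWord();
    out.y = SpiReadWord();
    out.width = SpiReadWord();
    out.height = SpiReadWord();
    uint16_t sum = out.signature + out.x + out.y + out.width + out.height;
    return sum == checksum; // drop corrupted blocks
}
```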

You can find the source for the Pixy camera here.

Override Control

Wifi Module and Android Phone App

The Wifi module allows the robot to be controlled wirelessly. The module and breakout board connect directly to the NetBurner development board. Once configured, the Wifi module becomes an access point, and smartphones can then discover it in the Wifi menu as RobotServer.

wifi module
The Wifi module with the breakout board that attaches to the NetBurner MOD5441X.

After connecting to the RobotServer access point, Android users can launch the RobotClient app to control the arm wirelessly. A TCP server running on the NetBurner board will override the arm's default scanning behavior when it receives a connection from the client. The app user can select between two control schemes: Wiimote-style motion control or claw-game-style joystick control. The RobotClient app uses Android's rotation vector sensor to track motion. This "virtual" sensor combines the hardware accelerometer, gyroscope, and magnetic field sensors to calculate the device's orientation in the world. Whenever it receives new sensor values, the app calculates the angles the device has rotated from its initial orientation.
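As a rough illustration of that last step, subtracting the initial yaw, pitch, and roll from the current values (and wrapping the results) yields the rotation deltas that get sent to the robot. The names below are illustrative:

```cpp
// Orientation angles in radians, as derived by the app from Android's
// rotation vector sensor (names illustrative).
struct Orientation { float yaw, pitch, roll; };

// Wrap an angle difference into (-pi, pi] so a small tilt near the
// +/-180 degree seam doesn't look like a huge rotation.
float WrapAngle(float a) {
    const float kPi = 3.14159265f;
    while (a >  kPi) a -= 2 * kPi;
    while (a <= -kPi) a += 2 * kPi;
    return a;
}

// Angles the device has rotated away from the orientation captured
// when the user connected. Subtracting Euler angles per-axis is an
// approximation, but it works well for the modest tilts used here.
Orientation RelativeRotation(const Orientation& initial, const Orientation& now) {
    return { WrapAngle(now.yaw   - initial.yaw),
             WrapAngle(now.pitch - initial.pitch),
             WrapAngle(now.roll  - initial.roll) };
}
```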

You can find the source code for the Wifi module here and source code for the Android app here.

Robot Control

Finite State Machine

Now we can bring all the sensors together and describe how the robot operates. The logic for the robotic arm follows the finite state machine below.

There are 6 states.
State 1: Scanning
State 2: Tracking
State 3: Offset Camera
State 4: Grab
State 5: Place
State 6: Phone (Override)

There are 6 state variables.
Found = An object has been detected by the Pixy camera.
Connected = A client (phone) has connected to the TCP server.
Centered = The camera is centered on the target object.
Offset = The target object is an offset value below the center of the camera view.
Grabbed = The robotic arm has closed the gripper end.
Placed = The robotic arm has dropped off the object.

You can find the source code for the robotic controller here.

state machine logic
The finite state machine for the logic of the robotic arm.

If a state variable is equal to one, its description holds true. For example, if Found = 1, then an object has been detected by the Pixy; if Found = 0, no object has been detected.
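In code, the control loop amounts to a switch on the current state, with the state variables driving the transitions. The skeleton below illustrates the structure rather than reproducing the project's exact code:

```cpp
// Sensor-derived state variables (see the list above); assumed to be
// refreshed elsewhere from the Pixy, IR sensor, gripper, and TCP server.
extern bool found, connected, centered, offset, grabbed, placed;

// One control step per state; assumed helper functions.
void StepScan(); void StepTrack(); void StepOffset();
void StepGrab(); void StepPlace(); void StepPhone();

enum class State { Scan, Track, OffsetCamera, Grab, Place, Phone };

void RunArm() {
    State state = State::Scan;
    while (true) {
        switch (state) {
            case State::Scan:                    // state 1
                StepScan();                      // rotate servo 0
                if (connected)  state = State::Phone;
                else if (found) state = State::Track;
                break;
            case State::Track:                   // state 2
                StepTrack();
                if (!found)        state = State::Scan;
                else if (centered) state = State::OffsetCamera;
                break;
            case State::OffsetCamera:            // state 3
                StepOffset();
                if (!found)      state = State::Scan;
                else if (offset) state = State::Grab;
                break;
            case State::Grab:                    // state 4
                StepGrab();
                if (grabbed) state = State::Place;
                break;
            case State::Place:                   // state 5
                StepPlace();
                if (placed) state = State::Scan;
                break;
            case State::Phone:                   // state 6
                StepPhone();
                if (!connected) state = State::Scan;
                break;
        }
    }
}
```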

State 1: Scanning

In the first state, the robot scans its surroundings looking for target objects. To scan, it simply rotates the servo with ID zero. While the arm is rotating, the Pixy camera looks for objects it has been taught to detect. If the camera detects something, the state changes to state 2 (Tracking). If, instead, a phone connects to the robot, the state changes to state 6 (Phone).

State 2: Tracking

In this state, the robot tries to face the target object (the rubber duck). If the object is too small or is no longer detected, the state goes back to state 1 (Scan). Otherwise, the robot keeps turning until the object is centered in the Pixy's view. Once centered, it goes to state 3 (Offset).
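One simple way to implement this tracking step is a proportional correction on the base servo, sketched below with illustrative gain and tolerance values; SetServoPosition() is an assumed helper:

```cpp
#include <cstdint>
#include <cstdlib>

// Assumed helper: a "write data" command to a single servo.
void SetServoPosition(uint8_t id, int position);

const int kFrameCenterX = 160; // the Pixy frame is ~320 pixels wide
const int kDeadbandPx  = 10;   // close enough to count as centered

// One tracking step: nudge the base servo (ID 0) so the duck's x
// coordinate drifts toward the middle of the image. Returns true once
// centered. The gain and the sign of the correction depend on how the
// camera is mounted; the values here are illustrative.
bool StepTrackOnce(int duckX, int& basePosition) {
    int error = duckX - kFrameCenterX;
    if (std::abs(error) <= kDeadbandPx) return true; // Centered = 1
    basePosition -= error / 4;         // proportional correction
    SetServoPosition(0, basePosition);
    return false;
}
```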

State 3: Offset Camera

In this state, the robot tries to point the IR distance sensor directly at the rubber duck. This is done by keeping the duck centered in the Pixy camera's view in the x direction while holding it below the center in the y direction.

pixy camera
The approximate location of the duck in the Pixy camera's view when the IR distance sensor is pointed directly at it.

When the duck is in front of the IR distance sensor, the robot uses the distance measured by the sensor, the angle of the sensor relative to the table, and the location of the sensor to calculate the coordinates of the duck. It saves the duck's coordinates and then goes to state 4 (Grab). If, at any point, the Pixy camera loses sight of the duck, the robot goes back to state 1 (Scan).
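The coordinate calculation itself is basic trigonometry. Here is a sketch with illustrative names, where the sensor's own position and tilt angle would be derived from the arm's current servo angles:

```cpp
#include <cmath>

// x = horizontal distance from the base, z = height above the table.
struct Point2D { float x, z; };

// d: distance along the sensor's line of sight (from the IR reading),
// thetaRad: the sensor's tilt below horizontal,
// sensor: the sensor's own position on the arm.
Point2D DuckPosition(float d, float thetaRad, Point2D sensor) {
    return { sensor.x + d * std::cos(thetaRad),   // forward component
             sensor.z - d * std::sin(thetaRad) }; // downward component
}
```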

State 4: Grab

In this state, the robot tries to grab the object at the location found earlier. To shape the robotic arm, we only have control of the servo positions, so we use an inverse kinematics algorithm to calculate the position each servo needs to be in to put the gripper at the target location. The algorithm works by changing each servo angle slightly until the computed gripper position reaches the target, which may take many iterations. Once the servo positions are calculated, the arm moves to the location in one motion. When the arm is in position, the gripper closes, grabbing the target object. The state then transitions to state 5 (Place).
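A minimal sketch of that iterative idea is a coordinate-descent loop: nudge one joint at a time, keep the nudge if the predicted gripper position gets closer to the target, and stop once within tolerance. ForwardKinematics() is an assumed helper, and the step size and limits are illustrative:

```cpp
#include <cmath>

struct Point2D { float x, z; }; // as in the previous sketch

// Assumed helper: maps the three arm-joint angles (radians) to the
// gripper position using the arm's segment lengths.
Point2D ForwardKinematics(const float joints[3]);

float DistToTarget(const float joints[3], Point2D target) {
    Point2D tip = ForwardKinematics(joints);
    return std::hypot(tip.x - target.x, tip.z - target.z);
}

// Coordinate descent: nudge one joint at a time and keep any nudge
// that moves the predicted gripper position closer to the target.
bool SolveIk(float joints[3], Point2D target,
             float step = 0.01f, float tolCm = 0.5f, int maxIters = 2000) {
    for (int it = 0; it < maxIters; ++it) {
        float best = DistToTarget(joints, target);
        if (best < tolCm) return true;          // close enough: done
        for (int j = 0; j < 3; ++j) {
            for (float delta : {step, -step}) { // try both directions
                joints[j] += delta;
                float d = DistToTarget(joints, target);
                if (d < best) { best = d; break; } // keep the improvement
                joints[j] -= delta;                // revert
            }
        }
    }
    return false; // no solution within the iteration budget
}
```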

State 5: Place

After grabbing the object, the robot will place it to the side and return to facing forward. The state will then change to state 1 (Scan) where it will begin scanning again.

State 6: Phone (Override)

If a client (the phone app) takes over control of the robot, this state occurs. During this state, the robot receives packets containing the orientation of the phone and the status of the joystick in the phone's app. With this information, the robot reacts by moving the gripper end forwards, backwards, up, or down, or by rotating the arm. Two control schemes dictate what the controls correspond to. For example, control scheme zero lets the user lift the gripper end by tilting the phone upwards and turn the robot by moving the joystick along the x-axis. The robot stays in this state until the phone disconnects from the microprocessor's server.
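As an illustration, handling such a packet might look like the sketch below. The packet layout and helper functions are assumptions made for the example; the real app and server define their own format:

```cpp
#include <cstdint>

// Assumed motion helpers on the robot side.
void MoveGripperVertical(float amount);
void MoveGripperForward(float amount);
void RotateBase(float amount);
void CloseGripper();

// A hypothetical packet layout; the real app defines its own format.
struct OverridePacket {
    uint8_t scheme;         // 0 = motion (Wiimote-style), 1 = joystick
    float pitch, roll, yaw; // phone orientation deltas, radians
    float joyX, joyY;       // on-screen joystick, -1..1
    uint8_t grab;           // 1 = close the gripper
};

void ApplyOverride(const OverridePacket& p) {
    if (p.scheme == 0) {
        // Scheme 0: tilting the phone up lifts the gripper; the
        // joystick's x-axis turns the robot (as described above).
        MoveGripperVertical(p.pitch);
        RotateBase(p.joyX);
    } else {
        // Scheme 1: claw-game style, the joystick drives the gripper.
        MoveGripperForward(p.joyY);
        RotateBase(p.joyX);
    }
    if (p.grab) CloseGripper();
}
```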

phonedisplay
The phone app interface. It displays the phone's orientation and an on-screen joystick that can be moved around, along with a grab button that opens and closes the gripper end.

Conclusion

Thanks for reading this article.
This was an exciting robotics project with more than a few obstacles along the way. Due to its color recognition camera, the Duck Shepherd has been able to successfully grab ducks, but it has also tried to grab me, tried to tear out its own wires, tried to tip water on itself, and pushed its own processor to the floor. I'm mildly disappointed that I wasn't able to record some of the robot arm's more entertaining/suicidal tendencies.

To build upon the project, we could replace the color recognition camera with one that performs true object recognition. We could replace the robotic gripper end with a more versatile arm that is able to pick up a larger variety of objects. We could also make the whole robot mobile by adding a battery and placing the arm on a wheeled platform.

In the appendix, you can see how I wired the sensors to the NetBurner MOD5441X module. You can also find all the source code here.

Appendix: Wiring and Schematics

Initially, jumper cables and a breadboard were used to connect everything on the robot. This was useful for development; however, it looked like a rat's nest after everything was wired together. To clean this up, many wires and connectors were soldered directly to the development board's prototyping section, and all the connections to the peripherals were bundled into rainbow cables to reduce stray wires.

MCU prototype
The microprocessor during the prototyping phase.
The microprocessor after the wire harnesses were installed.

Below is the schematic for the connections. The Pixy camera connects with SPI. The IR distance sensor connects to the analog-to-digital converter. The serial COM port to the PC connects using UART. I needed the serial port connector on another board because the two connectors on my development board were in use. The robot servos connect through half-duplex UART. I connected this to the RS485 headers on the development board because they offered half-duplex mode and would be able to withstand the higher voltages being passed by the 12V servos.

The numbers next to the MOD5441X correspond to the pin numbers in the chip’s datasheet.

schematic
The wiring schematic for the robotic arm.

For a better understanding of the schematic, refer to the microprocessor's datasheet. You can find the datasheet here.
