The Pixy2 CMUcam can be trained to identify objects by their colors and to send their locations and sizes to the ActivityBot 360°. This step-by-step tutorial shows how to make the ActivityBot 360° use the data it receives from the Pixy2 to find and track objects of certain colors in BlocklyProp. Code examples in C and Spin are also available on the Pixy2 product page [1] under "Downloads".
IMPORTANT: Follow every step carefully; don’t skip any. Each one is necessary for this robot color tracking application to work. Then you’ll see a totally new personality in your robot and be very proud of the results!
You will need:
You should have completed:
It’s OK to use either power setup (A) or (B) shown below, but do not connect both at the same time. For more info, see Make the Pixy2 Connections.
In Power Supply Setup (A), the Pixy2 is powered through USB during configuration. Ground, Sin, and Sout are connected to the Propeller Activity Board. The Pixy2's power cable is not connected to the Propeller Activity Board WX 5V supply.
In Power Supply Setup (B), the USB cable is removed. The Pixy2's Ground (Vss), Sin, and Sout are connected to the Propeller Activity Board, and the Pixy2's power cable is connected to the Propeller Activity Board WX 5V supply, so the Pixy2 is powered through the board’s 5V connection. The Propeller Activity Board WX must be in switch position 2 for this arrangement (position 2 provides power to the servo header 5V ports). You may also use switch position 1 if you connect the Pixy2 to the 5V header above the breadboard.
The PixyMon software is used to teach the Pixy2 to recognize up to seven color signatures. PixyMon also provides a view from the Pixy2’s camera. Settings for communication baud rate, color correction and other modes of operation need to be configured. The Pixy2 can also be taught color signatures without PixyMon using the white button, but we’ve found this method to be difficult to verify and less reliable in general.
Visit the Pixy2 Downloads page [6] and install PixyMon for your operating system. FAQs, troubleshooting guides, and much more are also available on the Pixy2 Wiki.
The following steps are also shown in this video.
You’ll see your colored objects through the PixyMon viewer:
Restore default parameters (under the File menu). As you become familiar with the PixyMon software, you will find yourself tuning colors for your environment’s lighting and background, sometimes moving your robot from place to place, especially to demonstrate to others. It is often fastest to restore the default parameters and re-teach the color signatures in your new environment. When you do this, you’ll also have to configure the Data out port and UART baudrate settings again, as shown above; these settings are necessary for our example code.
PixyMon should now show the color signature number of each unique block color with an outline and a label:
Getting the perfect color signature matches isn’t easy but you can modify the environment to improve your ability to get quality color signature matches. Try these things:
The Toggle lamp feature is perhaps the single most important option for setting color signatures. If you set your color signatures with the LED lamp ON, then you will also need to turn it ON in the Pixy2TerminalDisplay.svg BlocklyProp example code. The default code example has the LED in the OFF state.
Through experimentation we’ve found the LED lamp should be set ON when there’s no natural light or the room is dark. The LED lamp can usually be OFF when natural lighting exists. Experiment!
Additional controls including white balance and color ranges are also accessible through PixyMon’s configure menu, documented on the Teach Pixy an Object wiki [7]. We’ve found that while the many configurations allow for more precise results they are also not necessary for basic success and functionality. Also see the Tips on Improving Detection Accuracy wiki [8].
You’ve taught the Pixy2 which color signatures to recognize, and assigned each color to a number. In this tutorial, we will load the Pixy2TerminalDisplay.svg BlocklyProp code into the Propeller Activity Board and see each object's position and size using the Terminal. The BlocklyProp program will receive and parse that data from the Pixy2, and then decide what to do with it.
This program has many functions to manage the serial data parsing from the Pixy2. These functions are shown in “collapsed” view (right-click a function if you want to expand it). You will not need to use most of these function blocks; it is convenient to keep their view collapsed.
After loading code into the Propeller Activity Board, BlocklyProp Solo will open its Terminal and display information about colored objects that you have trained the Pixy2 to detect.
Congratulations! Your BlocklyProp program is now ready to be modified and used on the ActivityBot 360 to track objects!
If you find BlocklyProp’s Terminal too slow to refresh, you can use the Parallax Serial Terminal (PST) instead; note that PST runs only on Windows. PST’s display can be paused or cleared, and it supports multiple font sizes. To use PST:
A Tiny Tutorial video on using PST instead of BlocklyProp Terminal [11] is also available.
The Pixy2TerminalDisplay.svg program translated the detected color signatures into numbers and displayed them on the Terminal. In the prior tutorial we configured the Pixy2 to see these blocks:
We loaded a program into the Propeller Activity Board WX and observed the following in the terminal window:
How do the two relate? As an X,Y grid with the 0,0 origin in the upper left:
The Pixy2TerminalDisplay.svg program translated the color signatures into numbers and displayed them in the Terminal. This example BlocklyProp code has many functions, but one of them, the show_signatures function, steps through all detected color signatures and displays their signature numbers, sizes, and positions.
The key to using the Pixy2TerminalDisplay.svg code example is to be able to use the variables in the show_signatures function.
Vision is complex! Amazon, Google, Uber, Softbank, and GM are spending billions of dollars to use vision and sensors effectively in autonomous vehicles. The Pixy2 simplifies what your robot sees, yet there’s still a significant amount of data to evaluate when color signatures are detected. Attributes such as the quantity of each signature, the signature number, size, and location can quickly complicate simple robot code! Before we use the show_signatures function, let’s look at what’s inside:
The repeat blockIdx loop steps through the color signatures detected, which you taught the Pixy2 with the PixyMon software. The variables within this function describe the properties of each colored block that was detected: blockIdx, blocksDetected, bSignature, bXpos, bYpos, bWidth, and bHeight. Also note maxBlocks = 3; it’s the third block in the function. You will have to adjust it if you want the program to report more than three blocks.
For each of the color signatures that were detected, the loop stores the signature number, x/y location, width, and height in their respective variables. These variables are overwritten with each loop iteration. For example, if you taught the Pixy2 to recognize three signatures and are only interested in the data from signature 1, you’ll need to add code that uses the data before it gets overwritten by the next loop iteration. To do that, you could add an if block that checks whether bSignature = 1 and contains a routine that takes some action in response to the bXpos and bYpos values.
Complex projects are always built in small steps. With microcontrollers, challenges are best divided into small pieces with the outcome in mind. This means you’ll be programming and testing in small steps, saving your programs along the way in case you need to roll back to a prior version. Let’s start by simply pointing the ActivityBot 360° toward a single color signature, using its X or Y position to center the colored object in the middle of the display.
These are the steps:
Here is an example with a single red block and LED lamp ON.
With a single colored block which has an associated signature assigned in PixyMon, we can program the ActivityBot 360° to point towards the block. Suppose the red block is positioned either left, center or right in the Pixy2’s view. Three simple cases could mean three different programming actions!
The variable bXpos in the show_signatures function of Pixy2TerminalDisplay.svg shows where the detected color signature is from the Pixy2’s point of view. The values of bXpos range from 1 (far left) to 316 (far right).
One easy coding approach would be an IF-ELSE block that evaluates bXpos and either turns left or right towards the block at full speed.
But this is a clunky control method. A simple full-speed, on-or-off control loop quickly “startles” the ActivityBot 360° toward the red block and is likely to overshoot the goal of centering it.
Proportional control is a much more suitable approach. With proportional control, the desired drive speed depends on the distance of the red block from the middle of the Pixy2’s view. If the red block is far left, the robot would set the servo speeds to fast, to move toward the red block aggressively. But if the block is only a few pixels left of center, the robot could respond with a slow-speed driving command to the left. Once the block is centered in the camera's view, the robot settles down, making only small speed adjustments to hold its position.
We can use the bXpos value to determine how to affect the robot’s drive speed. If the red block is far left or right in the camera view, the ActivityBot 360° will rotate quickly to center the block. If the red block is just left or right of the center, the ActivityBot 360° could respond more slowly, avoiding an overshoot.
Let's review the numbers we’re working with before we look at the code modifications:
We can use the value of bXpos to calculate our servo motor drive speeds. If the red block is on the far left (Example #1 above) and we subtract 158 from the bXpos variable, bXpos is now -157. This is too large a number to use as a servo speed, so we divide it by 3, giving -52, a number we can use to drive a servo. We’ll use -52 on the left servo motor (to turn it backwards) and its positive counterpart, 52 (multiply by -1), on the right servo motor. This will cause the robot to turn left, toward the object.
If the red block is on the far right (bXpos = 316) and we subtract 158, the result is 158. Again, divide by three and it’s 52. Drive the left motor with 52 and the right motor with -52 to turn right, toward the object.
Download Pixy2_X-Axis_Following.svg [12]
The example code Pixy2_X-Axis_Following.svg is a modified version of the Pixy2TerminalDisplay.svg code, both produced on BlocklyProp Solo. The following changes were made to Pixy2TerminalDisplay.svg:
Using the same approach, you can modify the BlocklyProp example code to drive toward or away from the colored object, centering it in the middle of the Pixy2's Y-axis display.
The ActivityBot 360° can also be programmed to track a color signature on both the X and Y axis. This sample code allows the ActivityBot to center the object in the middle of the Pixy2 camera by driving left, right, forward and backward as shown in the video.
Download Pixy2_X-Y_Axis_Roaming.svg [13]
The Pixy2_X-Y_Axis_Roaming.svg project is a modified version of the original Pixy2TerminalDisplay.svg. The following changes were made:
Links
[1] http://www.parallax.com/product/30028
[2] https://www.parallax.com/product/32600
[3] https://www.parallax.com/product/30028
[4] http://learn.parallax.com/tutorials/robot/activitybot/add-pixy2-cmucam-activitybot-360%C2%B0
[5] https://learn.parallax.com/tutorials/robot/activitybot/blocklyprop-robotics-activitybot-360%C2%B0
[6] https://pixycam.com/downloads-pixy2/
[7] https://docs.pixycam.com/wiki/doku.php?id=wiki:v2:teach_pixy_an_object_2
[8] https://docs.pixycam.com/wiki/doku.php?id=wiki:v2:some_tips_on_generating_color_signatures_2
[9] https://learn.parallax.com/sites/default/files/content/AB-Blockly/Projects/Pixy2/Pixy2TerminalDisplay.svg
[10] https://solo.parallax.com
[11] https://www.youtube.com/watch?v=AM0FpjLF4dQ
[12] https://learn.parallax.com/sites/default/files/content/AB-Blockly/Projects/Pixy2/Pixy2_X-Axis_Following.svg
[13] https://learn.parallax.com/sites/default/files/content/AB-Blockly/Projects/Pixy2/Pixy2_X-Y-Axis_Roaming.svg