I’ve been trying to attend the monthly meetings of the HomeBrew Robotics Club since this summer. They are quite fun if you’re interested in hobby robotics and can make it to the San Francisco Bay Area :-) During one of my first meetings I learned about the tablebot challenge. That sounded like an interesting task to work on. I had played with partially Lego Mindstorms NXT based robots before, and I had seen the BrickPi board from Dexter Industries some time ago. That combination sounded like a good idea and became the base of Dexter: two drive motors and a roller in the back in a simple skid-steer configuration, plus one motor for the gripper. A Raspberry Pi B+ became the brain of the robot.
Adding a CMUcam5 “Pixy” to see the coloured ball, plus some colour codes marking a goal box, also sounded doable. But this turned out to be more problematic than I anticipated. The colour recognition of the Pixy camera is very light-dependent. After trying differently coloured balls and other objects, I eventually settled on a purple cardboard cube (it’s built around a Styrofoam core for longevity) :-) But even so, I have to re-teach the colours when switching between daylight and fluorescent light. The Pixy talks to the Raspberry Pi via USB, but I did not get PixyMon (the graphical display/configuration program for the Pixy) to work nicely on the Pi :-( So every re-teach requires re-wiring to the laptop :-(
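For reference, this is roughly how a program on the Pi can poll the Pixy over USB with libpixyusb. The signature number is an assumption; whatever slot you teach the cube colour into is what you would check for:

```cpp
// Minimal Pixy-over-USB poll loop using libpixyusb (link with -lpixyusb).
#include <cstdio>
#include <cstdint>
#include <unistd.h>
#include "pixy.h"

static const uint16_t CUBE_SIG = 1;   // assumed: signature 1 taught as "cube purple"

int main()
{
    if (pixy_init() != 0) {
        fprintf(stderr, "Pixy not found on USB\n");
        return 1;
    }
    struct Block blocks[10];
    for (;;) {
        if (!pixy_blocks_are_new()) {         // no new frame from the camera yet
            usleep(5000);
            continue;
        }
        int n = pixy_get_blocks(10, blocks);  // largest blocks come first
        for (int b = 0; b < n; ++b)
            if (blocks[b].signature == CUBE_SIG)
                printf("cube at x=%u y=%u size %ux%u\n",
                       blocks[b].x, blocks[b].y,
                       blocks[b].width, blocks[b].height);
    }
    pixy_close();                             // not reached in this sketch
    return 0;
}
```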
To detect the edges of the table, I decided on Pololu IR reflectance sensors. I used eight sensors, two in each corner, to make sure the robot does not drive off the table sideways. To get good results from those sensors, you need a good and predictable timebase, so the Pi did not sound like a good fit, running a complete Linux and doing other stuff as well. That’s why I used an Arduino Micro (any Arduino or single-purpose MCU would work as well) :-) to do the time-critical work for those sensors, with a very simple protocol over USB to get the information to the Pi. The Arduino takes the raw sensor readings, decides whether each sensor ‘saw’ the table or nothingness, and sends ‘one bit’ per sensor upstream. Before each run, the Arduino does a calibration so that it can account for any light differences. The sketch is here: TableBot.tar
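The linked sketch is the real thing; as an illustration, the firmware’s job looks roughly like this (the pin choice, analog-output sensors, and the calibration scheme are my assumptions, not necessarily what TableBot.tar does):

```cpp
// Eight analog IR reflectance sensors on A0..A7, one threshold per
// sensor from a startup calibration, one status byte per loop over
// the USB serial port.

const int NUM_SENSORS = 8;
const int pins[NUM_SENSORS] = {A0, A1, A2, A3, A4, A5, A6, A7};
int thresholds[NUM_SENSORS];

void setup() {
  Serial.begin(115200);
  // Calibration: record min/max for ~2 s while the robot sits on the
  // table, then put the threshold halfway in between.
  int lo[NUM_SENSORS], hi[NUM_SENSORS];
  for (int i = 0; i < NUM_SENSORS; i++) { lo[i] = 1023; hi[i] = 0; }
  unsigned long t0 = millis();
  while (millis() - t0 < 2000) {
    for (int i = 0; i < NUM_SENSORS; i++) {
      int v = analogRead(pins[i]);
      if (v < lo[i]) lo[i] = v;
      if (v > hi[i]) hi[i] = v;
    }
  }
  for (int i = 0; i < NUM_SENSORS; i++)
    thresholds[i] = (lo[i] + hi[i]) / 2;
}

void loop() {
  byte bits = 0;                              // one bit per sensor, 1 = table seen
  for (int i = 0; i < NUM_SENSORS; i++)
    if (analogRead(pins[i]) < thresholds[i])  // strong reflection = low reading
      bits |= (1 << i);
  Serial.write(bits);
  delay(10);                                  // ~100 updates per second
}
```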
Somewhere along the way, I decided that I needed some way to monitor what the robot was doing, especially with the camera not exactly doing what I expected. So I added a little Adafruit PiTFT display. That works on the same connector as the BrickPi (with some ribbon cable) :-) since the two use different pins: the BrickPi uses the serial connection, while the PiTFT uses I2C, SPI, and some GPIO pins. The display works as a 320×240 X11 display, so it was pretty easy to add some Xlib calls to draw a rectangle where the camera saw the cube or the colour codes. That helped immensely in figuring out that the communication between the Pi and the Pixy is somewhat slow and that the camera is lighting-dependent.
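A minimal sketch of that overlay idea, assuming a plain Xlib program on the PiTFT’s X server, with one hard-coded rectangle standing in for a reported Pixy block (compile with -lX11):

```cpp
#include <X11/Xlib.h>
#include <unistd.h>

int main()
{
    Display *dpy = XOpenDisplay(NULL);        // the PiTFT's X display
    int scr = DefaultScreen(dpy);
    Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr),
                                     0, 0, 320, 240, 0,
                                     BlackPixel(dpy, scr),
                                     BlackPixel(dpy, scr));
    XSelectInput(dpy, win, ExposureMask);
    XMapWindow(dpy, win);

    XEvent ev;                                // wait until the window is mapped
    do { XNextEvent(dpy, &ev); } while (ev.type != Expose);

    GC gc = XCreateGC(dpy, win, 0, NULL);
    XSetForeground(dpy, gc, WhitePixel(dpy, scr));

    // Hypothetical block: centre (160, 100), 40x40 pixels.
    int x = 160, y = 100, w = 40, h = 40;
    XDrawRectangle(dpy, win, gc, x - w / 2, y - h / 2, w, h);
    XFlush(dpy);

    sleep(5);
    XCloseDisplay(dpy);
    return 0;
}
```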
I quickly realized that the BrickPi software (the firmware on the ATmegas and the driver parts on the Raspberry Pi) did not really do what I expected, especially since I had seen what the ATmega (the Arduino’s base) should be able to do. The original did very little in firmware, and the Pi was supposed to watch closely. So I decided to push more responsibility to the two ATmegas which form the base of the BrickPi. My firmware is probably far from perfect, but it’s available here on github: BrickPi. It’s a fork of the original, and I have only looked at the motor controls so far. No guarantees :-) I updated the C driver header file and the examples. That repository lives on github as well: BrickPi_C.
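For orientation, a drive loop against the stock BrickPi_C driver looks roughly like this (the port assignments are assumptions; my fork keeps the same calling style but lets the ATmegas do more of the work between updates):

```cpp
// Rough shape of a BrickPi_C drive loop; compile against the
// BrickPi_C sources (BrickPi.h, tick.h) with wiringPi available.
#include <unistd.h>
#include "tick.h"
#include <wiringPi.h>
#include "BrickPi.h"

int main(void)
{
    ClearTick();
    if (BrickPiSetup()) return 1;

    BrickPi.Address[0] = 1;              // the two ATmegas on the board
    BrickPi.Address[1] = 2;
    BrickPi.MotorEnable[PORT_A] = 1;     // assumed: left drive motor
    BrickPi.MotorEnable[PORT_B] = 1;     // assumed: right drive motor
    if (BrickPiSetupSensors()) return 1;

    while (1) {
        BrickPi.MotorSpeed[PORT_A] = 200;   // -255..255
        BrickPi.MotorSpeed[PORT_B] = 200;
        if (BrickPiUpdateValues())          // exchanges state with the ATmegas
            break;                          // communication error
        usleep(10000);                      // keep the link alive ~100x/s
    }
    return 0;
}
```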
The videos start during the sensor calibration, so the robot takes a moment to start moving :-)
It’s again a fork of the original, but it also has all the sources for my Dexter :-) Project_Examples/MW/w1.cpp is basically the program for the first step of the challenge: driving from one side of the table to the other. My tablebot cannot really drive straight, since the motors don’t match very well :-( So that program lets it bounce around the table more like a billiard ball :-)
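A hypothetical sketch of that billiard-ball behaviour, with readEdgeBits() and setSpeeds() standing in for the Arduino protocol and the motor calls shown earlier (the corner-to-bit mapping is assumed):

```cpp
#include <cstdint>

uint8_t readEdgeBits();                  // one bit per sensor, 1 = table seen
void setSpeeds(int left, int right);     // -255..255 each
void waitMs(int ms);

const uint8_t FRONT_LEFT  = 0x03;        // assumed: bits 0-1, front-left corner
const uint8_t FRONT_RIGHT = 0x0C;        // assumed: bits 2-3, front-right corner

void bounceForever()
{
    for (;;) {
        setSpeeds(200, 200);             // full ahead
        uint8_t bits = readEdgeBits();
        if ((bits & FRONT_LEFT) != FRONT_LEFT ||
            (bits & FRONT_RIGHT) != FRONT_RIGHT) {
            setSpeeds(-150, -150);       // edge! back away from it
            waitMs(700);
            if ((bits & FRONT_LEFT) != FRONT_LEFT)
                setSpeeds(180, -180);    // left corner went over: spin right
            else
                setSpeeds(-180, 180);    // right corner went over: spin left
            waitMs(500);                 // fixed turn, so the path stays
        }                                // billiard-like rather than straight
    }
}
```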
Project_Examples/MW/cam.cpp is for the second step. The robot turns around until it spots the purple cube, drives to it, grabs it, and then drives to the edge of the table and drops the cube. After that it returns somewhere closer to the center of the table and looks for the next cube :-)
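That cycle is essentially a small state machine; this sketch mirrors the description above, with hypothetical helpers wrapping the Pixy and motor code:

```cpp
bool seeCube(); bool atCube(); bool atEdge();
void spinInPlace(); void steerTowardCube(); void closeGripper();
void driveForward(); void openGripper(); void backUpToCenter();

enum State { SEARCH, APPROACH, GRAB, TO_EDGE, DROP, RECENTER };

// Called repeatedly from the main loop; each call advances one step.
void step(State &s)
{
    switch (s) {
    case SEARCH:   spinInPlace();     if (seeCube()) s = APPROACH; break;
    case APPROACH: steerTowardCube(); if (atCube())  s = GRAB;     break;
    case GRAB:     closeGripper();                   s = TO_EDGE;  break;
    case TO_EDGE:  driveForward();    if (atEdge())  s = DROP;     break;
    case DROP:     openGripper();                    s = RECENTER; break;
    case RECENTER: backUpToCenter();                 s = SEARCH;   break;
    }
}
```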
Project_Examples/MW/goal.cpp is for the third step. The robot looks for the cube as before, but once it has grabbed it, it starts looking for a goal colour code, drives there, and drops the cube. The second clip shows the display in action for a few moments: you can see the red rectangle representing the purple cube and later a yellow rectangle representing the colour code.
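The only new ingredient there is telling the plain cube signature apart from a colour code. The Pixy reports a colour code’s signature as the octal combination of its component signatures, so anything above 7 is a colour code; the concrete values here are assumptions:

```cpp
#include <cstdint>

const uint16_t CUBE_SIG = 1;     // assumed: single taught colour
const uint16_t GOAL_SIG = 012;   // assumed: colour code of signatures 1 and 2 (octal)

bool isCube(uint16_t signature) { return signature == CUBE_SIG; }
bool isGoal(uint16_t signature) { return signature == GOAL_SIG; }
```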
If you noticed the strange ‘antenna wire’ in the previous two clips: that was an attempt to use four white LEDs as a kind of headlight to aid the camera. But that did not work so well. The camera saw those LEDs as rather blue, started to see the grey grippers as the ‘cube purple’, and got confused :-( But I still have a little clip with the headlights on :-)
This post grew longer than I expected, but I might add more info in the next couple of days, in case I forgot something important, before Dexter is taken apart again :-)
One more thing: Dexter runs on an RC LiPo battery, 7.4 V, 2200 mAh. I found that those batteries work pretty well with Lego NXT pieces as well as Power Functions motors, much better than six AAs or even NiMHs. The other advantage is that this little battery has a balancing (equalizing) connector, which can also be used as a kind of slow-charging connector. So I can charge the battery in place :-)
As always, have fun exploring
— Marco