Marco's Corner

Just another Software Developer's Musings


I have known about the different FIRST competitions for quite some time.  I even asked our kids about them a couple of times while they were still in school. But I guess I was not enthusiastic or aggressive enough to get a team started:-(
Since joining the Home Brew Robotics Club mailing list, I have seen the “Call for Volunteers” emails fly by every year. But it never really fit into my schedule until the Modesto FIRST Lego League event.

So I went to the FIRST volunteer site, signed up, did the background check and picked some jobs I thought I could do without any experience of how such an event works.  I got accepted, but there was still no indication of what I would be doing. So on Sunday morning, I packed my laptop and camera and went to Fred Beyer High School in Modesto.

During the opening ceremony, the announcer asked for volunteers to referee the actual games; apparently some people could not make it. Since I did not have an assigned role yet and had some experience with “hobby robots” in competitions, I raised my hand and got the job;-) That was the beginning of a very fun but also stressful day;-)

If there ever was `learning by fire’, the warm-up rounds were it for me;-) I had not looked at the ANIMAL ALLIES rules beforehand, so I was watching the teams while also trying to understand all the pieces of the game setup. The first break helped, so I could finally read the rules;-) Overall, I think that with the help of all the others I did a reasonable job. But I would definitely not recommend that as normal `preparation’;-)

Thanks to the La Loma Green Machine Robotics Team and Mr. Ollar for allowing me to use some of their photos here. While I had my camera, I did not get around to taking many pictures during the day; it was simply too busy. Their photo album is here.

Overall, this turned out to be a very fun-filled day for all involved. But I was very tired in the evening.
If you’re thinking about it, I would suggest trying to attend one of these events. It’s always fun;-)

— Marco

Oracle Mini Maker Faire 11/2016


I did not know how widely this would be publicized, but since I made it into the official Oracle blog, I guess it’s OK;-) Yes, Oracle had a company-based Mini Maker Faire;-) All the people I saw had a blast. But unfortunately, I did not have time to take any photos. I was busy with all the visitors at my little table, explaining my different robotics experiments;-)

Here is the link to the official Oracle post about this;-)

Have fun,
— Marco

Our kids are off to college now, but I could not stay away from the Science Olympiad fun. So I plan to play the Event Manager for the “Division C – Electric Vehicle” event @ the San Joaquin County competition.  That will be my first time running an event, so it will be quite interesting.

The event does not need too much preparation, except for a timing setup that has to cover a pretty long distance. Schools might have the Vernier Photogates, which can be used with a laser pointer as a long-range solution. But I wanted to try to build my own `lower cost’ solution for people outside the Vernier ecosystem;-)

I built some little photogates earlier for testing our Maglev vehicle in 2013. They worked fine for that setup, but they would not work for the Electric Vehicle event because of their short range.

So I knew I needed laser diodes as light sources and a receiver side that would work with them. For the diodes, I found small modules on eBay, for instance five 5 mW diodes for Arduino for less than $8, but there are many choices out there. 5 mW is what normal laser pointers use, and it is more than enough for the distance we need. I chose a red beam to make the lines visible. With an IR diode, I might have been able to use my original detector, but it would have been very hard to aim:-(

For the receiver side, I chose the SFH 3310 photo transistor. It’s relatively cheap at $0.83 apiece and it seems to work pretty well. I added another transistor, some resistors and an LED from the parts bin to the receivers so that I can quickly see that the link is closed. The final circuit is shown in the schematic. The overall electronic parts cost for two gates is < $10.

I designed little enclosures for the receivers to hold the electronic pieces and also to help keep ambient light away from the photo transistor.  They were 3d-printed but needed a little bit of work to make them fit as I wanted. I also printed little adapters for the laser diodes to fix them to LEGO stands.

On the software side, it’s almost the same as the original gate setup. The receivers interface to an Arduino, which does the timing. The laser diodes have an Enable pin, but so far I have simply left them on. The updated sketch is here: lasergate. It also includes some local LEDs on the Arduino to show how it sees the receivers, and a little display so it can run without a big laptop/display. Neither is really needed, but both are nice to have.
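
To illustrate the basic idea, here is a minimal sketch of the timing logic. This is not the actual lasergate code; the pin numbers and the beam polarity (receiver reads LOW when the beam is broken) are assumptions for the example.

```
// Minimal laser gate timer (illustration only, not the real lasergate sketch).
// Assumption: each receiver output is HIGH while its laser hits the photo
// transistor and goes LOW when the beam is broken.
const int START_GATE = 2;    // assumed input pin for the first gate
const int FINISH_GATE = 3;   // assumed input pin for the second gate

unsigned long startTime = 0;
bool running = false;

void setup() {
  pinMode(START_GATE, INPUT);
  pinMode(FINISH_GATE, INPUT);
  Serial.begin(115200);
  Serial.println("Laser gate timer ready");
}

void loop() {
  if (!running && digitalRead(START_GATE) == LOW) {   // first beam broken
    startTime = micros();
    running = true;
  }
  if (running && digitalRead(FINISH_GATE) == LOW) {   // second beam broken
    unsigned long elapsed = micros() - startTime;
    Serial.print("Time [s]: ");
    Serial.println(elapsed / 1000000.0, 3);
    running = false;
    delay(1000);                                      // crude re-arm delay
  }
}
```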

 

I also rebuilt my adapted dclock widget dclock-2-2-2 and it still seems to work on Ubuntu 14.04;-) I was running it on my laptop, but it should be easily buildable on a Raspberry Pi or similar;-)

Overall, I’m very happy with my setup. I tested the beams to a range of more than 4 meters (the length of the wires I was using) and they worked just fine;-) So the setup should work well for the event;-)

As always, have fun exploring;-)

Update 11/09/2016: It turns out my original sketch was sometimes slow to react. So I updated it to limit the LCD updates to about ten per second. That helped a lot. I now reliably pick up a 6 mm (1/4″) obstruction at more than 6 m/s (13 mph). That’s fast enough for me;-) If it’s still a problem, I would remove the running updates for the LCD completely and show only the final time.
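
The throttling follows the common millis()-based pattern; here is a sketch of that pattern with my own placeholder names, not the code from lasergate_20161109:

```
// Rate-limiting the slow display work so the loop can keep polling the gates.
// Function names and the example pin are placeholders, not from the actual sketch.
const unsigned long DISPLAY_INTERVAL_MS = 100;   // about ten redraws per second
unsigned long lastDisplayUpdate = 0;

void setup() {
  Serial.begin(115200);
  pinMode(2, INPUT);                 // example gate input
}

void pollGates() {
  // time-critical part: runs on every pass through loop()
  digitalRead(2);                    // the real sketch records micros() timestamps here
}

void updateDisplay() {
  // stand-in for the slow LCD update; Serial is used as a placeholder
  Serial.println(millis());
}

void loop() {
  pollGates();
  unsigned long now = millis();
  if (now - lastDisplayUpdate >= DISPLAY_INTERVAL_MS) {
    lastDisplayUpdate = now;
    updateDisplay();
  }
}
```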

The new sketch is here: lasergate_20161109

3D Printing with NinjaFlex


I found these nice little lighted costume horns in the Adafruit tutorials. So I had to try to build my own pair;-) They were 3d-printed out of a nicely flexible filament called NinjaFlex. Playing with it opens up a whole new area of potential uses. But first, the printer has to be able to handle it. My little HobbyKing Turnigy Fabricator printer has an open space between the filament drive and the actual hot end. I learned very quickly that the NinjaFlex needs extra guidance there. (Normal ABS filament works just fine.)

My first attempt to overcome that problem was a little piece of Teflon tube. That kind of worked, but eventually the filament would push the tube to one side and then out of the drive. So I was on the right track but needed to stabilize the tube more. Since this sits right on top of the hot end, I did not like the idea of ABS, so I opted for a bit of wood;-) Cut to the right shape, the little wood piece works perfectly. It is just lying there, but that’s enough;-)

After the mechanics were working, the next step was (and still is) to find the right parameters for Slic3r to create successful prints. The first attempt with a layer height of 0.1 mm worked OK for a little test cube, but not so well for those horns:-( I was also printing way too fast.

My latest parameters are listed below (a rough Slic3r config fragment follows the list):

  • Retract:
    • 0.1 mm (probably to be increased to 0.25 mm or more, there is still a lot of oozing)
    • lift 3 mm; without that, the travel moves sometimes bend the piece over
    • 40 mm/s speed
  • Temperature:
    • 230 C
    • Bed 40 C; it should even work with an unheated bed
  • Print:
    • 20 mm/s speed seems to work OK
    • 0.25 mm layer height
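
For reference, this is roughly how those values map into a Slic3r config file. The key names follow Slic3r’s configuration format as I recall it, so check them against your Slic3r version; everything not listed stays at its default.

```
# Rough mapping of the NinjaFlex parameters above to Slic3r config keys
retract_length = 0.1
retract_lift = 3
retract_speed = 40
temperature = 230
bed_temperature = 40
perimeter_speed = 20
infill_speed = 20
layer_height = 0.25
```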

The last horns are still not perfect, but they look interesting enough for their costume use;-) So I’m happy for now. When I need to print more dimensionally stable pieces, like a seal or possibly some damping elements, I might pick up those parameters again.

The first trial run was at the little Halloween potluck @ RobotGarden in Livermore. The LED strips work, but since the ambient light is pretty bright, not much can be seen:-( Thanks, Andra, for sharing the image;-)

 

Have fun;-)

— Marco

 


OK, this took quite some time, but here is some info on our entry for the Robot Arm event in Science Olympiad Division C. This robot is probably overkill for the event, but maybe other teams can find some interesting tidbits for the next iterations of the event. After our first attempt with our daughter’s team in 2013, this was the second version with our son’s team. In the three years in between, we got somewhat deeper into the whole robotics area;-) I competed twice in the RoboGames and overall learned a bit more about the field.

So this year’s robot arm uses mostly Robotis Dynamixel AX-18 and AX-12 servos. They work much more consistently than normal RC servos, but they are also much more expensive. Most mechanical parts were laser-cut @ RobotGarden, a maker space in Livermore, CA. The `gripper glove’ was 3d-printed;-) Neither of those options was really available three years earlier. The FreeCAD drawings for all of them are included in the so_arm repository. The gripper itself is one pre-made part, the LynxMotion Little Grip Kit, with the only RC servo in the arm.

The brain stem is a little Robotis OpenCM 9.04 board with an adapted Arbotix firmware. The idea behind that little board was good, but it looks like there is not much progress with the open-source support:-( In 2013/14 there was some momentum behind it, but now it looks pretty dead. Why the Arbotix firmware? It makes the connection to the higher-level Robot Operating System (ROS) very easy, in this case via a serial-USB connection.

ROS is used in many different robots in research and education as well as in production environments. It brings many features for higher-level control and can be adapted to new robots as needed, so this was a good way to explore its features. In the process, we learned a lot with the help of Patrick Goebel’s “ROS by Example” books😉  They show a lot in a `simulated environment’ as well as with real robots, if you happen to have the right hardware. But we were able to adapt things to our arm as well.
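
As a small taste of the ROS side, here is a minimal roscpp node that just listens to the joint_states topic the arbotix driver publishes and prints the joint positions. This is not part of our so_arm packages, just an illustration; the topic and node names are common ROS defaults.

```
// Minimal ROS listener (illustration only, not from the so_arm packages):
// prints the arm's joint positions as published on /joint_states.
#include <ros/ros.h>
#include <sensor_msgs/JointState.h>

void jointCallback(const sensor_msgs::JointState::ConstPtr& msg) {
  for (size_t i = 0; i < msg->name.size() && i < msg->position.size(); ++i) {
    ROS_INFO("%s: %.3f rad", msg->name[i].c_str(), msg->position[i]);
  }
}

int main(int argc, char** argv) {
  ros::init(argc, argv, "joint_state_listener");
  ros::NodeHandle nh;
  ros::Subscriber sub = nh.subscribe("joint_states", 10, jointCallback);
  ros::spin();
  return 0;
}
```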

The first idea was to use a Raspberry Pi 2 as the ROS computer; I use such setups for some driving rovers. But we quickly learned that the path calculations were too much for the Pi. So in the end, ROS was running on a laptop with Ubuntu 14.04, which worked very well. If you watch the video above carefully, you can see the model of the arm in RViz on the screen before the laptop turns the screen dark (it was on batteries).

The Arbotix ROS repo is cloned from the original with my changes and the CM firmware.

The SO_Arm repo contains all the other ROS source packages (some modified copies of the rbx arm packages) and the mechanical designs.

And the result of all that? Second place at the San Joaquin County event;-) First place went to a `two-arm, remote-controlled setup’. But overall, we learned a lot;-)

As always, look to explore something new;-)

A little update: The thumbnail images actually hide some larger versions with extra notes. The notes are very hard to see on the small versions, and I got some comments about that.

Over the last couple of days, we saw a big increase in spam bugs and spam comments in a Bugzilla installation. After the initial rush to stop that influx and clean up the mess, I took some time to look around and try to find some help fighting that issue at the source, basically filtering/rejecting those bugs/comments right at submission. Unfortunately, I did not really find much 🙁

The next step was to look at the sources (thanks to Open Source Software 😉) and see how hard it would be to add a classification setup on my own. Bugzilla has a nice setup for hooks, but unfortunately there were none that fit my purpose, so I decided to add my code right into the normal sources. It turned out that the majority of the additions were limited to two spots in one file, right after the normal validations for bug creation and comment addition. I made some more changes related to the configuration of the new service and to the error handling/text. But overall, it was quite easy and quick 😉 Thanks to another piece of Open Source Software: the CPAN Net::Mollom module.

The bz_4.4.11.patch was done against Bugzilla 4.4.11, and the resulting installation was tested locally with some simple messages and a free Mollom account. It seems to work as expected, but I can’t yet claim a real-world deployment. That would probably cost some money, depending on how large the installation is.

An additional Net-Mollom.patch to the Net::Mollom sources allows it to use the proxy configuration from Bugzilla. I thought this would make things easier if the installation needs a proxy to reach the Mollom servers.

This whole exercise was just an attempt to see how much effort would be needed to implement a service-based spam classification for Bugzilla. This blog uses Akismet to classify comments, and I believe my code could easily be adapted for that or any other service. The most important part was finding the right spots where all the information is easily available before anything is committed to the DB. I think I found pretty good spots 😉

As always, have fun expanding your horizon 😉

Our swim season is almost over, so here is another post about my CTS console experiments.

I tried to do two things for the scoreboard connection:

  1. Create a wireless link to the scoreboard.
  2. Understand the protocol enough to build my own example scoreboard.

So far, I have only been partially successful.

I know the link runs RS-232 at 9600 baud, 8-E[ven]-1. So that’s the first hurdle. My firmware for the 3DR telemetry radios should be able to handle that, and it seems to work for a while. But eventually the radios lose the wireless link and then wait a couple of seconds before re-establishing it:-( So the wireless link is somewhat jumpy so far:-( Of course, I also needed a MAX3232 breakout board between the console and the radio, very similar to the link to the computer. But I had to add an 820 Ohm resistor into the RS-232 receive side; otherwise the MAX3232 would get very hot:-(

I also logged some of the transmissions from the console, and I believe I have a pretty good understanding of the protocol used. I was able to write a little Arduino Mega (or ADK) sketch to handle it, drive a little display, and send the decoded strings to an attached computer for more interesting processing;-) I used a Mega because it has multiple hardware serial ports;-) I decided to use “Serial” for the connection to the computer and “Serial1” for the connection from the Colorado console. My sketch is here: ScoreboardEmulator.tar I used three of the Sparkfun four-digit 7-segment displays, but I only drive eight digits and one colon right now. That would also be hard on an Arduino Uno or similar.
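
For anyone who just wants to start logging the console output, here is a stripped-down sketch that only sets up Serial1 with the console’s 9600 baud 8-E-1 framing and dumps the raw bytes as hex to the computer. This is not the ScoreboardEmulator itself; the channel and digit decoding is left out.

```
// Minimal Arduino Mega logger for the Colorado console link (illustration only).
void setup() {
  Serial.begin(115200);             // USB link to the attached computer
  Serial1.begin(9600, SERIAL_8E1);  // console link: 9600 baud, 8 data bits, even parity
}

void loop() {
  while (Serial1.available() > 0) {
    int b = Serial1.read();
    if (b < 0x10) Serial.print('0'); // pad single hex digits
    Serial.print(b, HEX);
    Serial.print(' ');
  }
}
```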

Right now, the little display shows the Colorado scoreboard `0F’ channel, but all channels are sent to the computer.

Using a different display setup, that Mega would be able to drive a complete ten-plus-lane display;-) Sparkfun has 6.5″ digits with a latching driver board for about $20 each. You could probably find them even cheaper directly from China, especially if you want to build a 10-lane display, which would need about 100 digits. I tested mostly with the older Colorado System 5, but the System 6 works as well;-)

So, overall more to think about;-)

Have fun,

— Marco