Hello everyone! I have posted video and pictures of my articulated arm robot with grasping end effector online. I apologize for the poor video quality; many of you will recognize it as Vision Command video, which is my only means of creating digital video. The robot took me about a month to complete, and I am very happy with it. All my Lego life I have been working toward a good version of a robot of this type, and I have finally completed one. I hadn't built anything for about six or seven years, but I thought about it often and wished I could get back into it, so I took the time, and I am grateful that I did. I hope you enjoy it.

Details, details, details………

Programmable bricks: 2 RCX
  RCX #1 runs the linkages, the turntable, and their respective rotation sensors
  RCX #2 runs the end effector and the three home touch sensors (stacked on input 2)
Version: RCX 1.5
Motors: 4 (shoulder, elbow, turntable and end effector)
Sensors:
  Rotation: 3 (shoulder, elbow and turntable)
  Touch: 4 (shoulder home, elbow home, turntable home and end effector open)
Language: NQC
Interface: RCXCC
Vertical reach: desk height to about 18 inches
Horizontal reach: about 3 inches from the base out to about 16 inches, depending on angles and positions
Rotational reach: 360+ degrees; the only limit is how much the wires twist inside the base, which allows multiple rotations in either direction
Homebrew sensors: zero, pure Lego
Non-Lego parts: one; inside the turntable I used a cable wrap to avoid the spaghetti effect
Initiating device: Lego Mindstorms remote control

Mechanical Details

I have built the elbow/shoulder mechanism shown in the pictures many, many times, trying to find the best arrangement and weighing details such as speed, backlash, slip, and how the mechanism interfaces with the rotation sensor. The solution I settled on drives a worm gear directly with the motor and extends the shaft through the rotation sensor. The worm gear drives a 40-tooth spur gear, with the linkage attached directly to the spur gear.
A worm gear is great here because of its high torque capability, and when the gear stops it holds heavy loads without putting any load on the motor brake. During programming, however, I learned that driving the worm directly had two drawbacks: the motor would overshoot the rotation sensor's stopping point by one or two ticks, or miss it altogether, and the robot tended to bounce a lot. The solution was to put an 8-tooth gear on the motor and have it drive a 24-tooth gear on the worm gear shaft. This 3:1 reduction worked very well: the robot stopped shaking on start and stop, and the rotation hit the mark every time.

I have also built the end effector many, many times, weighing many of the same details listed above. I chose this solution for several reasons: I could run the shaft parallel with the arm; since it uses a worm gear, I could run the motor at full speed without immediately crashing the fingers together; and I could fine-tune the two fingers to touch precisely at the midpoint when the end effector is closed. A major hurdle was sensing the open and closed conditions, which turned out to be pretty easy. As you can see in the photos, I simply used a touch sensor for the open condition, and there is no sensing at all on the closed condition. Instead, I put two slip-clutch gears in the drive train and turn the motor on for about 1.5 seconds to close the end effector. That takes the fingers all the way to the closed position when empty, and when there is something to pick up, the combination of the slip-clutch gears and the worm gear ensures the object is grasped tightly.

The turntable is similar to the arm linkages. Again I had the problem of overshooting the target angle, so I geared down the rotation sensor: one full 360-degree rotation equals 300 ticks on the sensor. In placing the home touch sensors, I was able to find spots at the ends of travel. Motors 1 and 2 can only go positive, which is very handy when programming.
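As a rough sketch of how that timed close and sensed open could look in NQC (the motor and sensor assignments, constants, and sub names here are my assumptions, not the actual program):

```nqc
// Sketch of the end-effector control described above.
// Assumptions: grip motor on OUT_C, open-limit touch sensor on
// input 2, and closing runs the motor forward. Adjust to taste.
#define GRIP     OUT_C
#define OPEN_SW  SENSOR_2

sub Close()
{
    // Run for ~1.5 s; the slip-clutch gears let the worm keep
    // turning safely once the fingers meet an object.
    OnFwd(GRIP);
    Wait(150);          // Wait() counts in 1/100 s
    Off(GRIP);
}

sub Open()
{
    // Reverse until the open-limit touch sensor is pressed.
    OnRev(GRIP);
    until (OPEN_SW == 1);
    Off(GRIP);
}

task main()
{
    SetSensor(OPEN_SW, SENSOR_TOUCH);
    Close();
    Wait(100);
    Open();
}
```

Because the worm gear is self-locking, the grip holds after Off() with no power draw, which is what makes the timed, unsensed close workable.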
Motor 3 runs the turntable, and it can go 360+ degrees in either direction, negative or positive.

Programming Details

My previous experience with NQC was pretty limited, so I had to refresh a lot of my skills and build on them. I have programmed in several environments, so learning the syntax was pretty easy.

During my Lego hiatus, I developed an MS Excel spreadsheet that identifies the angles of the shoulder and elbow linkages given the position of the end effector. It can even list all the angles the linkage passes through when moving from point A to point B. My initial goal was to program every angle between two positions so that the movement traced a straight line. When I got to the programming and running environment, though, I found that I had severely overestimated the capabilities of the RCX brick. I hope the NXT brick lets me revisit that concept.

Relinquishing that hope, my next goal was to run two motors simultaneously, stopping one when it reached its destination and then the other at its own. I had some luck, but it was very inconsistent, often missing the target. To fix that, I wrote more detailed code, which worked pretty well, but it ate up memory so quickly that I was limited to fewer than seven movements. My final solution was to turn on both motors, stop both when the first one reached its destination, and then move the remaining motor the remaining distance. This is not ideal, but it works consistently and produces fairly realistic movements. The current program has 37 movements, which fills up the RCX, but some are unnecessary and can be removed for different tasks.

This was my first experience sending messages back and forth between two bricks, and I found that, too, to be hit and miss: sometimes the message got through and sometimes it didn't. To make sure the message was received, I put the SendMessage command inside a REPEAT loop with a WAIT command. I like to say that I filled the air with messages.
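That "stop both, then finish the stragglers" strategy can be sketched in NQC roughly like this. The motor and sensor assignments and the Move() signature are my guesses (the real Move() also takes a third position argument, presumably the turntable target, and likely tracks absolute positions from home; for simplicity this version treats targets as tick counts from the current position):

```nqc
// Hedged sketch: drive both arm joints at once, stop everything
// when the first joint arrives, then finish the other joint alone.
#define SHOULDER      OUT_A
#define ELBOW         OUT_B
#define SHOULDER_ROT  SENSOR_1
#define ELBOW_ROT     SENSOR_3

void Move(int sTarget, int eTarget)
{
    ClearSensor(SHOULDER_ROT);
    ClearSensor(ELBOW_ROT);

    // Motors 1 and 2 only ever go positive, so forward is enough.
    OnFwd(SHOULDER);
    OnFwd(ELBOW);

    // Stop both as soon as either joint reaches its target.
    until ((SHOULDER_ROT >= sTarget) || (ELBOW_ROT >= eTarget));
    Off(SHOULDER);
    Off(ELBOW);

    // Finish whichever joint still has distance to cover.
    if (SHOULDER_ROT < sTarget)
    {
        OnFwd(SHOULDER);
        until (SHOULDER_ROT >= sTarget);
        Off(SHOULDER);
    }
    if (ELBOW_ROT < eTarget)
    {
        OnFwd(ELBOW);
        until (ELBOW_ROT >= eTarget);
        Off(ELBOW);
    }
}

task main()
{
    SetSensor(SHOULDER_ROT, SENSOR_ROTATION);
    SetSensor(ELBOW_ROT, SENSOR_ROTATION);
    Move(120, 120);
}
```

The appeal of this approach is that it needs only one until() per joint rather than per-tick coordination, which is why it fits in the RCX's limited memory.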
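The "fill the air with messages" pattern might look something like this (the message number and repeat count are illustrative, not the original values):

```nqc
// Sketch of the repeat-and-wait messaging pattern between bricks.
#define MSG_CLOSE 1

// On the sending brick (RCX #1): blast the same message several
// times so at least one copy survives IR glitches.
task main()
{
    repeat (10)
    {
        SendMessage(MSG_CLOSE);
        Wait(10);            // small gap between transmissions
    }
}

// On the receiving brick (RCX #2), the matching wait would be:
//     ClearMessage();
//     until (Message() == MSG_CLOSE);
//     // ...then act on it, e.g. close the end effector
```

Since the receiver only cares about the latest message value, the duplicate copies are harmless; they just raise the odds that one is heard.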
One interesting thing I noticed is that when the Vision Command camera records the movements, the IR signal shows up quite clearly, even though it is invisible to the naked eye.

I also used functions and subs frequently. This worked very well; my task main block contains only commands to run the robot, which makes programming movements very clean and easy. Some typical lines of code look like:

  Home();
  Close();
  Move(120, 120, PU_Pos);
  Move(120, 70, PU_Pos);
  Move(163, 134, Pos_3);
  Open();

Overall, programming the robot was challenging, but thanks to some smart decisions along the way, the final code is intuitive and easy to understand. Some parts were formidable and took 10-15 attempts, but in the end the robot runs consistently and accurately most of the time.

You may notice at the end of the video that the end effector "claps." This is my version of showboating; plus, I had a little memory left and it only took a couple lines of code. Time for me to go back outside to get some exercise; you should too!