Here’s a video I’ve wanted to post for a long time. It was a project for a college class in 1994, taught by one of MIT’s most understanding professors, Donald Troxel. Carlton Mills digitized the old VHS copy of our video report.
It was built by a team of three students: John Wallberg, Adam Holt, and me—we were the leftovers, actually, who hadn’t already found lab partners. Our two project ideas were this and robotic air hockey, and we were pretty sure a sheepdog would be both easier and safer. Even with the easier choice, we finished about 10 days late, working right up until we all had to leave for winter break. The project still won a prize, though (follow the link and search for “Newton”)!
The sheepdog itself is a LEGO robot covered by a cardboard box, and the sheep is a jittery baby toy. The sheepdog controls are three hand-built computers made of 5-volt TTL logic, 8-bit 10 MHz microcontrollers, and miscellaneous op-amps and potentiometers, with at best 256 bytes of RAM and a few kilobytes of EEPROM. It turns out you can do some basic vision, navigation, and motor control with that!
Some details:
- The sheep had to be juiced down by power engineer Adam — I think he used 3 good batteries and a dud instead of 4 good batteries, because it was just too vigorous otherwise.
- The navigation system required the most complex computer of the three; Adam had 16 bits to work with on his main bus, which he had to divide into address and data lines; I think he settled on 10 bits for addressing and 6 bits of numerical precision in his trigonometric lookup tables (there’s a rough sketch of that kind of table after this list). That way he could have 1024 instructions, of which he probably used 1023. While the vision system was being debugged, Adam and John had to find creative ways to test the navigation and driving blind. Because of their preparation, the whole thing basically worked the first night we plugged in the vision sensor. I think it was only possible because of Adam’s love of trigonometry and navigation. In his spare time he was downloading publicly available street grid data and displaying it interactively in ASCII on his home computer — a precursor to modern web-accessible maps.
- The vision system did simple light-dark thresholding and looked for blobs (a toy version of that thresholding-and-blob-center step is sketched after this list). We didn’t have a wide-angle lens, so we jammed our camera up as close to the ceiling as we could, to get about an 8-foot-square field of view. My favorite feature was the debugging display, visible at 1:42, which was an RGB screen hooked straight to various internal signal lines of the vision system. It showed a raw greyscale camera channel in green, thresholded blobs in red, and the center of one of the blobs in blue (but aliased into a grid and with computation noise mixed in because I ran out of wiring space). You could see it compute the blobs at the top of each frame and then settle out into a clean debugging pattern that followed a blob around. It’s briefly visible in the background.
- The mechanical robot was amazingly reliable. John built it out of proven 6.270 parts, including what must have been at least 3 pounds of cross-linked LEGO Technic pieces, two small DC motors, rechargeable lead-acid motorcycle batteries (housed in the chassis itself) and a pulse-width modulation circuit. After we tweaked constants to match slight differences between the two drivetrains (there’s a small sketch of that trim idea after this list too), the robot could go straight or curve along fairly accurate arcs. But even if it wasn’t exact, the feedback loop from sensor to navigation planning was tight enough (30 hertz?) that corrections were pretty immediate. At about 4:00, for example, it navigates tightly around the sheep and butts it from the other side. (Another, less fortunate team the same semester used a different brand of toys for their robot, along with stepper motors, which are theoretically simpler to control digitally but proved to be much less mechanically robust. Their robot hardware never did function properly, but the teachers were understanding and graded them well based on their circuits and software.)
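For the curious, here’s a minimal sketch in C of the kind of small-integer trig lookup table the navigation item mentions. The 256-entry angle resolution, the signed 6-bit value scale, and the names `sin6`/`cos6` are assumptions for illustration only, not the actual layout of Adam’s tables.

```c
/* Sketch of a small-integer trig lookup table, in the spirit of the
 * navigation computer described above.  The 256-step angle resolution,
 * the signed 6-bit value range [-31, 31], and the helper names are
 * assumptions for illustration only. */
#include <math.h>
#include <stdint.h>
#include <stdio.h>

#define PI          3.14159265358979323846
#define ANGLE_STEPS 256   /* full circle quantized to 8 bits (assumed)          */
#define SIN_SCALE   31    /* values fit in a signed 6-bit field (assumed)       */

static int8_t sin_table[ANGLE_STEPS];

/* Fill the table once; on the real hardware something like this would
 * live in EEPROM rather than be computed at startup. */
static void build_sin_table(void) {
    for (int i = 0; i < ANGLE_STEPS; i++) {
        double radians = 2.0 * PI * i / ANGLE_STEPS;
        sin_table[i] = (int8_t)lround(sin(radians) * SIN_SCALE);
    }
}

/* 6-bit sine and cosine by table lookup -- no floating point at runtime. */
static int8_t sin6(uint8_t angle) { return sin_table[angle]; }
static int8_t cos6(uint8_t angle) { return sin_table[(uint8_t)(angle + ANGLE_STEPS / 4)]; }

int main(void) {
    build_sin_table();

    /* Heading vector for a robot facing "45 degrees" (32 out of 256 steps). */
    uint8_t heading = 32;
    printf("heading %u -> dx=%d dy=%d (scale +/-%d)\n",
           heading, cos6(heading), sin6(heading), SIN_SCALE);
    return 0;
}
```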
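Similarly, here’s a toy sketch of the light-dark thresholding and blob-center idea from the vision item, written over a small in-memory frame. The frame size, the threshold value, and the single-centroid simplification are assumptions for illustration; the real system watched a live camera and tracked more than one blob.

```c
/* Toy sketch of light/dark thresholding plus a blob-center computation.
 * The 16x12 frame, the threshold of 128, and the single-centroid
 * simplification are illustrative assumptions, not the real design. */
#include <stdint.h>
#include <stdio.h>

#define W 16
#define H 12
#define THRESHOLD 128   /* light/dark cutoff (assumed) */

/* Accumulate the centroid of all pixels above the threshold -- roughly
 * what the debugging display showed as the red blobs and blue center. */
static int find_blob_center(uint8_t frame[H][W], int *cx, int *cy) {
    long sum_x = 0, sum_y = 0, count = 0;
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
            if (frame[y][x] >= THRESHOLD) {
                sum_x += x;
                sum_y += y;
                count++;
            }
    if (count == 0)
        return 0;                 /* no blob in this frame */
    *cx = (int)(sum_x / count);
    *cy = (int)(sum_y / count);
    return 1;
}

int main(void) {
    uint8_t frame[H][W] = {0};    /* dark background */

    /* Paint a bright 3x3 "sheep" near the middle of the frame. */
    for (int y = 5; y < 8; y++)
        for (int x = 6; x < 9; x++)
            frame[y][x] = 200;

    int cx, cy;
    if (find_blob_center(frame, &cx, &cy))
        printf("blob center at (%d, %d)\n", cx, cy);
    else
        printf("no blob\n");
    return 0;
}
```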
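And one more sketch, of the per-drivetrain trim constants: each side’s commanded speed is scaled by its own correction factor before becoming a PWM duty cycle, which is roughly why tweaking a couple of constants was enough to make the robot track straight. The 0–255 duty range and the particular trim numbers are made up for the example.

```c
/* Sketch of the per-drivetrain trim idea: equal speed commands don't
 * make mismatched motors move equally, so each side gets its own
 * correction factor.  The 0..255 PWM range and the trim values are
 * made-up illustrations, not measurements from the actual robot. */
#include <stdint.h>
#include <stdio.h>

#define LEFT_TRIM  256   /* per-side trim in 1/256ths: 256 = no correction    */
#define RIGHT_TRIM 244   /* e.g. the right motor ran a little fast (assumed)  */

static uint8_t clamp_pwm(int v) {
    if (v < 0)   return 0;
    if (v > 255) return 255;
    return (uint8_t)v;
}

/* Turn a desired forward speed and turn rate into left/right PWM duty
 * cycles, applying each side's trim. */
static void drive(int speed, int turn, uint8_t *left_pwm, uint8_t *right_pwm) {
    *left_pwm  = clamp_pwm((speed + turn) * LEFT_TRIM  / 256);
    *right_pwm = clamp_pwm((speed - turn) * RIGHT_TRIM / 256);
}

int main(void) {
    uint8_t l, r;

    drive(200, 0, &l, &r);    /* straight line */
    printf("straight: L=%u R=%u\n", l, r);

    drive(200, 40, &l, &r);   /* gentle arc */
    printf("arc:      L=%u R=%u\n", l, r);
    return 0;
}
```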
What a creative project! What were the objectives of the assignment?
Thanks Aubrey! The assignment was open-ended, although we were supposed to incorporate the elements we had practiced during the course. We built up to it by spending the first two-thirds of the semester on four other projects that were simpler and more fully specified.
For example, one of them was a traffic light controller with a “traffic sensor” (really, just a toggle switch) that sent it into a 30-second cycle to allow traffic in from a rarely-used side road.
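As a rough illustration, that controller boils down to a little state machine like the sketch below. The real one was wired from chips rather than programmed, and details such as the yellow-light time, the one-second tick, and the guess that the 30 seconds was the side road’s green time are assumptions for the example.

```c
/* Sketch of the traffic-light controller as a small state machine.
 * One loop iteration = one second.  The yellow time, the simulated
 * moment the switch trips, and the reading of the "30-second cycle"
 * as the side road's green time are all assumptions. */
#include <stdio.h>

enum state { MAIN_GREEN, MAIN_YELLOW, SIDE_GREEN, SIDE_YELLOW };

int main(void) {
    const char *names[] = { "MAIN_GREEN", "MAIN_YELLOW", "SIDE_GREEN", "SIDE_YELLOW" };
    enum state s = MAIN_GREEN;
    int timer = 0;               /* seconds left in the current state              */
    int sensor_trips_at = 5;     /* simulate the toggle switch flipping at t = 5 s */

    for (int t = 0; t < 60; t++) {
        int sensor = (t == sensor_trips_at);

        switch (s) {
        case MAIN_GREEN:                       /* stay green until the sensor trips */
            if (sensor) { s = MAIN_YELLOW; timer = 4; }
            break;
        case MAIN_YELLOW:
            if (--timer == 0) { s = SIDE_GREEN; timer = 30; }  /* the 30-second cycle */
            break;
        case SIDE_GREEN:
            if (--timer == 0) { s = SIDE_YELLOW; timer = 4; }
            break;
        case SIDE_YELLOW:
            if (--timer == 0) s = MAIN_GREEN;
            break;
        }
        printf("t=%2d  %s\n", t, names[s]);
    }
    return 0;
}
```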
I don’t remember the details of the others, but some of the things the earlier projects taught were:
Those sound really primitive, and they were! What the course taught was how to implement them physically, at the level of wires and chips. It had a reputation for being pretty tough — its nickname was “digital death lab” among the undergrads ^_^