For my master's research at CMU I designed a visual programming environment for CS education called JubJub. I interviewed researchers in visual programming languages and education to develop a set of requirements and goals, which I then applied in designing the new system. JubJub is designed to be a transitional language for students in 6th through 12th grade. It supports integration with a wide range of educational material by providing an API for wrapping external libraries as code blocks, which allows robotics, graphics, sound, and other materials to be used. JubJub is also designed to provide increasing levels of interaction with a textual language and to support many features of common commercial IDEs. JubJub is a New BSD open source project and is free to use and distribute.
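The general idea of wrapping a library call as a draggable code block can be sketched roughly as follows. This is Python for illustration only, and every name here (`Block`, `BeepBlock`, `register`) is hypothetical, not JubJub's actual API:

```python
class Block:
    """Hypothetical base class: one library operation exposed as a code block."""
    name = "block"

    def run(self, *args):
        raise NotImplementedError


class BeepBlock(Block):
    """Illustrative wrapper exposing a sound-library call as a block."""
    name = "beep"

    def __init__(self, beep_fn):
        # beep_fn is an injected library function (assumption for the sketch)
        self.beep_fn = beep_fn

    def run(self, frequency, duration):
        return self.beep_fn(frequency, duration)


# A registry lets the environment list every block an external library provides.
BLOCK_REGISTRY = {}

def register(block):
    BLOCK_REGISTRY[block.name] = block
```

An external library would then just register one wrapper per operation it wants to expose as a block.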
The Finch is a robot designed for use in CS education by the CREATE Lab at CMU. Its two goals are to be low cost and highly interactive. To that end, it is USB tethered, removing the need for batteries, and it contains an array of sensors and outputs. The Finch is programmed in Java through a wrapper interface that provides complete control of the robot in an environment already widely used in introductory CS classes. So far, 100 robots have been built, around half of which are being piloted in a freshman-level introductory CS class. Around 40 Finches have also been distributed to high schools with CS classes.
This project was for a Bio-Inspired Robotics class I took in Fall 2008. The goal was to use visual feedback to create an adaptive camouflage system, inspired by the amazing abilities of cephalopods, especially octopuses. One of my favorite videos from the background research for this project can be seen here. The overall goal was to build a system using a display and a camera aimed so that it could see both the display and the background. The system would generate an image imitating either the color or the pattern of the background, display it on the screen, and then take iterative pictures to adjust the display until the two matched. The process's effectiveness varied with color and pattern, but it showed the technique was feasible.
This was a paired project, and I primarily worked on the software, which was written entirely in MATLAB.
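The adjust-and-compare loop described above can be sketched in Python (the original was MATLAB). Here `capture` and `set_display` stand in for the camera and display I/O, and the gain and step count are illustrative assumptions, not the values we actually used:

```python
import numpy as np

def match_background(capture, set_display, steps=20, gain=0.5):
    """Iteratively nudge the displayed color toward the background color.

    capture() -> (display_rgb, background_rgb): mean colors the camera sees
    in the display region and the background region, as floats in [0, 1].
    set_display(rgb): show a solid color on the screen.
    """
    shown = np.array([0.5, 0.5, 0.5])  # start from neutral gray
    for _ in range(steps):
        set_display(shown)
        seen_display, seen_background = capture()
        # Move the displayed color by a fraction of the observed error, so the
        # loop also compensates for camera/display color distortion over time.
        error = np.asarray(seen_background) - np.asarray(seen_display)
        shown = np.clip(shown + gain * error, 0.0, 1.0)
    return shown
```

Matching a pattern rather than a single color works the same way, but per image region instead of per frame.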
The Big Flower Girl (BFG) was created in response to news of our Senior Design leader's impending marriage. This wasn't actually a surprise, but it gave us an excuse to do something fun. Three of us decided that the best (and most fun) gift we could give her was a robotic flower girl for her wedding. We set to work on weekends and off hours when we weren't busy with classes, and soon had a SolidWorks model and concept drawings put together. After a quick trip to Halted, a set of fans was blowing test paper around the room. From there we progressed steadily, hacking an RC car for its remote, designing and integrating a motor controller board, and borrowing a few Bodine motors from the box in the lab. We rushed to finish a week early so that the groom's mother could add the finishing touch, a flower girl dress. Tested, filled with flower petals, and loaded with two freshly charged 10-pound batteries, the robot was set up and ready for the ceremony, each of us hoping nothing would short, burn out, snap, clog, or burst into flames. We had been carefully instructed on our timing and knew who was coming before us and when the robot should be moved into the aisle, so we waited and watched the procession. When the time came, the robot was quickly pushed out into the aisle and switched on. I directed it down the aisle using the RC remote salvaged during construction, and behind the robot drifted a light scattering of rose petals. The robot performed perfectly and rolled off to the side at the end of the aisle, where it was quickly turned off and placed out of the way. We all relaxed a little and watched the rest of the ceremony, happy for both the robot and our friend.
I worked on this during my internship at the NASA Robotics Academy. The previous summer, a system had been developed that used a camera and tracking dots to determine fault probabilities for a single robot. Our task was to assume a given fault in a single agent and have the other agents of the team coordinate the repair of the damaged agent. This was a team project with three undergraduates and one graduate student. Over the summer we redesigned the tracking process to track multiple agents in real time, built four robotic agents, developed an autonomous system for group decision making and path planning, and integrated these systems to successfully demonstrate removal of a damaged module.
I took the Senior Design course at UCSC in my third year, mostly because I'd been taking classes with the people a year ahead of me and it would be my last chance to work with many of them. Six of us formed a team with a lot of ambition and a fair amount of crazy and decided to build a platform for swarm research. Over the course of 20 weeks we designed and built four robots from the ground up. The robots used an iRobot Create as the base, which gave us wheels and bump sensors. On top of the Create we added several layers of acrylic and circuit boards. These layers gave each robot RF communication, ultrasonic ping sensors, and a system for localization between the robots based on time-of-flight measurements between IR and ultrasonic signals. The robots took turns moving through a space and mapping it out with their ping sensors. The data was shared between the robots and transmitted to a PC, which built a map from the collected data. Each robot acted fully autonomously without any central control.
Full details can be found on the project website for The SOMA Project.
Another source is our Instructables page, full of less technical details.
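The time-of-flight idea behind the localization can be illustrated with a small sketch (Python here for illustration; the real implementation ran on the robots' own hardware, and the function name is hypothetical). Because the IR flash arrives effectively instantaneously at these ranges, the lag until the matching ultrasonic ping arrives is essentially the ultrasonic time of flight, which converts directly to distance:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature

def range_from_tof(t_ir_arrival, t_ultrasonic_arrival):
    """Estimate distance to another robot from the arrival-time difference
    between its IR flash and its simultaneously emitted ultrasonic ping.

    Both times are in seconds on the receiver's clock; no clock
    synchronization between robots is needed, only the local difference.
    """
    time_of_flight = t_ultrasonic_arrival - t_ir_arrival
    if time_of_flight <= 0:
        raise ValueError("ultrasonic ping must arrive after the IR flash")
    return SPEED_OF_SOUND * time_of_flight
```

For example, a robot one meter away produces a lag of about 2.9 milliseconds. With ranges like this measured from several listening robots, relative positions can be solved by trilateration.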
In my second year at UCSC I took a class which put me definitively onto the robotics path. The class was CMPE118, or Mechatronics. We spent six weeks discussing sensors, shear forces, state machines, motor specs, and everything else that goes into building a simple robot. With less than four weeks left in the quarter we were given the final project, a robotic duel officially titled The Good, The Bad, and The Slugly. The goal was for two robots to start back to back on a playing field, drive to opposite ends, turn, and attempt to shoot a pop can off the other robot's head. In this class I met four of my future SOMA teammates. My partner Cathy and I are on the left with our robot, the Megadoomer.
A fairly inaccurate news clip on the event can be found here.