Bender (2012)

The BananaSlug Engineered Nuptial Drink Emitting Robot was designed and built when a second member of my senior design group got married in 2012.

Bender is a drink-mixing robot that can hold up to 8 liquids at a time and requires a video to be recorded for the bride and groom before it will let you have your drink. An Android tablet drives an ADK board, which controls a drink arm and 8 solenoids to select and dispense your chosen mix. The interface allows the set of drinks to be configured, pricing to be added to drinks, and the video requirement to be disabled. The videos were all stored locally on the tablet and synced to a Dropbox account so they could be accessed while the robot was operating. The body of the robot is made mostly of acrylic with some MDF components. Besides the tablet and ADK, a custom board was produced to drive the motor, solenoids, and LED lights around the robot, which were programmed to blink in various patterns at different parts of the drink-making process.
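To give a feel for the dispensing logic, here is a minimal sketch of how a recipe could be turned into solenoid open times. It assumes a fixed flow rate per solenoid; the class, method names, and flow-rate constant are illustrative, not the actual code running on the tablet.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch: converting a drink recipe into solenoid open times.
// Assumes every solenoid dispenses at the same fixed flow rate; the real
// ADK-side control code is not shown here.
public class DrinkMixer {
    static final double FLOW_RATE_ML_PER_SEC = 30.0; // assumed constant flow

    // recipeMl: solenoid index (0-7) -> milliliters requested.
    // Returns solenoid index -> open duration in milliseconds.
    public static Map<Integer, Long> planDispense(Map<Integer, Double> recipeMl) {
        Map<Integer, Long> plan = new LinkedHashMap<>();
        for (Map.Entry<Integer, Double> e : recipeMl.entrySet()) {
            int solenoid = e.getKey();
            if (solenoid < 0 || solenoid > 7) {
                throw new IllegalArgumentException("Bender holds only 8 liquids");
            }
            long ms = Math.round(e.getValue() / FLOW_RATE_ML_PER_SEC * 1000.0);
            plan.put(solenoid, ms);
        }
        return plan;
    }
}
```

The tablet would walk the resulting plan, moving the arm under each reservoir and holding the solenoid open for the computed duration.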

More information can be found at the full page here
(Photo: Bender robot sitting idle)
JubJub (2009)

For my master's research at CMU I designed a visual programming environment for CS education called JubJub. I conducted interviews with researchers on visual programming languages and education to develop a set of requirements and goals, which I then applied to develop the new system. JubJub is designed to be a transitional language for students in 6th through 12th grade. It supports integration with a wide range of educational material by providing an API for implementing external libraries as code blocks. This allows robotics, graphics, sound, and other materials to be used. JubJub is also designed to provide an increasing level of interaction with a textual language and to support many features of common commercial IDEs. JubJub is a New BSD open source project and is free to use and distribute.

The source code for the architecture and a prototype interface can be found here. Instructions for downloading and using a runnable jar of the current build can be found on its wiki page, here.

My final presentation can be found here.
A PDF copy of my thesis can be found here.
The Finch (2008-2009)

The Finch is a robot designed for use in CS education by the CREATE Lab at CMU. Its two goals are to be low cost and highly interactive. To that end, it is USB tethered, removing the need for batteries, and contains an array of sensors and outputs. The Finch is programmed in Java through a wrapper interface that provides complete control of the robot in an environment already widely used in intro to CS classes. So far, 100 robots have been built, around half of which are being piloted in a freshman-level intro to CS class. Around 40 Finches have also been distributed to high schools with CS classes.

During the school year my work with the Finch focused on writing example programs for the classroom in Java, providing updates to the firmware in C, and meeting weekly to discuss further development of the system. Several of my examples are included in the educational release on the Finch's download page. The Finch is also a large part of my Master's work, the goal of which is to develop an iconic programming language that visually resembles and compiles to Java in order to teach CS concepts that will more easily carry over to procedural languages in use today. The initial work is aimed at developing the language enough to be able to program the Finch.
Cephalopod Inspired Camouflage (2008)

This project was for a Bio-Inspired Robotics class I took in Fall 2008. The goal was to use visual feedback to create an adaptive camouflage system, inspired by the amazing abilities of cephalopods and especially octopuses. One of my favorite videos from the background research for this project can be seen here.

The overall goal was to create a system using a display and a camera aimed so it could see both the display and the background. The system would then generate an image to imitate either the color or pattern of the background and display it on the screen. It would then use iterative pictures to adjust the display until the two matched. This process had varying effectiveness depending on color and pattern but showed the technique was feasible.
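The adjustment loop above can be sketched in a few lines. Our actual code was MATLAB; this is an illustrative Java version where the gain constant, iteration cap, and names are all assumptions. Each pass nudges the displayed color a fraction of the way toward the background color sampled by the camera.

```java
// Hypothetical sketch of the iterative matching loop: compare the displayed
// color against the sampled background and correct a fraction of the error
// each iteration until the two match.
public class CamoLoop {
    // Nudge each displayed RGB channel toward the background sample.
    // gain in (0, 1] controls how aggressively each iteration corrects.
    public static int[] adjust(int[] displayed, int[] background, double gain) {
        int[] next = new int[3];
        for (int c = 0; c < 3; c++) {
            double step = gain * (background[c] - displayed[c]);
            next[c] = Math.min(255, Math.max(0, (int) Math.round(displayed[c] + step)));
        }
        return next;
    }

    // Run a fixed number of camera/display iterations.
    public static int[] converge(int[] displayed, int[] background) {
        for (int i = 0; i < 50; i++) {
            displayed = adjust(displayed, background, 0.5);
        }
        return displayed;
    }
}
```

In the real system each iteration involved taking a new picture, so lighting, camera response, and display gamma all fed into the error being corrected, which is why patterns matched less reliably than flat colors.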

This was a paired project. The software was written in MATLAB.
Big Flower Girl (2008)

The Big Flower Girl (BFG) was created in response to news of our Senior Design leader's impending marriage--this wasn't actually a surprise, but it gave us an excuse to do something fun. Three of us decided that the best (and most fun) gift we could give her was to build her a robotic flower girl for her wedding. We set to work on weekends and off hours when we weren't busy with classes and soon had a SolidWorks model and concept drawings put together. After a quick trip to Halted, a set of fans was soon blowing test paper around the room. From there we progressed steadily, hacking an RC car for its remote, designing and integrating a motor controller board, and borrowing a few Bodine motors from the box in the lab.

We rushed to get it finished a week early so that the groom's mother could add the finishing touch, a flower girl dress. Tested, filled with flower petals, and loaded with two freshly charged 10-pound batteries, it was set up and ready for the ceremony, each of us hoping nothing would short, burn out, snap, clog, or burst into flames. We had been carefully instructed on our timing and knew who was coming before us and when the robot should be moved into the aisle, so we waited and watched the procession. The time came and the robot was quickly pushed out into the aisle and switched on. I directed it down the aisle using the RC remote salvaged during construction, and behind the robot drifted a light scattering of rose petals. The robot performed perfectly and rolled off to the side at the end of the aisle, where it was quickly turned off and placed out of the way. We all relaxed a little bit and watched the rest of the ceremony, happy for both the robot and our friend.

The full PCB design for the power board is included in the attachments here.
Collaborative Team-Diagnosis and Repair in Multi-Robot Systems (2007)

I worked on this during my internship at the NASA Robotics Academy. The previous summer a system had been developed for using a camera and tracking dots to determine the fault probabilities for a single robot. Our task was to assume a given fault in a single agent and have the other agents of the team coordinate the repair of the damaged agent. This was a team project with three undergraduates and one graduate student. Over the summer we redesigned the tracking process to allow tracking of multiple agents in real time, built four robotic agents, developed an autonomous system for decision making and path planning in the group, and integrated the systems to successfully demonstrate removal of a damaged module.

I was in charge of the embedded code which had to autonomously determine who would perform a repair and plan a path to remove a damaged module. The final presentation is included in the attachments here.
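A minimal sketch of the "who repairs" decision is choosing the healthy agent closest to the damaged one. The names and the straight-line distance model here are assumptions for illustration; the real embedded code also handled path planning around the other agents.

```java
// Illustrative sketch: pick the healthy agent nearest to the damaged agent.
// positions[i] = {x, y} for agent i; damaged is the index of the faulty agent.
public class RepairCoordinator {
    public static int chooseRepairer(double[][] positions, int damaged) {
        int best = -1;
        double bestDist = Double.MAX_VALUE;
        for (int i = 0; i < positions.length; i++) {
            if (i == damaged) continue; // the damaged agent can't repair itself
            double dx = positions[i][0] - positions[damaged][0];
            double dy = positions[i][1] - positions[damaged][1];
            double d = Math.hypot(dx, dy);
            if (d < bestDist) {
                bestDist = d;
                best = i;
            }
        }
        return best;
    }
}
```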

A video of the removal can be seen below:
Swarm of Mapping Automatons - SOMA (2007)

I took the Senior Design course at UCSC in my third year, mostly because I'd been taking classes with the people a year ahead of me and it would be my last chance to work with many of them. Six of us formed a team with a lot of ambition and a fair amount of crazy and decided to build a platform for swarm research. Over the course of 20 weeks we designed and built four robots from the ground up. The robots used an iRobot Create as the base, which gave us wheels and bump sensors. On top of the Create we added several layers of acrylic and circuit boards. These layers added RF communication, ultrasonic ping sensors, and a system for localization between the robots using time-of-flight measurements between IR and ultrasonic signals. The robots would take turns moving through a space and mapping it out with their ping sensors. The data was transmitted between the robots and to a PC, which would build a map from the data collected. Each robot acted fully autonomously without any central control.
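The time-of-flight localization trick works because the IR pulse arrives effectively instantly while the ultrasonic pulse travels at the speed of sound, so the delay between the two detections gives the inter-robot distance. A hedged sketch of that calculation, using a textbook speed-of-sound constant rather than the exact values from our firmware:

```java
// Hedged sketch of IR/ultrasonic time-of-flight ranging: one robot emits an
// IR pulse and an ultrasonic ping simultaneously; a listening robot measures
// the delay between detecting the two and converts it to distance.
public class TofRanging {
    static final double SPEED_OF_SOUND_M_PER_S = 343.0; // dry air, ~20 C

    // deltaMicros: microseconds between IR detection and ultrasonic detection.
    public static double distanceMeters(long deltaMicros) {
        return SPEED_OF_SOUND_M_PER_S * (deltaMicros / 1_000_000.0);
    }
}
```

With several robots exchanging these range measurements, each one can place its neighbors relative to itself without any central tracking system.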

Full details can be found on the project website for The SOMA Project.

Another source is our Instructables page, full of less technical details.
Mechatronics (2006)

In my second year at UCSC I took a class which put me definitively onto the robotics path. The class was CMPE118, or Mechatronics. We spent six weeks discussing sensors, shear forces, state machines, motor specs, and everything else that goes into building a simple robot. With less than four weeks left in the quarter we were given the final project, a robotic duel. The official title was The Good, The Bad, and The Slugly. The goal was for two robots to start back to back on a playing field, drive to opposite ends, turn, and attempt to shoot a pop can off the other robot's head. In this class I met four of my future SOMA teammates. My partner Cathy and I are on the left with our robot, the Megadoomer.

A fairly inaccurate news clip on the event can be found here.

Erik Pasternak,
Apr 29, 2009, 11:29 AM