A MacMini Robot
Volume Number: 21 (2005)
Issue Number: 9
Column Tag: Programming
A MacMini Robot
Building A Vision Capable Robot Using A MacMini Computer
by Andrew Turner
Introduction
ADHOC/MacHack
For twenty years, the ADHOC/MacHack Conference has gathered groups of developers
and Apple users to push Macintosh computers to their limits. Along with the standard conference
events of speakers, sessions, banquets, and meetings, there is the infamous Showcase. For 48 hours
straight, Mac developers scurry to bend, twist, and otherwise manipulate the Apple operating system
and machines to perform never-before-seen tricks and feats. This may include the recently popularized
MegaMan effect, which recreates stars streaming across your screen when you launch an app,
reminiscent of the video game of the same name, or replacing your kernel-panic splash
screen. Whatever the hack may be, it's sure to be a lofty goal and, if the programmer is up to the
challenge, very impressive.
Our goal was to take the Apple hardware outside of its usual desktop location and put it on a
mobile platform and into the world.
Figure 1.
The Goal: a Mac powered robot
George Storm came to ADHOC with a mission in mind: to develop a Mac-powered robot.
More than just a moving box, this robot would be remotely controllable and capable of autonomous,
vision-based navigation. The result was a small but impressive display of what modern Apple and
Mac-related hardware can be combined to do.
The bTop Robot is a battery-powered, wheeled chassis centered around an Apple Mac mini computer.
George had built the chassis system at his office in Washington over the previous six months. Input
from the outside world arrives via a USB hardware interface board, a FireWire iSight camera, and a
remote desktop connection for user driving. Software onboard the mini allows a user to drive the
robot directly or to activate a vision mode, in which the robot uses the iSight camera to track
bright green objects. For our showcase demonstration, we wrapped bright green gaffer's tape around
George's ankle. As he walked down the aisle to the center stage, the robot followed along like an
obedient puppy.
The rest of the article explains the various components that comprise the bTop Robot, as well as
our experiences developing software for it during our 48-hour frenzy. Lastly, some overall lessons
and targets for future development on the platform are presented.
Hardware Interface
The bTop Robot, like any physical robot, is a large combination of hardware components that are
interfaced and driven via software on an onboard computer. Each sensor acts as a sense feeding data
to the central computer, or brain. The brain, in turn, drives the various motors that move the robot
about in the world.
The Brain
The central processing unit of the bTop Robot is a Mac mini. Using a Mac mini as the onboard
processor provides power rarely found in embedded systems: a full on-system development
environment, easy network access, and a large array of input/output options.
To facilitate easy switching between wall power and the onboard battery system, the mini's power
cable was cut and replaced with a simple BNC-type twist-lock connector. During long development
sessions, the mini was plugged into wall power and the entire chassis propped up so the powered
wheels could spin freely. For testing and operation, the mini's power adapter was plugged into an
onboard battery pack, which lasted up to 4 hours under minimal use and an expected 45 minutes under
heavy use.
While working on the computer, a monitor, keyboard, and mouse were plugged in. This allowed for a
very quick development cycle, as the full Xcode compiler and debugger could be run on the targeted
hardware. For testing, the power adapter was switched to the onboard battery and the remote desktop
application Timbuktu was used to operate the computer; this introduced mild user-interface lag and
numerous problems with restricted ports on the hotel's wireless network. VNC would also have been a
good choice for a remote desktop and might have presented fewer problems with network ports.
bTop-1 controller board
While the mini is an amazingly powerful controller for a robot, it lacks decent low-level
hardware device input and output. Historically, hardware developers would use a serial port or
parallel port for easy interfacing with a computer; modern USB and FireWire ports don't afford such
convenience.
Fortunately, Perfectly Scientific (http://perfsci.com/) has developed a hardware I/O board
specifically targeted at the Mac platform. The bTop-1 board interfaces to the computer via a USB
2.0 connector and provides the following hardware ports:
- 8 Analog-Digital (A/D) inputs (12-bit, 0~5V)
- 8 Digital-Analog (D/A) outputs (8-bit, 0~2.56V)
- 16 Digital Input/Output ports (0~5V input, 0~3.3V output)
The board connections are rows of double-row header pins. There is also a convenient
screw-terminal block that can be used for quick connections to other components. It is a good idea
to wire each of the I/O ports to a separate board holding the rest of the electronics, so that the
bTop-1 board can be removed easily.
The bTop Robot has an electronics tray above the Mac mini, with the bTop board mounted
vertically alongside the mini. Connections from the bTop board to the electronics tray are made
with a 16-pin ribbon cable that allows easy connecting and disconnecting. The channels on the
electronics tray are then split out to the appropriate subsystems.
George Storm has developed a comprehensive Objective-C API that allows easy interfacing to the
board's ports from within Cocoa applications. The API and example applications are available on
Perfectly Scientific's website. The following code snippet illustrates how to initialize the board
and set the digital and analog output ports. It is important to note that the analog output is
12-bit, and therefore uses decimal values of 0~4095 (0 to 2^12 - 1) corresponding to voltages of
0~5 Volts.
Figure 2.
// Set our bTop board-level observer to this object
[bTopBoard setBTopBoardObserver:self];

// Set direction of Digital I/O port B to all outputs
[bTopBoard setPortBit:'B' direction:1];

// Set analog output ports 0 and 1 to 2 Volts and 5 Volts, respectively
// (1638/4095 * 5V = 2V; 4095/4095 * 5V = 5V)
[bTopBoard setADOut:0 value:1638];
[bTopBoard setADOut:1 value:4095];

// Send a refresh for the board to perform all switching
[bTopBoard refreshADPortValues];
[bTopBoard refreshDigitalPortValues];
Reading from the board is performed by implementing an observer method; the observer is
registered using the setBTopBoardObserver: call illustrated above. The observer method receives the
analog input values as an array parameter, which can be iterated through to extract the values.
These input values are encoded at 8 bits, and therefore vary from 0~255 (0 to 2^8 - 1) over 0~2.56
Volts:
- (void)bTop:(BTopBoard *)board receiveADData:(UInt16[8])values
{
    UInt8 count;
    UInt8 i;

    for (count = 0; count < 8; count++)
    {
        for (i = 0; i < NUM_DISTANCE_SENSORS; ++i)
        {
            if (count == m_distanceSensorAddress[i])
                m_distanceSensor[i] = values[count] / 255.0;  // normalize to 0.0~1.0
        }
    }

    [distanceRadarView setRadarLengths:m_distanceSensor];
}
Overall, the bTop board was an incredibly easy device to use for interfacing the Mac with
external sensors. There was little complication in reading and controlling all of our sensors and
actuators from a simple Objective-C API.
During development, we cross-connected several of the analog outputs with the analog inputs, and
did the same with the digital inputs and outputs. Then, as we exercised the virtual motors, we
observed the response on the expected sensor ports. After this initial debugging, we connected the
electronics tray to the bTop board and ran more tests on the actual robot.
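As a rough illustration, such a loopback check needs nothing beyond the observer pattern shown
earlier. The sketch below is our reconstruction rather than the code we ran; the channel number and
test value are arbitrary, and since the output and input use different count scales, it simply logs
the echoed reading for inspection.

// Hypothetical loopback check: analog output 3 cross-wired to analog input 3
#define LOOPBACK_CHANNEL 3

- (void)startLoopbackTest
{
    [bTopBoard setBTopBoardObserver:self];
    [bTopBoard setADOut:LOOPBACK_CHANNEL value:2048];   // roughly mid-scale
    [bTopBoard refreshADPortValues];
}

- (void)bTop:(BTopBoard *)board receiveADData:(UInt16[8])values
{
    // Eyeball the reading; it should sit near the input's mid-scale
    NSLog(@"Loopback channel %d reads %u counts",
          LOOPBACK_CHANNEL, (unsigned)values[LOOPBACK_CHANNEL]);
}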
Drivetrain
The bTop Robot is driven by a pair of independently powered motors, each connected by chain
drive to a 5-inch diameter knobby-tired wheel. A small third caster wheel is mounted on the back of
the bTop Robot for stability.
Each wheel is made reversible by a simple switching circuit controlled through a bTop board
digital I/O port. After choosing forward or backward, a 0~5 volt signal is sent to the motor to
drive it. Experience taught us that the motors need at least 1V to begin turning. Furthermore, when
testing we propped the robot up on various objects to let the wheels spin freely; with no load on
the drivetrain the motors tended to chatter. This chatter didn't occur in actual driving tests.
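To make the drive command concrete, here is a minimal sketch of how one wheel's command might be
formed, mapping a normalized speed onto the usable 1 V to 5 V band before converting to output
counts. The port assignments are assumptions, and the direction-setting call is hypothetical; the
article's API examples show only the calls in the earlier snippet.

// A sketch only: port assignments and the direction call are hypothetical
#define MOTOR_THRESHOLD_V 1.0    // motors stall below roughly 1 V
#define MOTOR_MAX_V       5.0
#define DA_FULL_SCALE     4095   // 12-bit counts for 5 V

- (void)driveLeftWheelAtSpeed:(double)speed forward:(BOOL)forward
{
    // speed is 0.0~1.0; map it onto the usable 1 V to 5 V band
    double volts = (speed <= 0.0) ? 0.0
                 : MOTOR_THRESHOLD_V + speed * (MOTOR_MAX_V - MOTOR_THRESHOLD_V);
    UInt16 counts = (UInt16)(volts / MOTOR_MAX_V * DA_FULL_SCALE + 0.5);

    // Hypothetical call: flip the reversing circuit via a digital output bit
    [bTopBoard setDigitalOut:'B' bit:0 value:(forward ? 1 : 0)];

    [bTopBoard setADOut:0 value:counts];   // analog speed signal on output 0
    [bTopBoard refreshDigitalPortValues];
    [bTopBoard refreshADPortValues];
}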
Sensors
Our desire was to give the Mac robot multiple sensory inputs in addition to the very powerful
iSight vision input. On hand we had several IR distance sensors, a line-following sensor, and an
accelerometer.
The accelerometer is a sensor that measures local acceleration along a single axis. This is
useful for obtaining a rough measure of a body's pitch or roll, or the take-off acceleration of a
moving vehicle. The bTop Robot is fairly slow and low to the ground, exhibiting minimal body pitch
or roll and only small accelerations, so we chose not to use the accelerometer onboard.
The distance sensors have dual LEDs in a single package. One LED sends out an infrared signal,
which is reflected by surfaces in front of the sensor and measured by the other LED. The output is
a voltage corresponding to the distance to the reflecting object. The bTop Robot has six of these
sensors: three on the front bumper spaced 30 degrees apart, and three on the rear bumper spaced 30
degrees apart.
Lastly, we were also hoping to have bTop perform some line-following maneuvers. Using sensors
similar to the distance sensors, the LEDs would track on either side of a line; if the robot, and
consequently the sensors, crossed the line, the sensor would indicate this. The robot's logic would
then turn according to the activated LED and get the robot back on the line, as in the sketch
below. Due to the time constraints of the 48-hour hack marathon, we were unable to implement this
functionality before the final showcase. The hardware exists on the chassis and only awaits
integration with the software that drives the bTop Robot.
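A minimal sketch of what that unimplemented logic might look like, assuming one reflectance
sensor on each side of the line (the names and structure here are ours):

typedef enum { SteerStraight, SteerLeft, SteerRight } Steer;

// leftOnLine/rightOnLine would come from the two line-following LEDs
static Steer LineFollowSteer(BOOL leftOnLine, BOOL rightOnLine)
{
    if (leftOnLine && !rightOnLine)
        return SteerLeft;     // drifted right: the left sensor crossed the line
    if (rightOnLine && !leftOnLine)
        return SteerRight;    // drifted left: the right sensor crossed the line
    return SteerStraight;     // straddling the line: keep going
}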
Figure 3.
Vision
Our primary, and lofty, goal of a vision-controlled robot was realized through the power of a
full onboard computer and a FireWire camera. The result was a vision system that could identify,
locate, and communicate the position of brightly colored objects. Based on the materials available,
we chose neon green gaffer's tape as the target color for the vision system to track.
Looking for colors
Each year the ADHOC/MacHack showcase demonstrates some great techniques for performing tasks on
a Mac, and sometimes the original developer or new developers build upon previous years'
inspiration and codebase. Last year, Lisa Lippincott developed a hack called 'ScrollPlate',
designed to compensate for the lack of a scroll wheel on a PowerBook laptop. Using the iSight
camera and a paper plate with a red up-arrow on one side and a green down-arrow on the other, the
program would control a window's scroll bar. When the user showed the red up-arrow, the window
would scroll up. Conversely, when the user showed the iSight camera the green down-arrow, the
window would scroll down. Another ingenious, but mostly pointless hack; well done.
ScrollPlate's source code had already solved many of the problems we would face integrating
vision onboard the bTop Robot. The largest hurdle, we found out, was integrating the
processor-intensive Carbon code into the Cocoa robot controller application while still leaving the
computer enough processing time to drive the motors.
ScrollPlate itself was built upon modifications to an Apple demonstration application that
identified lobsters in the iSight camera's view. Our first modification was to specify the color we
wanted to track. Using a photograph of the gaffer's tape and an eye-dropper tool, we obtained RGB
values for the specific green. We then converted these values to YUV, a parameter space in which it
is easier to allow ranges corresponding to lighter and darker variations of the green tape under
changing external lighting.
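The article doesn't record which exact conversion was used, but a common choice is the BT.601
mapping sketched below. Brightness variation lands mostly in the Y component, so a match can use
tight bounds on U and V and a loose bound on Y:

// Convert normalized RGB (0.0~1.0) to YUV using BT.601 coefficients
static void RGBToYUV(double r, double g, double b,
                     double *y, double *u, double *v)
{
    *y = 0.299 * r + 0.587 * g + 0.114 * b;   // luma (brightness)
    *u = 0.492 * (b - *y);                    // blue-difference chroma
    *v = 0.877 * (r - *y);                    // red-difference chroma
}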
Telling where the color is
After we determined the color we were looking for, and the acceptable range of variation in that
color, we had to determine where the color was in the frame, and then where it was in the world.
After some 'precise' measurements, made with large pieces of paper and a ruler, we determined
that the iSight's field of view is 33 inches wide at a distance of 42 inches. A conversion function
allowed us to map a location in the pixel coordinates of the vision frame to an approximate angle
(in radians) in the real world.
static double PixelToAngle(int pixel, int pixelFieldWidth)
{
    /* We measured a field of view 33 inches wide at 42 inches away.
       The pixel offset from frame center, as a fraction of the frame
       width, times 33/42 gives the tangent of the bearing angle. */
    double fieldWidth = 33.0 / 42.0;
    return atan( ((double)pixel - pixelFieldWidth / 2.0)
                 / pixelFieldWidth * fieldWidth );
}
After the pixel center and bounding box were converted to angles, these values were sent to the
robot controller, which determined the appropriate driving behavior.
There was some difficulty with the vision system when there was insufficient lighting on the
tracked object, or when other green objects showed up in the camera's sight. For example, the robot
once began to veer wildly away during testing. After some investigation, we found a leftover roll
of gaffer's tape sitting on a box about 15 feet away, toward which the robot had begun heading.
Semi-Autonomous Control System
User-controlled driving
For testing, and general mayhem, we implemented a driver-in-the-loop controller allowing remote
operators to drive the bTop Robot around. The robot steers like a tank, with two parallel wheels
that operate in tandem to drive forward, turn, or reverse the chassis. To make driving easy and
natural for an untrained operator, a simple point-and-click interface was designed.
Since the project was under a time crunch, and for ease of development, an onboard user
interface was developed and controlled through the Timbuktu remote desktop. A custom interface for
controls and sensors was built by placing a CustomView in Interface Builder and drawing the
interface with Quartz commands. We decided that building a custom interface was a better solution
than attempting to modify a common control to display the information meaningfully.
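To give a flavor of the approach, here is a minimal custom view of the sort described; the class
name and drawing details are ours, not the original code:

#import <Cocoa/Cocoa.h>

// Hypothetical custom view: fills its bounds and strokes the steering circle
@interface SteeringView : NSView { }
@end

@implementation SteeringView
- (void)drawRect:(NSRect)rect
{
    [[NSColor blackColor] set];
    NSRectFill([self bounds]);

    // Inset the steering circle slightly from the view edges
    NSBezierPath *circle =
        [NSBezierPath bezierPathWithOvalInRect:NSInsetRect([self bounds], 4.0, 4.0)];
    [[NSColor greenColor] set];
    [circle stroke];
}
@end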
Figure 4.
A user clicks and drags in the steering circle to operate the robot. Dragging forward and to the
left or right drives the robot forward, with simple turning achieved by slowing down the wheel on
the inside of the turn; to make a left turn, for example, the left wheel is slowed down. The same
effect happens when the user drags down in the steering circle, but in reverse.
Since the robot's two wheels are mounted almost across its center of gravity, spin maneuvers are
possible. Clicking and dragging horizontally to the left or right of the steering circle makes the
robot perform an in-place spin maneuver.
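This is the classic differential-drive mix; the following sketch reconstructs the behavior just
described (the normalized inputs and outputs are our convention, not the original code):

// x, y: drag offsets from the circle's center, each normalized to -1..1.
// Outputs are normalized wheel speeds; negative means reverse.
static void MixSteering(double x, double y,
                        double *leftWheel, double *rightWheel)
{
    // The forward/back component drives both wheels; the turn component
    // slows the wheel on the inside of the turn. A pure horizontal drag
    // (y == 0) yields opposite wheel speeds: an in-place spin.
    *leftWheel  = y + x;
    *rightWheel = y - x;

    // Clamp to the drivable range
    if (*leftWheel  >  1.0) *leftWheel  =  1.0;
    if (*leftWheel  < -1.0) *leftWheel  = -1.0;
    if (*rightWheel >  1.0) *rightWheel =  1.0;
    if (*rightWheel < -1.0) *rightWheel = -1.0;
}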
For safety, and ease of operation, when the user releases the mouse button the robot is sent a
stop command and the steering circle is reset. One development easter egg did leave a kind of
'cruise control': if the user clicked and dragged within the steering circle, kept dragging the
mouse pointer out of the circle, and then released, the robot would keep executing the same command
and continue on its way. Single-clicking within the steering circle would then send a stop command
and bring the robot to rest.
The user interface also displayed the output of the robot's front and rear distance sensors. The
triangular bars correspond to distance and change from green, to yellow, to red as an obstructing
object gets closer to the robot.
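A sketch of that color mapping, with threshold values that are our guesses rather than the
original tuning:

// d is a normalized distance reading: 0.0 = touching, 1.0 = maximum range
static NSColor *RadarColorForDistance(double d)
{
    if (d < 0.25) return [NSColor redColor];      // dangerously close
    if (d < 0.50) return [NSColor yellowColor];   // approaching
    return [NSColor greenColor];                  // clear
}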
Vision-controlled driving
The button on the bottom left of the user interface invokes the vision control. Due to the high
processing load, the limited frame rate of the remote desktop through Timbuktu, and perhaps other
factors, it was not possible to control the robot via the driving interface once the vision mode
was invoked.
Figure 5.
The maximum bounds of the green object, as well as the center of those bounds, were then sent to
the robot controller. Careful testing, performed sometime in the middle of the night, showed that
the acceptable range of object widths topped out at 0.3 radians, or about 17.2 degrees. When the
object took up more than this width of the image, the robot would stop. Between widths of 0.3
radians and 0.05 radians, the robot scaled its velocity linearly.
Robot heading was determined by the offset of the center of the target color. Between +/- 0.3
radians, the robot scaled its heading linearly as a percentage of the maximum range. Beyond 0.3
radians, when the object was on the far left or right of the image, or even out of range, the robot
would stop and spin to face the target better.
Once a heading and velocity were determined from the vision system, the commands were sent to
the same low-level controller used for human driving.
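Putting the numbers above together, the control law amounts to something like the following
sketch (our reconstruction; the constants come from the text, the function shape is assumed):

#define STOP_WIDTH  0.30   // object width >= 0.3 rad: close enough, stop
#define MIN_WIDTH   0.05   // object width <= 0.05 rad: drive at full speed
#define MAX_BEARING 0.30   // center beyond +/- 0.3 rad: stop and spin

// widthRad: angular width of the tracked object; centerRad: its bearing,
// 0 = dead ahead. Outputs are normalized velocity and heading commands.
static void VisionDriveCommand(double widthRad, double centerRad,
                               double *velocity, double *heading)
{
    if (widthRad >= STOP_WIDTH)
        *velocity = 0.0;                     // target reached
    else if (widthRad <= MIN_WIDTH)
        *velocity = 1.0;                     // target far away: full speed
    else                                     // scale linearly in between
        *velocity = (STOP_WIDTH - widthRad) / (STOP_WIDTH - MIN_WIDTH);

    if (centerRad > MAX_BEARING || centerRad < -MAX_BEARING) {
        *velocity = 0.0;                     // stop and spin to face the target
        *heading  = (centerRad > 0.0) ? 1.0 : -1.0;
    } else {
        *heading  = centerRad / MAX_BEARING; // percentage of the maximum range
    }
}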
Robot in Motion
The ADHOC/MacHack showcase begins 48 hours after the conference opens, which puts it at midnight
on Friday night. While showcase hacks should be complete before the showcase begins, technically a
hack isn't due until right before the showcase ends, a time that varies with the number of entries.
Like any good software project, we utilized all the time available to us, including time well
past the original deadline. Our development team was still writing and debugging the bTop Robot
during the showcase presentations. In fact, the last bug in our software was due to a behavior of
the Objective-C compiler, which emits only a warning, not an error, when a C++ header file is
missing and unknown C functions are called. With several minutes to go, this last bug was figured
out and the bTop Robot finally tested.
As the last hack of the night to be demonstrated, the stage was set for our entrance. We dimmed
all the lights, which also helped the vision system, and shone a single flashlight on George
Storm's ankle, where the gaffer's tape was fastened, as he walked down the aisle. Scooting along
behind, the bTop Robot made its way down the aisle to the center stage.
The robot system performed excellently along a very cluttered aisle of chairs, bags, and other
objects, even traversing several cables that lay across its path. The following night, at the
awards banquet, the bTop Robot received 3rd place in the open-vote judging, earning its developers,
and the Mac mobile platform, well-deserved honors.
Future Work
Putting together the bTop Robot at ADHOC/MacHack was a terrific experience. Having a group of
knowledgeable developers and technicians willing to assist wherever possible was a great boon to
assembling the system in such a short timeframe. However, due to the limited time, many features
were left out and many corners were cut in developing the vision and control systems.
For the future, we would like to clean up the code that controls the robot and move the user
interface to a remote computer running as a client, connecting to the Mac mini as a server. This
would eliminate the need for a remote desktop tool like Timbuktu or VNC to operate the robot.
Several sensor systems were not fully utilized, such as the distance sensors and line-followers.
We would like to integrate these into the robot with some safety mechanisms ("Don't run into any
objects") as well as advanced control scenarios ("Follow this line", "Track along the wall"). There
are also many other sensors that would easily integrate onto the chassis and connect through the
bTop board. Perhaps implementing a plug-in architecture in the software would allow easy adding and
removing of hardware interface components.
Lastly, while the ScrollPlate software performed adequately for the hack demonstration, its
frame rate was limited to 5 FPS and it is difficult to modify. It may be possible to build a better
vision system using a Quartz Composer composition or Core Image within the Cocoa application. The
bTop Robot could then track different colors, light and dark areas, or distinctively shaped
objects.
Resources
ADHOC/MacHack Conference: http://adhocconference.com
Perfectly Scientific - makers of the bTop-1 hardware board: http://perfsci.com
bTop Robot Yahoo! Group: http://groups.yahoo.com/group/bTop_Robot
Andrew Turner is a Systems Development Engineer with Realtime Technologies, Inc (http://www.simcreator.com) and has built robotic airships,
automated his house, designed spacecraft, and in general looks for any excuse to hack together cool
technology. You can read more about his projects at http://www.highearthorbit.com.
George Storm has been a Macintosh developer since six months after its release, primarily
working on frameworks and GUIs. Prior to his programming career he worked for more than ten years
in electronic R&D in a wide variety of disciplines, and he holds three patents for mechanical
devices related to handicapped access.
Lisa Lippincott designs hot new internet technology, even now that it's unfashionable. Since 1996
she's been designing it at BigFix, finding ways to deliver information about computer problems to
people who have problematic computers.