Dusty: A Low-cost Teleoperated Robot that Retrieves Objects from the Floor

People with motor impairments have consistently ranked retrieving dropped objects from the floor as a high-priority task for assistive robots. Motor impairments can both increase the chances that an individual will drop an object and make recovery of the object difficult or impossible. In a study we conducted, participants with motor impairments reported dropping objects an average of 5.5 times a day. When a caregiver was present, recovery took approximately 5 minutes; when no caregiver was present, recovery could be delayed by as long as two hours. To meet this need, we have been developing an inexpensive teleoperated robot since 2008 that can effectively retrieve dropped objects. We call this robot “Dusty”.

Dusty II


Dusty is now in its second generation. For convenience, we continue to use the iRobot Create as its mobile platform. Its new end-effector can pick up almost any household object weighing under one pound. Dusty is controlled with a wheelchair joystick. The user can fetch an object from the floor by driving the robot to a position roughly in front of the object and pressing a button. Dusty then autonomously moves forward and grasps the object with its end-effector. The user can then drive the robot back and press the lift button, and Dusty will raise the object to a comfortable height for the user.

We are currently conducting a user study with ALS patients. The preliminary results are encouraging, and we plan to release the data soon.

Dusty I

Dusty was initially inspired by the dustpan and the kitchen turner, which is why we named it “Dusty”. Although Dusty I was not a fully functioning robot prototype, we were able to test its novel end-effector. This end-effector, which was later modified and used in Dusty II, was already able to grasp 34 objects from the prioritized lists[*] on 4 types of flooring, with a success rate of 94.7%. (The new end-effector of Dusty II has a success rate of over 97%, over a much larger area.)




# Publish ADC data over ROS.
# One example of visualizing the data:
# rxplot /arduino/ADC/data[0] -b 15 -r100 -p5

import serial

import roslib; roslib.load_manifest('hrl_fabric_based_tactile_sensor')
import rospy
from hrl_msgs.msg import FloatArray

def setup_serial(dev_name, baudrate):
    try:
        # open the serial port at the requested baudrate
        serial_dev = serial.Serial(dev_name, baudrate, timeout=1.)
        if serial_dev is None:
            raise RuntimeError("robotis_servo: Serial port not found!\n")
        return serial_dev
    except serial.serialutil.SerialException:
        raise RuntimeError("robotis_servo: Serial port not found!\n")

def get_adc_data(serial_dev, num_adc_inputs):
    # read one comma-separated line of ADC values from the Arduino
    ln = serial_dev.readline()
    try:
        l = map(int, ln.split(','))
    except ValueError:
        l = []   # empty or garbled line
    if len(l) != num_adc_inputs:
        rospy.logwarn('Something fishy with the serial port.')
        l = get_adc_data(serial_dev, num_adc_inputs)
    return l

if __name__ == '__main__':

    dev_name = '/dev/ttyUSB1'
    #dev_name = '/dev/ttyACM0'

    baudrate = 115200

    serial_dev = setup_serial(dev_name, baudrate)

    rospy.init_node('adc_publisher')
    pub = rospy.Publisher('/arduino/ADC', FloatArray)

    # discard the first few lines, which may be incomplete
    for i in range(10):
        ln = serial_dev.readline()

    rospy.loginfo('Started publishing ADC data')

    while not rospy.is_shutdown():
        fa = FloatArray()
        fa.header.stamp = rospy.Time.now()
        fa.data = get_adc_data(serial_dev, 6)
        pub.publish(fa)

    serial_dev.close()



/*
  Analog serial reader
  by Tom Igoe
  Language: Arduino/Wiring

  Reads several analog inputs and sends their values out.
  This application assumes you have analog sensors attached
  to analog inputs 0 through 5. At the simplest, you can
  hook up five potentiometers and it'll work fine.

  Created 24 May 2006
  Modified by the Healthcare Robotics Lab, Georgia Tech
  11 July 2012
*/

// define the total number of analog sensors
// that you want to read
#define numberOfSensors 16

void setup() {
  // initialize the serial port at the baudrate
  // that the ROS publisher script expects
  Serial.begin(115200);
}

void loop() {
  // loop over the sensors:
  for (int thisSensor = 0; thisSensor < numberOfSensors; thisSensor++) {
    // read each sensor
    int sensorReading = analogRead(thisSensor);
    // print its value out as an ASCII numeric string
    Serial.print(sensorReading, DEC);
    // if this isn't the last sensor to read,
    // then print a comma after it
    if (thisSensor < numberOfSensors - 1) {
      Serial.print(",");
    }
    else {
      // after the last sensor, end the line
      Serial.println();
    }
  }
}
Manipulation with Whole-Arm Tactile Sensing

We are attempting to create a new foundation for robot manipulation that encourages contact between the robot’s arm and the world. A key aspect of our approach is that we assume the robot has compliant joints and whole-arm tactile sensing. To help promote the development of this approach, we have released the following open source code to control robot arms and open hardware that can cover robot arms with touch sensors. The code and hardware are products of our research funded by DARPA and the National Science Foundation (NSF).

Open Source Software: Control Robot Arms with Whole-Arm Tactile Sensing

Our open source software includes code to control robot arms to reach in clutter while keeping contact forces low, as described in our 2013 IJRR paper, “Reaching in Clutter with Whole-Arm Tactile Sensing”. It also includes code to interface with tactile sensor arrays and visualize their output, including all the code required to use our open hardware tactile sensors that we describe below. The code can be found via the following link: open source code to control robot arms with whole-arm tactile sensing

Documentation for the code can be found via the following link: documentation for the code

Open Hardware: Stretchable Fabric-Based Tactile Sensing

Our approach assumes that the robot arm has compliant joints and whole-arm tactile sensing (robot skin). Robots with compliant joints have become common in research, and commercial products exist. Yet, relative to compliant joints, whole-arm tactile sensing for robots is rare and the technology is less mature. To support our research, we have developed tactile sensor arrays that are easy to fabricate, easy to use, and inexpensive relative to other large tactile sensing arrays. This has enabled us to quickly prototype tactile sensor arrays and give multiple robots whole-arm tactile sensing.

Our main focus is not tactile sensor development, and our sensors have limitations. However, we have found that they are sufficient to support our research and enable robots to perform useful tasks, such as reaching near the body of a user. We have released our sensors as open hardware with the intention of supporting researchers and hobbyists, although anyone is welcome to use our designs. We hope that the sensors and accompanying software will make it easier for people to build on our research.

If you use our open-hardware tactile sensor design or wish to refer to it, please cite the following paper:

The main contributors to the development of this hardware were Dr. Advait Jain, Sarvagya Vaish, Jeff Hawke, and Prof. Charles C. Kemp. Dr. Advait Jain was the lead student for the first versions of the hardware for Cody and the PR2. Sarvagya Vaish was also a student contributor to the first versions. Sarvagya Vaish and Jeff Hawke were the main student developers of the advanced tactile array.

1. Basic Principle

A single sensing element or taxel (short for tactile pixel) consists of five layers of fabric. The layer in the middle is resistive fabric, which is sandwiched between two layers of conductive fabric. The taxel exhibits piezo-resistive properties. Specifically, the resistance between the two electrodes made of conductive fabric decreases when we apply a force on the sensor.


We use this taxel as part of a resistive voltage divider that we connect to the analog input pin of an Arduino board.
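For concreteness, the function below converts an Arduino ADC reading back into an estimate of the taxel's resistance. This is a minimal sketch, assuming a 5 V supply, the Arduino's 10-bit ADC, and a hypothetical 10 kΩ fixed resistor on the low side of the divider with the taxel on the high side; the resistor value and wiring in an actual sensor may differ.

// Minimal sketch (assumed wiring): taxel between Vcc and the analog pin,
// fixed resistor between the analog pin and ground.
const float VCC = 5.0;          // supply voltage (assumed)
const float R_FIXED = 10000.0;  // low-side resistor in ohms (assumed value)

float taxelResistance(int adcReading) {
  float vOut = adcReading * VCC / 1023.0;   // 10-bit ADC count -> volts
  if (vOut <= 0.0) return -1.0;             // open circuit / no reading
  // divider relation: vOut = VCC * R_FIXED / (R_FIXED + R_taxel)
  // solved for the taxel's resistance:
  return R_FIXED * (VCC - vOut) / vOut;
}

Because the taxel's resistance drops under load, vOut rises with applied force, so the raw ADC reading is already a usable monotonic force signal; the conversion above is only needed when calibrated resistance values are of interest.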


2. Steps to Make a Single Taxel


  • Stick a patch of iron-on adhesive to the conductive fabric.
  • Cut two pieces of conductive fabric to serve as the two electrodes: one for the ground plane and the other for the sensing area of the taxel.
  • Stick the conductive fabric to the non-conductive fabric using the iron-on adhesive.
  • Sew the edges using a zig-zag stitch. The glue will no longer hold the conductive fabric in place when the sensor stretches, but it makes sewing much easier.
  • Add female snap buttons to the two layers of conductive fabric. Connect wires to the conductive fabric using male snap buttons.
  • Put a layer of resistive fabric between the two layers of conductive fabric.


  • Wire up an Arduino as shown in the schematic below and upload this code to it.
  • Run this Python script after starting roscore. Visualize the output using rxplot with the following command:
rxplot /arduino/ADC/data[0] -b 15 -r100 -p5

3. Making a Tactile Sensor Array

To make an array of tactile sensors, we change one of the layers of conductive fabric to have multiple discrete electrodes of the desired shapes and sizes. For example, 15 taxels that are rectangles of size 3 cm × 5 cm.

The second layer of conductive fabric serves as a common ground for all the taxels.

We also use a single sheet of resistive fabric on top of the common ground.

3.1 Simple Tactile Sensing Array

To create a simple array of tactile sensors, we connect the taxels to snap buttons using conductive thread (as shown in the picture below), or put the snap buttons directly on the taxels and use normal wires (the method we used for the PR2). We then connect them to an ADC port on an Arduino board.

3.2 Advanced Tactile Sensing Array

To create an advanced array of tactile sensors, we attach snap buttons to the taxels, but instead of connecting them to an ADC port on an Arduino board, they are connected to an 8-channel ADC chip mounted on the skin. The I2C communication protocol is used to connect the ADC chips to the Arduino. Up to four ADC chips can be daisy-chained, and then a single set of power, ground, SCL, and SDA wires connects the ADCs to the Arduino. This helps reduce the wiring from the sleeves to the Arduinos.

The ADC chip, bought from SparkFun, is mounted on a custom PCB. The PCB also houses in-line pull-up resistors and connectors. The entire PCB is stitched onto the sleeve to prevent it from moving, as shown below. Since each ADC chip has eight channels, eight wires branch out from the PCB, connecting to the snap buttons on eight different taxels.
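To make the protocol concrete, here is a minimal sketch of how an Arduino could poll one of the daisy-chained 8-channel I2C ADC chips using the Wire library. The I2C address, the channel-select command byte, and the 2-byte big-endian response format are assumptions for illustration; the actual values depend on the particular ADC chip and are given in its datasheet.

#include <Wire.h>

const int ADC_ADDRESS = 0x48;  // assumed I2C address of one ADC chip

// Read one channel (0-7) from the ADC chip. The command byte and the
// 2-byte response format below are placeholders for the datasheet values.
int readAdcChannel(int channel) {
  Wire.beginTransmission(ADC_ADDRESS);
  Wire.write((byte)channel);     // assumed: command byte selects the channel
  Wire.endTransmission();
  Wire.requestFrom(ADC_ADDRESS, 2);
  if (Wire.available() < 2) {
    return -1;                   // read failed
  }
  int high = Wire.read();
  int low = Wire.read();
  return (high << 8) | low;      // assumed: big-endian 2-byte sample
}

void setup() {
  Wire.begin();                  // join the I2C bus as master
  Serial.begin(115200);
}

void loop() {
  // print all 8 channels as one comma-separated line,
  // matching the format the ROS publisher script expects
  for (int ch = 0; ch < 8; ch++) {
    Serial.print(readAdcChannel(ch));
    if (ch < 7) {
      Serial.print(",");
    }
  }
  Serial.println();
  delay(10);
}

Each daisy-chained chip would get its own I2C address (typically set by address pins), so covering all of the forearm taxels amounts to repeating the read over several addresses.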

4. Cody Specific Design

4.1 Simple Taxel Layout

To cover Cody’s arm with our tactile sensors, we used one sleeve with 12 taxels on the forearm and 12 taxels in the end-effector region.

Design Files

Click here for all SolidWorks files.
The following images show the taxel arrangement for Cody’s sleeve: the two-part sleeve outer cutout and the two-part sleeve taxel layout.
5. PR2 Specific Design

5.1 Simple Taxel Layout

To cover the PR2 arm with our tactile sensors, we have different designs for the gripper (10 taxels), forearm (14 taxels), and upper arm (3 taxels).



Design Files
  • The picture on top shows how the taxels are arranged.
  • The picture at the bottom shows how the cutouts for the ground plane, resistive fabric, and outer and inner coverings are arranged.

Gripper

Click here for all SolidWorks files.

Forearm

Click here for all SolidWorks files.

Upper Arm

Click here for all SolidWorks files.


5.2 Advanced Taxel Layout

For the advanced taxel layout, the number of taxels on the gripper and the upper arm did not change. The number of taxels on the forearm was increased to 22. This was done by adding two taxels along each of the four edges of the forearm, for a total of eight additional taxels.



Forearm Design Files

Click here for all SolidWorks files.
The following images show the taxel arrangement for the different sides of the forearm.
6. Raw Materials and Equipment

Here are the places where we purchased the different raw materials and equipment:

8. Support

This work was supported by DARPA Maximum Mobility and Manipulation (M3) Contract W911NF-11-1-603 and National Science Foundation (NSF) Award IIS-1150157.

9. Acknowledgements

Our design is inspired by the rSkin project. We thank Hannah Perner-Wilson and Ian Danforth for helpful discussions.
We also thank Matt Carney and Aaron Edsinger from Meka Robotics for pointing us to the rSkin project.

10. Useful Links

Assistive Mobile Manipulation for Older Adults at Home


The Results Are In: PR2 Beta Program Recipients!

(Willow Garage Announcement)


There is a growing need in society to enable older adults to remain in an independent living environment. Many older adults fear losing their independence and being required to move to an assisted living facility. From a societal perspective, it is cost-effective to support older adults’ preference to age in place. The economic implications of transitioning to full-time residential care settings are substantial, both to individuals and to society. Given current demographics, these costs are projected to increase exponentially. Older adults living in their own homes may be faced with situations in which there is a mismatch between the demands of their daily environment and their capabilities. These situations generally result from both increased demands (e.g., use of new medical devices) and deficits in the capabilities of the individual (e.g., age-related changes in cognition, perception, or movement control). There is great potential for robotics to support the needs of older adults — either directly or by supporting the activities of professional caregivers (e.g., nurses or physical therapists) who work in the homes of older adults.


The proposed research will consist of two closely integrated thrusts: one devoted to human-robot interaction and the other focused on software development. Both thrusts will be directed toward the development of assistive capabilities for the PR2 robot, with an emphasis on home care for older adults. These two thrusts are highly synergistic. The human-robot interaction (HRI) thrust will help ensure that the software development is closely connected to real-world needs, and the software development thrust will provide capabilities that both inform and enable cutting-edge studies of human-robot interaction.

An Interdisciplinary Team

Prof. Charlie Kemp, a leading researcher in the area of assistive mobile manipulation, will serve as the PI for the overall project and thrust leader for software development. Prof. Wendy Rogers, who is a leading researcher on technology and human factors for older adults, will be the thrust leader for human-robot interaction. They will lead a cross-disciplinary team of 17 researchers. Each of the four Co-PIs brings important expertise to the project. Prof. James M. Rehg is a leading expert in computer vision and machine learning, and is the Associate Director of Research in the Center for Robotics and Intelligent Machines. Prof. Andrea Thomaz is a pioneer in HRI and socially guided machine learning. Dr. Tracy Mitzner has almost a decade of experience conducting research on human factors and aging, with the goal of finding ways that technology can support aging. Brian Jones, the Director of the Aware Home Research Initiative, will provide supported access to the Aware Home, a free-standing home on the Georgia Tech campus that will house the PR2 for two 3-month periods so that we may conduct HRI studies and software evaluations in a realistic home environment.