Classification of Household Materials via Spectroscopy: Recognizing an object’s material can inform how firmly a robot should grasp the object during manipulation, or whether the object can be safely heated. We explore a technique for robots to estimate the materials of objects using spectroscopy. We demonstrate that spectrometers provide several benefits for material recognition, including fast sensing times and accurate, low-noise measurements. Furthermore, spectrometers do not require direct contact with an object.
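As an illustration of the recognition step (a minimal sketch, not the project's actual pipeline), a spectrometer reading can be matched against reference spectra by nearest neighbor after peak normalization. The wavelength bins, reference spectra, and intensity values below are all made up for illustration.

```python
# Hypothetical nearest-neighbor material classifier over normalized spectra.

def normalize(spectrum):
    """Scale a spectrum to unit peak so matching is robust to overall
    illumination intensity."""
    peak = max(spectrum)
    return [v / peak for v in spectrum]

def classify(reading, references):
    """Return the material label whose reference spectrum has the
    smallest sum-of-squared-differences to the reading."""
    reading = normalize(reading)
    best_label, best_dist = None, float("inf")
    for label, ref in references.items():
        ref = normalize(ref)
        dist = sum((a - b) ** 2 for a, b in zip(reading, ref))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Made-up 4-bin reference spectra for two materials.
REFERENCES = {
    "metal":   [0.9, 0.8, 0.7, 0.6],
    "plastic": [0.2, 0.5, 0.9, 0.4],
}

print(classify([0.45, 0.4, 0.35, 0.3], REFERENCES))  # prints "metal"
```

Because the reading is normalized before matching, a dimmer measurement of the same material (e.g., at a larger standoff distance) still maps to the same label.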
Deep Haptic Model Predictive Control: Robot-assisted dressing offers an opportunity to benefit the lives of many people with disabilities, such as some older adults. However, robots currently lack common sense about the physical implications of their actions on people. We present a deep recurrent model that, when given a proposed action by the robot, predicts the forces a garment will apply to a person’s body. Our model’s predictions use only haptic and kinematic observations from the robot’s end effector. Collecting training data from real-world physical human-robot interaction can be time consuming, costly, and put people at risk. Instead, we train our predictive model in an entirely self-supervised fashion using a physics-based simulation.
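The core idea can be sketched with a single hand-built recurrent cell (this is an illustrative toy, not the paper's learned network): given a proposed sequence of end-effector actions, the cell rolls a hidden state forward and emits a predicted contact force at each step, which a planner could compare against a safety limit. All weights and units below are made up.

```python
import math

def rnn_step(h, action, w_h=0.5, w_a=2.0):
    """One recurrent update: the hidden state accumulates the effect of a
    proposed end-effector displacement (hand-picked weights, not learned)."""
    return math.tanh(w_h * h + w_a * action)

def predict_forces(actions, w_out=10.0):
    """Roll the cell over a proposed action sequence and map each hidden
    state to a predicted garment contact force (arbitrary units)."""
    h, forces = 0.0, []
    for a in actions:
        h = rnn_step(h, a)
        forces.append(w_out * h)
    return forces

# Larger proposed displacements yield larger predicted forces, so a
# planner could reject action sequences whose predicted forces are too high.
gentle = predict_forces([0.01, 0.01, 0.01])
rough = predict_forces([0.10, 0.10, 0.10])
print(max(gentle) < max(rough))  # True
```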
Tracking Human Pose using Capacitive Proximity Sensing: While several robotic systems have explored robot-assisted dressing, few have considered how a robot can adapt to human motion in real time during dressing assistance. In addition, estimating pose changes due to human motion can be challenging with vision-based techniques since dressing is often intended to visually occlude the body with clothing. We present a method to track a person’s pose in real time using capacitive proximity sensing. This sensing approach gives direct estimates of distance with low latency, has a high signal-to-noise ratio, and has low computational requirements. We also show that a capacitive sensor is unaffected by visual occlusion of the body and can sense a person’s body through fabric clothing.
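To show why capacitive readings give direct distance estimates, here is a minimal sketch assuming an idealized parallel-plate model in which capacitance falls off roughly as 1/d; the calibration constant is made up and would in practice be fit from readings at known distances.

```python
# Hypothetical calibration constant (capacitance * distance) for the
# idealized model C = K / d; made up for illustration.
K = 0.12

def estimate_distance(capacitance):
    """Invert the idealized C = K / d model to estimate the distance
    between the sensor and the body from one capacitance reading."""
    return K / capacitance

# Doubling the measured capacitance halves the estimated distance,
# giving a direct, low-latency proxy for how close the body is.
print(estimate_distance(0.12))  # 1.0
print(estimate_distance(0.24))  # 0.5
```

Since the estimate is a single arithmetic operation per reading, this style of sensing has the low latency and low computational cost noted above, and fabric between the sensor and the body changes the reading far less than it changes a camera's view.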
Semi-Supervised Haptic Material Recognition: Material recognition enables robots to incorporate knowledge of material properties into their interactions with everyday objects. For instance, material recognition opens up opportunities for clearer communication with a robot, such as “bring me the metal coffee mug”, and recognizing plastic versus metal is crucial when using a microwave or oven. However, collecting labeled training data can be difficult with a robot, whereas many forms of unlabeled data could be collected relatively easily during a robot’s interactions. We present a semi-supervised learning approach for material recognition that uses generative adversarial networks (GANs) with haptic features such as force, temperature, and vibration.
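One common way a GAN discriminator exploits unlabeled data in this setting (a minimal sketch of the general semi-supervised GAN idea, not this paper's specific network) is to output K material classes plus one extra "fake" class: labeled samples train the K-way classifier, while unlabeled samples only require the real-vs-fake split, which is the total probability mass on the K real classes. The logits below are made up.

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def real_probability(logits_k_plus_1):
    """Probability a haptic sample is real: the mass on the K material
    classes, i.e., all outputs except the final 'fake' class."""
    probs = softmax(logits_k_plus_1)
    return sum(probs[:-1])

# Made-up logits over [metal, plastic, fake] for an unlabeled sample.
p = real_probability([2.0, 1.0, -1.0])
print(p > 0.9)  # strongly judged real
```

This structure is what lets unlabeled haptic interactions improve the classifier: every unlabeled sample still provides a training signal through the real-vs-fake objective.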
Anomaly Detection: We introduce a new anomaly detection method using multimodal sensing information for manipulation tasks. By using multiple sensory modalities, a robot could potentially detect a wider variety of anomalies, such as anomalous contact or a loud utterance. Our method models the spatiotemporal dynamics of the sensor information using hidden Markov models (HMMs) and trains a classifier using HMM-induced feature vectors.
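To illustrate one kind of HMM-induced feature (a minimal sketch, not the method's actual feature set), the forward algorithm can score a discretized multimodal observation sequence under a model of nominal executions; low log-likelihood sequences are candidates for flagging as anomalous. The two-state HMM and its probability tables below are made up.

```python
import math

# Hypothetical 2-state HMM over discretized sensor symbols 0/1.
START = [0.8, 0.2]
TRANS = [[0.9, 0.1], [0.2, 0.8]]
EMIT  = [[0.7, 0.3], [0.1, 0.9]]

def log_likelihood(obs):
    """Forward algorithm: log P(obs | HMM), usable as a feature for an
    anomaly classifier."""
    alpha = [START[s] * EMIT[s][obs[0]] for s in range(2)]
    for o in obs[1:]:
        alpha = [sum(alpha[p] * TRANS[p][s] for p in range(2)) * EMIT[s][o]
                 for s in range(2)]
    return math.log(sum(alpha))

# A sequence typical of the nominal model scores higher than an unusual
# one, so a classifier can flag low-likelihood executions.
nominal = log_likelihood([0, 0, 0, 0])
odd = log_likelihood([1, 1, 1, 1])
print(nominal > odd)  # True
```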
Haptic Perception: This work focuses on methods for haptic perception during ‘incidental contact’, by which we mean contact that is not central to the robot’s current actions and may occur unexpectedly or unintentionally. We investigate multiple sensing modalities (both contact and non-contact) to infer properties of the world during incidental contact, using both model-based and data-driven approaches. Here’s a link to all the datasets, hardware designs, papers, and other materials related to this project.
Handheld Data Acquisition Device: Data-driven approaches for haptic perception have shown promise, but suitable training data is lacking. To help address this challenge, we developed a portable handheld device for the efficient acquisition of haptic sensing data from objects in their natural settings. More details here.
Robot-assisted Dressing: The primary goal of this work is to develop assistive robotic technologies for the task of dressing humans. We plan to use efficient physics simulation and optimization tools to substantially automate the design of assistive robots for dressing. Our approach consists of three integrative technical components: assistive robot design, human modeling, and physics simulation.
Web-based Assistive Teleoperation: Assistive mobile manipulators (AMMs) have the potential to help persons with severe motor impairments perform a variety of assistive tasks, increasing independence and reducing the burden on caregivers. However, AMMs are complex systems with many sensors and actuators, and so are often difficult for non-experts to use, diminishing potential benefits and limiting adoption. Our ongoing work aims to overcome these barriers through user-centered design and semi-autonomous robotic capabilities, including the development of a robotic undo function to mitigate errors and thereby improve task performance.
Robot for Partnered Stepping: Our vision is to develop caregiver robots that interact fluidly and flexibly with humans during functional motor activities, while providing motor assistance, enhancement, and communication to facilitate motor learning. Our goal is to study human motor coordination during cooperative physical interactions with a humanoid assistive robot. We will use rehabilitative partner dance as a paradigm to examine a sensory-motor theory of cooperative physical interactions relevant to walking and other functional motor activities.
Autobed: The Autobed is a standard electric hospital bed modified so that it can be controlled through a web interface. By adding a host of sensors, we convert the bed into a basic robot that communicates its pose and occupancy status in real time over the Robot Operating System (ROS). In this closed-loop form, the Autobed can coordinate with a humanoid robot to better perform caregiving tasks with a patient.
Force and Thermal Sensing Skin: Robots could benefit from skin with both force and thermal sensing, such as when manipulating objects in human environments. We designed a novel fabric-based skin for robots that combines force and thermal sensing. The skin’s stretchable fabric-based design enables it to cover curved surfaces on the robot and conform to manipulated objects to improve sensing. Our design incorporates small self-heated temperature sensors on the surface of the skin that directly make contact with objects, thereby improving the sensors’ temporal efficiency. More details here.
Stretchable Fabric-based Skin: We are attempting to create a new foundation for robot manipulation that encourages contact between the robot’s arm and the world. A key aspect of our approach is that we assume the robot has compliant joints and whole-arm tactile sensing. To support our research, we have developed tactile sensor arrays that are easy to fabricate, easy to use, and inexpensive relative to other large tactile sensing arrays. This has enabled us to quickly prototype tactile sensor arrays and give multiple robots whole-arm tactile sensing. More details here.
Robots in Clutter: Clutter creates challenges for robot manipulation, including a lack of non-contact trajectories and reduced visibility for line-of-sight sensors. We demonstrate that robots can use whole-arm tactile sensing to maneuver within clutter, while keeping contact forces low.
Robots for Humanity: Henry Evans is a mute quadriplegic, having suffered a stroke when he was just 40 years old. Following extensive therapy, Henry regained the ability to move his head and use a finger, which allows him to operate computers. We are currently exploring ways for Henry to use a PR2 robot as his surrogate.
Assistive Mobile Manipulation: There is a growing need in society to enable older adults to remain in an independent living environment. Many older adults fear losing their independence and being required to move to an assisted living facility. The proposed research will consist of two closely integrated thrusts: one devoted to human-robot interaction and the other focused on software development. Both thrusts will be directed toward the development of assistive capabilities for the PR2 robot, with an emphasis on home care for older adults.
PR2 Playpen: This system is designed to allow the PR2 to manipulate multiple objects over long periods of time without human interaction. For five nights in July 2011, we ran the playpen with overhead grasping for 12 hours each night, during which the PR2 attempted around 800 grasps per night. In total, the system ran autonomously for 60 hours and attempted over 4,000 grasps. Data available here.
EL-E : An Assistive Robot: Objects play an especially important role in people’s lives. Objects within human environments are usually found on flat surfaces that are orthogonal to gravity, such as floors, tables, and shelves. EL-E is an assistive robot that is explicitly designed to take advantage of this common structure in order to retrieve unmodeled, everyday objects for people with motor impairments.
Robotic Nurse Assistant: There is a well-documented shortage of nurses and direct-care workers in the U.S. and around the world, which is expected to become more problematic as the older adult population grows and prepares for retirement. We believe robotics can play a role in assisting nurses to complete their daily tasks in order to provide better healthcare.
ROS Commander: We introduce ROS Commander (ROSCo), an open source system that enables expert users to construct, share, and deploy robot behaviors for home robots. A user builds a behavior in the form of a Hierarchical Finite State Machine (HFSM) out of generic, parameterized building blocks, with a real robot in the develop and test loop.
Biologically Inspired Assistive Robotics: Service animals have successfully provided assistance to thousands of motor-impaired people worldwide. As a step towards the creation of robots that provide comparable assistance, we present a biologically inspired robot system capable of obeying many of the same commands and exploiting the same environmental modifications as service dogs.
Haptic Teleoperation: We expect that haptic teleoperation of compliant arms would be especially important for assistive robots that are designed to help older adults and persons with disabilities perform activities of daily living (ADL). We did this work jointly with HumAnS Lab.
Robotic Playmates: Robotic playmates capable of physically manipulating toys have the potential to engage children in therapeutic play and augment the beneficial interactions provided by overtaxed caregivers and costly therapists.
Object Benchmark Set: Everyday household objects are ranked based on interviews with 25 ALS patients from the Emory ALS Center.
Clickable World Interface: We have developed a novel interface for human-robot interaction and assistive mobile manipulation. The interface enables a human to intuitively and unambiguously select a 3D location in the world and communicate it to the robot.
RFID in Robotics: Passive Ultra-High Frequency (UHF) RFID tags are well matched to robots’ needs. Unlike low-frequency (LF) and high-frequency (HF) RFID tags, passive UHF RFID tags are readable from across a room, enabling a mobile robot to efficiently discover and locate them.
Dusty – A Low-cost Teleoperated Robot: People with motor impairments have consistently ranked the retrieval of dropped objects from the floor as a high-priority task for assistive robots. To meet this need, since 2008 we have been developing an inexpensive teleoperated robot that can effectively retrieve dropped objects. We call this robot “Dusty”.