Introduction

There is great excitement surrounding data-driven techniques for perceptual classification, inference, and motor control. These techniques come to robotic manipulation with the promise of enabling behavior with greater robustness, performance, and adaptability, as well as suggesting new representations for physical interaction. Recent excitement in the lab, however, is tempered by the significant challenges faced when building practical data-driven robots. This workshop focuses on the challenges involved in making the data-driven approach work for robotic manipulation.

Robot manipulation is a useful "Petri dish" in which to study data-driven systems. Hands, or end-effectors, are where the "rubber hits the road": where robots make and break contact with the world, and where visual, tactile, and proprioceptive feedback combine. While interacting with the environment, the robot is exposed to a great deal of data: information that is challenging to collect, maintain, organize, and use. On one hand, we need an already functional robotic system just to start capturing data, and in the process the system is prone to degrading or breaking. On the other hand, the dynamics and perceptual feedback of robotic manipulation systems yield multi-modal data that is complicated to make sense of. The goal of this workshop is to identify the challenges that are preventing data-driven robotic manipulation from experiencing the same performance jump as other fields that have embraced it, and what we can do to overcome them.

Focused Workshop Topics

  • Multi-modality in Dynamics and Perception. Physical systems that make and break contact, that are exposed to frictional forces that stick and slip, and that in general undergo discrete reconfigurations, exhibit multi-modal behavior. This hybridness, which manifests as complex statistics over traces of states and perceptual cues, is challenging to learn with standard, well-understood smooth regression models (see the short sketch after this list).
  • Data Degrades and Physical Systems Break. In robotic manipulation, useful action and meaningful sensing are often intrusive and sometimes aggressive. Contact produces frictional and impact forces that over time can change or break the system, making its dynamics a moving target for identification and learning algorithms. The value of data can hence diminish over time.
  • The High Cost of Data. There is no shortcut around the fact that, to capture even a first datapoint, we need an already functional system that can engage in physical interaction in a purposeful and safe manner. Collecting a few datapoints requires being selective, accurate, and incremental for them to be meaningful. Collecting a large number of datapoints requires automation which, except in particularly constrained scenarios, might be as challenging as the original task.
  • Agreements and Disagreements between Physics and Data. General wisdom says that data and physics should work together, one providing accuracy and the other generality. In practice, however, it is challenging to combine them and resolve their disagreements without resorting to ad-hoc heuristics.
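As a rough illustration of the first topic above, the sketch below fits a single smooth model to data generated by a hypothetical stick/slip friction law. It is not drawn from any speaker's work, and all constants (stiffness, friction limit, noise level) are invented for the example; the point is simply that one global smooth regressor blurs the transition between contact modes.

```python
# Illustrative sketch only: a piecewise stick/slip law (hybrid behavior) fit with
# one global smooth polynomial. The stiffness k, friction limit f_max, and noise
# level are hypothetical values chosen for this example.
import numpy as np

rng = np.random.default_rng(0)
k, f_max = 50.0, 1.0  # hypothetical tangential stiffness and friction limit

def tangential_force(displacement):
    """Stick regime: elastic force k*x; slip regime: force saturates at +/- f_max."""
    return np.clip(k * displacement, -f_max, f_max)

# Noisy observations spanning both the stick and the slip regimes.
x = rng.uniform(-0.1, 0.1, size=200)
y = tangential_force(x) + 0.02 * rng.normal(size=x.shape)

# One global smooth model: a cubic polynomial fit by least squares.
coeffs = np.polyfit(x, y, deg=3)
residual = y - np.polyval(coeffs, x)

# The smooth fit rounds off the corner where stick gives way to slip, so its
# error concentrates near that mode transition at |x| = f_max / k.
x_star = f_max / k
near_transition = np.abs(np.abs(x) - x_star) < 0.01
print("RMSE overall:         %.3f" % np.sqrt(np.mean(residual**2)))
print("RMSE near transition: %.3f" % np.sqrt(np.mean(residual[near_transition]**2)))
```

A mixture of mode-specific models, or a learner that is aware of the hybrid structure, could fit the same data without blurring the switch; how to build such models for realistic manipulation data is exactly the kind of question this topic raises.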

Call for Contributions

We are soliciting 2-page abstracts on recent work that considers how robots can learn from empirical data sets. Accepted contributions will be presented during short spotlight talks and at the poster session. Exceptional submissions will be considered for oral presentation. Abstracts should be submitted as PDF files using the same format as for the main RSS conference submissions.
 
Important Dates:
  • submission deadline: June 20, 2017
  • notification: June 25, 2017
  • camera ready: July 3, 2017
  • workshop: July 16, 2017

Abstracts should be submitted by email to: rss2017.manipulation@gmail.com

(Awesome) Speaker Lineup

  • Pieter Abbeel (Berkeley)
    Pieter Abbeel has developed apprenticeship learning algorithms for helicopter aerobatics and enabled the first end-to-end system for picking up a crumpled laundry article and folding it. His current research centers on robotics and machine learning, with a particular focus on challenges in personal robotics, surgical robotics, and connectomics.

  • Siddhartha Srinivasa (UW)
    Siddhartha Srinivasa seeks to enable robots to robustly and gracefully interact with the world to perform complex tasks in uncertain, unstructured, and cluttered environments. He wants to make this interaction faster, safer, and more elegant, while involving simpler actuation. Talk's title: Integrating Models and Data for Robust Manipulation with and around People.

  • Kris Hauser (Duke)
    Kris Hauser's research includes robot motion planning and control, semiautonomous robots, and integrating perception and planning, as well as applications to intelligent vehicles, robotic manipulation, robot-assisted medicine, and legged locomotion. Talk's title: Can we quantify the hardness of learning manipulation?

  • Jan Peters (TU-Darmstadt)
    Jan Peters' research lies at the intersection of machine learning and robotics. His research has been recognized with multiple awards, and he is very active in organizing venues that bring members of both fields together. Talk's title: Machine learning for tactile manipulation.

  • Aude Billard (EPFL)
    Aude Billard is interested in the control and design of robotic systems that interact with humans. She pursues three research areas: a) control systems for teaching robots through human demonstration; b) neural and cognitive processes in human imitation learning; c) user-friendly human-computer interfaces to facilitate human-robot interaction. Talk's title: Reducing the high cost of data using human demonstrations: When is this a solution, and when not?

  • Byron Boots (GaTech)
    Byron Boots performs research in machine learning, artificial intelligence, and robotics with a focus on theory and systems that tightly integrate perception, learning, and control. He works on a range of problems including computer vision, system identification, forecasting, simultaneous localization & mapping, motion planning, and optimal control.

  • Jiaji Zhou (CMU)
    Jiaji Zhou is a PhD student in the CMU Manipulation Lab. He is interested in understanding the fundamental mechanics related to robotic manipulation as well as any technique that can improve autonomous robotic manipulation capabilities. Talk's title: Physics-inspired prior models for robotic manipulation.

Schedule

Topic 1: The High Cost of Data
9:30 Introduction by organizers
9:35 Speaker 1: Aude Billard
9:55 Speaker 2: Pieter Abbeel
10:15 Discussion co-directed by speakers 1 and 2
10:30 Coffee Break
Topic 2: Priors and model complexity
11:00 Introduction by organizers
11:05 Speaker 3: Jiaji Zhou
11:25 Speaker 4: Byron Boots
11:45 Discussion co-directed by speakers 3 and 4
12:00 Lunch break
Highlight Talks and Posters
14:00 Introduction by organizers
14:05 Highlight Talks from poster presenters
14:40 Poster session
15:00 Coffee Break / Poster session
Topic 3: Agreements and Disagreements between Physics and Data
15:30 Introduction by organizers
15:40 Speaker 5: Jan Peters
16:00 Speaker 6: Kris Hauser
16:20 Speaker 7: Siddhartha Srinivasa
16:40 Discussion co-directed by speakers 5, 6 and 7
17:10 Closing remarks

Invited Posters

Organizers

Supported by