IROS 2016 Workshop:
Closed-loop Grasping and Manipulation: Challenges and Progress
October 14
Abstract
Despite decades of research, robust autonomous robotic grasping and manipulation at a level approaching human skills remains an elusive goal. One reason is the difficulty in dealing with the inevitable uncertainties and unforeseen situations encountered in dynamic and unstructured application environments. Unknown information required to plan grasps such as object shape and pose needs to be extracted from the environment through sensors. However, sensory measurements are noisy and associated with a degree of uncertainty. Furthermore, object parameters relevant to grasp planning, such as friction and mass, may not be accurately estimated. In real-world settings, these issues can lead to grasp failures with serious consequences.
Additionally, physical interaction during object manipulation in many scenarios (e.g., daily life, industrial settings) can go well beyond grasping for picking and placing. Examples of such tasks include opening a door, screwing in a light bulb, using a tool, and writing. They may require actions such as caging, pushing, or sliding. Task accomplishment depends not only on the contact interactions between the object and the manipulator/hand/fingers, but also, in particular, on the exploitation of the environment. Recent studies addressing such tasks are scattered across many different topics: passive mechanical adaptation through compliant hand design, caging, pushing, extrinsic dexterity, exploiting environmental constraints, dynamic grasp adaptation, etc. Despite these efforts, there is still no generic, principled approach to encode varying task constraints simultaneously in order to deal properly with the various uncertain circumstances encountered during interaction. Skills to reason about task requirements, to ground them in sensory information, to devise reactive behaviors that respond compliantly to uncertainties, and to choose and execute optimal actions are crucial for reliable physical interaction such as object manipulation. In order to tame this complexity, most of these issues have been studied in isolation, and most object manipulation systems are still open loop. This greatly limits the applicability of such approaches in real-world settings.
This workshop focuses on task representations for object manipulation that enable multi-contact manipulation planning and reactive behaviors, which can eventually close the loop between planning and control and lead to robust manipulation that can deal with uncertainty. The aim is to connect researchers from different backgrounds such as planning, control, learning, design, and perception in order to lay the groundwork and define core open problems for object manipulation. Furthermore, we want to discuss the advantages, limitations, challenges, and progress of different approaches pertaining to the workshop topic.
Call for Papers
We welcome the submission of two-page extended abstracts describing new or ongoing work. Final instructions for poster presentations and talks will be available on the workshop website after decision notifications have been made. Accepted presenters will have the option of submitting a full-length six-page paper. All contributed papers will be accessible on the workshop website. Submissions should be in .pdf format. Please send submissions to Y.Bekiroglu[at]bham.ac.uk with the subject line "IROS 2016 Workshop Submission". For any questions or clarifications, please contact the organizers.
Workshop Agenda
The aim is to bring together researchers with different approaches to dealing with grasping and manipulation under uncertainty. The invited speakers have been selected to cover a wide range of methods in grasping and manipulation, and each will have a 20-minute slot to present their latest work relevant to the workshop theme. We will organize an interactive poster session, allowing more contributors to present their work and have detailed discussions with the workshop attendees. Authors of contributed papers will be invited to present a poster, and the papers will be published on the website. A panel discussion at the end will give participants the opportunity to discuss open questions on the state of the art in grasp and manipulation planning and the directions for new research efforts raised during the talks. We hope to include many contributors and have fruitful discussions between invited speakers, active contributors, and the audience.
Important Dates
Abstract submission deadline: September 23, 2016
Acceptance notification: September 30, 2016
Final materials due: October 7, 2016
Workshop date: October 14, 2016
Schedule
8:55 - 9:05 Opening
9:05 - 9:30 Invited Talk: Kenji Tahara
Title: A simple dynamic object grasping and manipulation controller and its applications
Abstract: The talk will discuss dynamic object manipulation methods based on a controller built around finger-thumb opposability. By exploiting rolling contacts at the fingertips, a simple grasping and manipulation controller can be designed. The basic idea of our approach will be shown, and its applications will be introduced to discuss the importance and usefulness of the dynamic manipulation controller.
9:30 - 10:00 Poster Teasers
Martin Saska, Tomas Baca, Vojtech Spurny, Embedded model predictive control technique for grasping of objects with unknown weight by team of MAVs
Ioannis Havoutis, Ajay Kumar Tanwani, Sylvain Calinon, Online Incremental Learning of Manipulation Tasks for Semi-Autonomous Teleoperation
Yevgen Chebotar, Karol Hausman, Oliver Kroemer, Gaurav S. Sukhatme, Stefan Schaal, Supervised Policy Fusion with Application to Regrasping
Karl Van Wyk, Comparative Peg-in-Hole Testing of a Force-based Manipulation Controlled Robotic Hand
Pedro Piacenza, Weipeng Dang, Emily Hannigan, Jeremy Espinal, Ikram Hussein, Ioannis Kymissis and Matei Ciocarlie, Tactile Sensing with Overlapping Optical Signals
Tetsugaku Okamoto, Kyo Kutsuzawa, Sho Sakaino, and Toshiaki Tsuji, Trajectory Planning Method for Object Manipulation Considering Dynamic Constraint with Object
10:00 - 10:30 Coffee and Posters
10:30 - 10:55 Invited Talk: Oliver Brock
Title: Manipulation Nouvelle
Abstract: Manipulation, including grasping, has long been dominated by point contacts, geometric models, fully actuated hands, and form/force closure. These concepts have brought about fundamental advances and form the theoretical basis of manipulation research today. Fueled by evidence from human grasping, research has called into question the exclusiveness or completeness of these concepts: instead of point contacts, large surface contacts have proven to increase robustness; instead of precise geometric models, simple geometric features have proven sufficient for successful grasping and manipulation; instead of fully actuated hands, underactuated, compliant hands seem to be the state of the art today; and instead of form/force closure, the notion of funnels describing interactions between hand, object, and environment seems to have taken center stage. In this talk, I will review these developments, describe their origins, and speculate on whether we are moving towards a fundamental change in manipulation: towards a "Manipulation Nouvelle."
10:55 - 11:20 Invited Talk: Yu Sun
Title: Robotic Grasping for Daily-Living Manipulation Tasks
Abstract: Traditionally, robotic grasping and manipulation approaches have been successful in planning and executing pick-and-place tasks without any physical interaction with other instruments or the environment. When robots move into our daily-living environment and perform a broad range of tasks in unstructured settings, all sorts of physical interactions will occur, resulting in an interactive wrench: force and torque on the instrument held in the robotic hand. This talk will introduce two grasp quality measures derived from two requirements for accomplishing a manipulation task: interactive wrench requirements and motion requirements. These requirements stem from the voluntary and involuntary physical interactions in instrument manipulation and are directly associated with the functionality of the instrument and the manipulation task, but independent of the robot hardware. Combined with hardware-independent grasping strategies extracted from human demonstration, including grasp type and thumb placement, optimal grasps can be located efficiently in a dramatically reduced hand configuration space.
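As a rough, hypothetical illustration of a wrench-requirement-based quality measure (not the measure presented in the talk), the following Python sketch scores candidate grasps by comparing the wrench a task requires against the wrench a grasp can resist; all names and numbers are illustrative placeholders.

import numpy as np

def wrench_requirement_score(required_wrench, max_resistible_wrench):
    """Return a quality score in [0, 1]: 1 means the grasp can resist the
    task wrench with a large margin, 0 means it cannot resist it at all.
    This is a simplified, assumed measure for illustration only."""
    required = np.linalg.norm(required_wrench)
    capacity = np.linalg.norm(max_resistible_wrench)
    if capacity <= 0.0:
        return 0.0
    # Margin-based score: how much of the grasp's capacity the task consumes.
    return max(0.0, 1.0 - required / capacity)

# Example: an illustrative task wrench of ~2 N force and ~0.5 Nm torque,
# compared against two made-up candidate grasps with different capacities.
task_wrench = np.array([2.0, 0.0, 0.0, 0.0, 0.0, 0.5])      # [Fx..Mz]
grasp_capacities = {
    "power_grasp":     np.array([8.0, 8.0, 8.0, 1.5, 1.5, 1.5]),
    "precision_grasp": np.array([3.0, 3.0, 3.0, 0.4, 0.4, 0.4]),
}
for name, capacity in grasp_capacities.items():
    print(name, wrench_requirement_score(task_wrench, capacity))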
11:20 - 11:45 Invited Talk: Kensuke Harada
Title: Planning Industrial Bin-picking Tasks
Abstract: In this talk, we introduce our recent results on motion planning for industrial bin-picking tasks. The talk covers both the basic theory and its application. We first show how to deal with uncertainty in object shape. Next, we show a learning-based method. Finally, we show an application of our planning method to an industrial scenario.
11:45 - 12:10 Invited Talk: Tetsuyou Watanabe
Title: Soft-hard hybrid fingertip structure improves stability and reduces uncertainty in object grasping
Abstract: Fingertip or contact softness can absorb uncertainties in object shape, object pose, contact impact, and contact deviation, and thus provide stable grasping. This is key for stably grasping unknown objects in unknown environments. However, this benefit comes with issues such as uncertainty in contact position and low contact forces. Hard fingertips have the opposite characteristics. With this in mind, a new soft-hard hybrid fingertip structure that combines the benefits of soft and hard fingertips was developed. This new structure will be introduced, and its working principle and control strategy will be presented.
12:10 - 12:35 Invited Talk: Tamim Asfour
Title: On the Duality between Grasping and Balancing
Abstract: The talk will discuss the parallelism between grasping and whole-body balancing. Inspired by the idea that a stable whole-body configuration of a humanoid robot can be seen as a stable grasp on an object, we present a taxonomy of whole-body poses and whole-body grasps, its validation based on human motion capture data, and its use to generate multi-contact whole-body grasps. Further, we show how co-joint object-action representations used for object grasping can be extended to associate whole-body actions/grasps with scene affordances. We demonstrate how affordance hypotheses are generated by visual exploration and verified using haptic feedback.
12:35 - 14:05 Lunch
14:05 - 14:30 Invited Talk: Todor Stoyanov
Title: Hand Posture Constraint Envelopes: Towards Reactive Grasp Acquisition Control
Abstract: Grasping systems that build upon meticulously planned hand postures rely on precise knowledge of object geometry, mass, and frictional properties – assumptions which are often violated in practice. In addition, relying on per-object databases of discrete pre-planned hand postures invariably leads to a loss of information in the form of redundant, yet equally suitable hand postures. When grasps are later selected at runtime, this may result in failures, depending on the sampling density and the complexity of the workspace configuration. In this talk, we will explore an alternative parametrization based on grasp envelopes: sets of constraints on gripper postures. We will discuss how the constraint-based formulation can easily be integrated into a control framework to achieve fast online trajectory generation. Finally, we will examine the role of perception in the grasp acquisition framework and present a fast method to adapt grasp envelopes to different target objects and workspace conditions.
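As a loose illustration of the envelope idea (an assumed interval-based representation, not the speaker's actual formulation), the Python sketch below encodes a grasp envelope as bounds on a gripper posture vector and shows a membership check plus a simple projection of the kind a reactive controller might use.

import numpy as np

class GraspEnvelope:
    """Axis-aligned interval constraints on a posture vector, e.g.
    [x, y, z, approach_angle]. Real envelopes may use richer constraint sets;
    this class is a hypothetical stand-in."""
    def __init__(self, lower, upper):
        self.lower = np.asarray(lower, dtype=float)
        self.upper = np.asarray(upper, dtype=float)

    def contains(self, posture):
        p = np.asarray(posture, dtype=float)
        return bool(np.all(p >= self.lower) and np.all(p <= self.upper))

    def project(self, posture):
        """Clip a posture onto the envelope -- a crude proxy for the
        constraint-based online trajectory generation discussed in the talk."""
        return np.clip(np.asarray(posture, dtype=float), self.lower, self.upper)

# Example: any approach within a 4 cm band and +/- 0.2 rad of yaw is acceptable.
envelope = GraspEnvelope(lower=[0.40, -0.02, 0.10, -0.2],
                         upper=[0.44,  0.02, 0.14,  0.2])
current = [0.45, 0.00, 0.12, 0.3]
print(envelope.contains(current))   # False: outside the envelope
print(envelope.project(current))    # nearest posture inside the envelope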
14:30 - 14:55 Invited Talk: Daniel Kappler
Title: Leveraging Big Data to Learn Stable and Discriminative Grasps
Abstract: In the data-driven methodology for robotic manipulation, predictive models are learned from labeled training data. Such methods typically infer the quality of grasp candidates under real-world conditions such as incomplete knowledge of the object as well as uncertainty in sensing and actuation. In contrast to the common classification approach to learning grasp stability prediction, we propose to formulate the problem as a ranking objective. We leverage an existing large-scale database containing grasps generated in simulation, annotated with a physics metric and simulated point clouds. For each point cloud, our ranking formulation optimizes to rank one successful grasp highest, thus reducing the learning complexity by focusing only on the most discriminative successful example for each point cloud and thereby increasing robustness to uncertainty in sensing.
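A minimal sketch of this kind of ranking objective, with synthetic scores and hypothetical function names (not the authors' implementation): for each point cloud, only the highest-scoring successful grasp must outrank every failed grasp by a margin.

import numpy as np

def per_cloud_ranking_loss(scores_success, scores_failure, margin=1.0):
    """Hinge-style loss that uses only the best successful grasp (the most
    discriminative positive example) against all failed grasps for one
    point cloud. Scores would come from some grasp-quality predictor."""
    best_success = np.max(scores_success)
    violations = margin - (best_success - np.asarray(scores_failure, dtype=float))
    return float(np.sum(np.maximum(0.0, violations)))

# Example with made-up predictor scores for one point cloud:
print(per_cloud_ranking_loss(scores_success=[0.4, 1.8, 0.9],
                             scores_failure=[0.2, 1.1, 1.5]))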
14:55 - 15:20 Invited Talk: Serena Ivaldi
Title: Grasping, vision and interaction for object manipulation with iCub
Abstract: In this talk I will present how we approached the problem of object manipulation for iCub from three different angles: first, by making the robot learn the visual appearance of objects by combining vision and action; second, by improving autonomous grasping through explicitly considering the uncertainty in the object's localisation induced by the noisy cameras; third, by leveraging human-robot physical interaction, which enables even non-experts to teach the robot how to assemble a two-part object.
15:20 - 16:00 Coffee and Posters
16:00 - 16:25 Invited Talk: Alberto Rodriguez
Title: Feedback Control of the Pusher-Slider System: A Story of Hybrid and Underactuated Contact Dynamics
16:25 - 16:50 Invited Talk: Maximo Roa
Title: Improving grasp robustness through planning and compliant control
Abstract: The talk will discuss two main aspects of achieving grasp robustness. On the planning side, the use of physically meaningful quality measures for grasp planning will be discussed. On the control side, the theoretical framework and experimental evaluation of passivity-based controllers for multi-fingered hands will be presented. The combination of the two aspects yields robust grasps that cope with positional uncertainty of the object.