QUT researchers building robots that can grip

QUT researchers working on how robots can be taught to grasp objects in real-world situations have received more than $98,000 in funding from Amazon.

The Amazon grant coincides with the publication of a separate study involving QUT researchers on robot grasping.

QUT robotics researcher Professor Peter Corke, founding director of the Australian Centre for Robotic Vision headquartered at QUT, and Research Fellow Dr Jürgen Leitner, received the Amazon award in recognition of their world-leading research into vision-guided robotic grasping and manipulation.

Corke and Leitner led a centre team to first prize at the 2017 Amazon Robotics Challenge in Japan, which came with more than $110,000.


“Real-world manipulation remains one of the greatest challenges in robotics,” said Corke.

“So, it’s exciting and encouraging that Amazon is throwing its support behind our work in this field.”

Leitner, who heads up the Centre’s Manipulation and Vision program, described the Grasping with Intent project, which has been recognised with the Amazon grant, as ambitious and unique.

“While recent breakthroughs in deep learning have increased robotic grasping and manipulation capabilities, the progress has been limited to mainly picking up an object.

“Our focus moves from grasping into the realm of meaningful vision-guided manipulation. In other words, we want a robot to be able to seamlessly grasp an object ‘with intent’ so that it can usefully perform a task in the real world,” said Leitner.

“Imagine a robot that can pick up a cup of tea or coffee, then pass it to you.”

The study, published in Science Robotics by a research team from The BioRobotics Institute of Scuola Superiore Sant’Anna and the Australian Centre for Robotic Vision, reveals guiding principles that govern the choice of grasp type during a human-robot exchange of objects.

The study analysed how people grasp an object when they need to hand it over to a partner.

The researchers investigated grasp choice and hand placement on those objects during a handover, when the receiver must go on to perform a task with them.

In the study, people were asked to grasp a range of objects and then pass them to another person. The researchers looked at how people picked up the objects, including a pen, a screwdriver, a bottle and a toy monkey, how they passed them to another person, and how that person then grasped them.

Passers tend to grasp the purposive part of the objects and leave “handles” unobstructed for the receivers. Intuitively, this choice allows receivers to comfortably perform subsequent tasks with the objects.

The findings will inform the future design of robots that need to grasp objects and pass them to people.
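As a rough illustration of how such a principle might translate into a robot’s grasp selection, the sketch below is a hypothetical example, not code from the Science Robotics study or from the Grasping with Intent project: it scores candidate grasp regions and prefers stable grasps that leave the object’s handle free for the receiver.

```python
# Hypothetical sketch: choosing a handover grasp that leaves the handle free.
# Object regions and scores are invented for illustration only.
from dataclasses import dataclass

@dataclass
class GraspCandidate:
    region: str      # part of the object the gripper would cover, e.g. "handle", "shaft"
    quality: float   # grasp stability score in [0, 1] from a (hypothetical) grasp planner

def select_handover_grasp(candidates, receiver_region="handle"):
    """Prefer stable grasps that do not cover the part the receiver needs."""
    unobstructing = [c for c in candidates if c.region != receiver_region]
    pool = unobstructing or candidates   # fall back if every grasp covers the handle
    return max(pool, key=lambda c: c.quality)

# Example: a screwdriver handover. The passer grasps the shaft,
# leaving the handle unobstructed for the receiver.
screwdriver = [
    GraspCandidate("handle", 0.9),
    GraspCandidate("shaft", 0.7),
    GraspCandidate("blade", 0.5),
]
print(select_handover_grasp(screwdriver).region)   # -> "shaft"
```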

Corke reiterated that real-world manipulation remained one of the greatest challenges in robotics.

“We strive to be the world leader in the research field of visually-guided robotic manipulation.

“While most people don’t think about picking up and moving objects – something human brains have learned over time through repetition and routine – for robots, grasping and manipulation is subtle and elusive,” said Corke.