“Foodly” is an RT Corporation robot that uses computer vision, feedback control and learning to adapt to a range of challenging and novel environments. The robot is the focus of Yi Zhang’s PhD research project titled Data-driven autonomous robotic food handling.
Yi is a student of the EPSRC Centre for Doctoral Training in Agri-Food Robotics: AgriFoRwArdS, which aims to develop robotics in the broad field of agriculture, from crops to food manipulation. The aim of her research project, sponsored by RT Corporation, is to develop new data-driven control algorithms for robotic manipulation.
Using on-board 3D cameras located in its head and chest, Foodly is able to recognise different types of food in real time. Its two robotic arms allow it to perform tasks simultaneously, such as picking up, moving and placing challenging food items of various sizes and shapes, from vegetables to fried chicken and meatballs.
Yi said: “The central focus of the project is the coordination of perception and control algorithms to achieve reliable data-driven control for manipulation.
“These algorithms must guarantee a minimal level of performance/safety even if the sensing is corrupted, the models are uncertain, and the data is scarce. Why? Because this is the reality of the factory/food handling environment. The uncertainty is high, the environment is constantly changing, there are humans present, and the various tasks require flexibility.
“Foodly is designed to work with people. It can operate 24/7, with improved quality and consistency, safety and reliability. Foodly can share a workspace with humans due to its flexible and compact design, but it can also take on a human’s workload on an assembly production line, for example where workers have been moved to another station to provide extra manpower.”
RT Corporation is one of the industry collaborators of the Observatory for Human-Machine Collaboration (OHMC) – an experimental space sited at the Department of Engineering and dedicated to research in human-machine collaboration. The space is open for use by different University departments, industry and government institutions.
Narges Khadem Hosseini is the Robotics Principal Technician of the OHMC, and she has been involved from early on in the deployment and demonstrations of Foodly and the associated Master of Engineering (MEng) projects.
“The visually-guided robotic system is well-suited for repetitive pick-and-place tasks in food factories,” she said. “Our robotic setup comprises a robotic arm, a conveyor, and a dedicated laptop running software designed to simulate packaging operations. The robot recognises objects on a serving tray, picks them up and neatly places them into food containers that are moving on a serving line.”
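The packaging cycle Narges describes, detecting items on a tray and placing them into containers moving along a conveyor, can be summarised as a simple loop. The sketch below is a minimal illustration under assumed values: the detector stub, the belt speed, the motion time and all function names are hypothetical and do not represent RT Corporation’s actual software.

```python
import time
from dataclasses import dataclass

# Illustrative sketch of a conveyor pick-and-place cycle.
# All names and numbers are assumptions, not RT Corporation's code.

CONVEYOR_SPEED = 0.10   # m/s, assumed constant belt speed
PICK_DURATION = 1.5     # s, assumed duration of one pick-and-place motion

@dataclass
class Detection:
    label: str   # e.g. "meatball", "fried_chicken"
    x: float     # position on the serving tray (m)
    y: float

def detect_items(frame):
    """Stand-in for the 3D-camera food recogniser: returns detected items."""
    return [Detection("meatball", 0.20, 0.05), Detection("broccoli", 0.35, -0.02)]

def container_position_at(t, t0=0.0, x0=0.0):
    """Predict where a moving container will be at time t, given belt speed."""
    return x0 + CONVEYOR_SPEED * (t - t0)

def pick_and_place(item, target_x):
    """Stand-in for the arm motion: pick the item, place it at target_x."""
    print(f"Picking {item.label} at ({item.x:.2f}, {item.y:.2f}) "
          f"-> placing at belt position {target_x:.2f} m")

def packaging_cycle(frame, now):
    for item in detect_items(frame):
        # Aim for where the container will be when the motion finishes,
        # not where it is at the moment of detection.
        target_x = container_position_at(now + PICK_DURATION, t0=now)
        pick_and_place(item, target_x)

packaging_cycle(frame=None, now=time.time())
```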
Narges added: “Our compliance system is optimally designed for rapid adaptation in collaborative environments. We leverage vision-based machine learning technology, allowing our system to quickly and robustly adjust to new scenarios. The robot's speed can be adjusted to suit various tasks, even when containers are closely positioned, and the conveyor is operating at its maximum speed.”
Fulvio Forni, Professor of Control Engineering, is a co-investigator of the AgriFoRwArdS CDT and one of Yi’s supervisors alongside Professor Fumiya Iida.
He said: “We think we can do well by blending data and feedback control (passivity-based control, impedance control) to deliver ‘intelligent’ and safe robot control algorithms.
“Data/learning is needed for flexibility and adaptation during task execution. Feedback control is needed to handle inherent uncertainties and to guarantee safety.
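As a rough illustration of the impedance-control idea Professor Forni mentions, the sketch below makes an end-effector behave like a virtual mass-spring-damper around a desired position. The one-dimensional setting and the gains are illustrative assumptions, not the controller developed in the project.

```python
# Illustrative 1-D impedance controller: the end-effector error is driven to
# behave like a virtual mass-spring-damper. All gains are assumed values.
M_d, D_d, K_d = 1.0, 20.0, 100.0   # virtual inertia, damping, stiffness

def impedance_command(x, x_dot, x_des, x_des_dot=0.0, x_des_ddot=0.0, f_ext=0.0):
    """Acceleration command enforcing M_d*e_ddot + D_d*e_dot + K_d*e = f_ext,
    where e = x - x_des and f_ext is the measured contact force."""
    e = x - x_des
    e_dot = x_dot - x_des_dot
    e_ddot = (f_ext - D_d * e_dot - K_d * e) / M_d
    return x_des_ddot + e_ddot

# Example: end-effector 5 cm from target, at rest, with a light contact force.
print(impedance_command(x=0.05, x_dot=0.0, x_des=0.0, f_ext=0.5))
```

The design choice, compliance rather than rigid position tracking, is what makes contact with uncertain, deformable items and with nearby humans safer: unexpected forces are absorbed by the virtual spring-damper instead of being fought by the controller.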
“The RT Corporation platform is the natural experimental setup/benchmark for the research of this project. RT offers low-cost articulated robotic arms equipped with grasping end-effectors that we plan to develop, control and test for speed, accuracy and reliability.”
Dr Yuki Nakagawa, CEO of RT Corporation, said: “RT Corporation is a leading developer of collaborative factory robots in Japan. We are very pleased to start working with the University of Cambridge on this project.”
Source: cam.ac.uk