Manufacturing the future: robots learn food handling

Katherine Johnson and Naresh Marturi at KUKA Robotics describe some of the latest developments in automated, robotic food manufacturing.

Food manufacturers are continually exploring ways of improving processing throughput to meet increasing demand for food in a highly competitive industry. Automated processes are often preferred to manual ones, not only as a means of improving speed and reducing costs, but also to protect the environment in which food is picked, packed and dispatched to retailers.

In some food storage environments, which are unsuitable for humans to occupy for long periods of time, for example sub-zero freezer warehouses, automated solutions are necessary to ensure that operations can continue without interruption.

Historically, the processing and production of food items has been conducted by people. However, process interruption can be caused by sickness, holidays and other human needs. The introduction of an automated cell for process implementation can increase reliability and speed leading to improved productivity. It can also enhance cleanliness and hygiene. In addition, the engagement of fewer employees in manual tasks mitigates a company’s exposure to Health and Safety issues, such as injury or Repetitive Strain Injury (RSI) claims.

Automated cells are better able to manage repetitive processes and can be programmed to continuously repeat a task with high dexterity, if necessary around the clock. This can offer cost savings to the manufacturer and eliminate the high expense of multiple shifts, ensuring production output is maintained.

"The introduction of an automated cell for process implementation can increase reliability and speed, leading to improved productivity."

Automated robots

Manufacturers of automated robotics have been developing solutions for the food industry; however, there is no ‘one key fits all’ solution. Processes need to become fully automated with high-speed throughput, from upstream processing to end-of-line palletisation. Moreover, the upsurge in demand for bespoke or artisan products means that production runs can be shorter and more diverse, leading to a manufacturing environment where change is a constant and the need for a truly connected factory – as characterised by the uptake of Industry 4.0* – becomes ever more pertinent.

Robots have typically been used in the food industry for the completion of heavy work and laborious logistical tasks, such as lifting, packing and palletising, but advancements in the automated robotics industry are redefining the production line. Costs can be reduced significantly and product output increased through the adoption of automated systems that can work independently. Systems designed to work in direct contact with food, using food-compatible lubricants and stainless steel parts, can handle foodstuffs in environments with temperatures as low as -30°C. Solutions must be able to integrate with existing operations, and capabilities need to cover a range of parameters, such as axes and reach, payload, repeatability, interaction, communication and vision.

"Automated machine-vision systems within the food industry will provide manufacturers with significant benefits."

KR AGILUS HM cell designed for use in the food industry

Machine-vision systems 

Vision and conveyor tracking – the interpretation of the production environment within the AI arena and the decisions that follow – is relatively new to automated robotics and, although still in its infancy, is beginning to take off as a solution. There is a need for robotic vision within the food industry to support processes such as packing, wrapping and sealing. KUKA has been developing algorithms for industrial gripping and manipulation and for real-time decision-making, for example to identify parts or items on a conveyor belt, focusing on object detection, tracking and vision-guided control methodologies.

Developing algorithms for optimising the motions of multiple robot arms has been a key focus at KUKA’s state-of-the-art training facility in Wednesbury. The aim is to pick multiple objects off moving conveyors in minimum time. Software modules and communication drivers are also under development to control the latest intelligent industrial work assistant.
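The multi-arm, minimum-time picking problem can be illustrated with a simple greedy scheduler. This is a hypothetical sketch, not KUKA’s algorithm: it assumes a straight conveyor moving at a known speed, one fixed pick station per arm, and a fixed per-pick cycle time.

```python
from dataclasses import dataclass

BELT_SPEED = 0.2   # m/s, assumed conveyor speed
PICK_CYCLE = 1.5   # s, assumed time an arm needs per pick

@dataclass
class Item:
    x: float  # current position along the belt (m)

def arrival_time(item: Item, station: float) -> float:
    """Time until the item reaches a fixed pick station downstream."""
    return (station - item.x) / BELT_SPEED

def assign_picks(items, stations):
    """Greedy schedule: give each item to the arm that can take it
    soonest, respecting each arm's pick cycle time."""
    free_at = {s: 0.0 for s in stations}   # when each arm is next free
    plan = []
    for item in sorted(items, key=lambda i: -i.x):  # nearest items first
        best = min(stations,
                   key=lambda s: max(arrival_time(item, s), free_at[s]))
        t = max(arrival_time(item, best), free_at[best])
        free_at[best] = t + PICK_CYCLE
        plan.append((item.x, best, round(t, 2)))
    return plan
```

A real cell would also optimise arm motion between picks and re-plan as the vision system reports new items, but the core trade-off – belt arrival time versus arm availability – is the same.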

Exploration of robot hand–eye coordination and movement within the workplace combines gaze control with bimanual movement. Once developed, cells could occupy an established production line, managing processes synonymous with a food packaging environment, operating 24 hours a day, seven days a week, meeting output demand and satisfying health and safety regulations.

One current project, ‘Automatic gaze controller for assisting robot manipulator movement in the workspace,’ is addressing the problem of gaze control for a bi-manual robot consisting of a 7 degrees-of-freedom pan–tilt vision system (robot head) and two KUKA LWRs (lightweight robots). The control framework comprises two components:

1) an adaptive visual tracker capable of tracking an arbitrary object with an unknown trajectory

2) an optimised visual control strategy capable of controlling all joint motions of a redundant head–neck mechanism in order to keep the tracked object at the image centre.

The advantage of such a framework is that it does not require any prior knowledge of the object trajectory and can achieve optimal joint motions, i.e. it minimises the maximum joint motions needed to maintain constant gaze. An adaptive gain has been used with the controller in order to provide fast convergence of the task space error when gazing at an object with unpredictable trajectories. The framework has been validated in real-time under various operating conditions.
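The adaptive-gain idea can be sketched in a few lines. The gain schedule below – high gain near zero error for fast final convergence, lower gain far from the target to avoid overshoot – is a common form in visual-servoing libraries; the constants and the first-order error model are illustrative assumptions, not KUKA’s implementation.

```python
import math

def adaptive_gain(err_norm, gain_zero=4.0, gain_inf=0.5, slope=30.0):
    """Assumed adaptive gain: gain_zero at zero error, decaying
    towards gain_inf as the error norm grows."""
    return (gain_zero - gain_inf) * math.exp(
        -slope * err_norm / (gain_zero - gain_inf)) + gain_inf

def gaze_step(err, dt=0.02):
    """One control step: drive the pixel error (target minus image
    centre, normalised to [-1, 1]) towards zero with an adaptive gain."""
    lam = adaptive_gain(abs(err))
    return err - lam * err * dt   # simple first-order error dynamics

err = 0.8   # normalised horizontal offset of the tracked object
for _ in range(300):
    err = gaze_step(err)
```

Convergence is slow while the object is far from the image centre and accelerates as the error shrinks, which is the behaviour described above: fast task-space convergence without large joint motions.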

Automated machine-vision systems within the food industry will provide manufacturers with significant benefits: improving quality and product processing times, reducing waste and labour costs, and delivering high levels of consistency and speed.

Human/robot collaboration

Human/robot collaboration introduces an element of unreliability. Humans do not replicate tasks as accurately as robots, so a cell needs to be able to adapt to changes in its environment, such as an object not being placed in the same position on a conveyor, a variance in timings, or a change in the angle of an object. Object tracking in real time is one of the major tasks being developed to control the robot’s trajectory automatically. There are algorithms to track different objects, but most of them require prior information about the objects, such as their geometric primitives, texture or model information. Such methods are limited to tracking only particular objects of interest and require active tuning of process parameters to migrate them to other objects. To solve this problem, an adaptive tracker has been developed that is independent of local object features and can reliably track various objects present in ‘the scene’. The tracker has already been tested in different environments and the results have demonstrated the following properties:

1) The cell is capable of tracking multiple moving or stationary objects; the current version can track 62 different objects simultaneously.

2) Real-time operation; the average tracking time per frame is 1.2 milliseconds.

3) Easy to migrate – single-click initialisation of objects.
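A feature-independent tracker of this kind can be caricatured as follows. This is a deliberately minimal, hypothetical sketch, not KUKA’s code: each track keeps only a position estimate and detections are associated frame-by-frame by nearest neighbour, so no object model, texture or geometry is needed.

```python
class Tracker:
    """Minimal multi-object tracker: position-only state,
    nearest-neighbour data association per frame."""

    def __init__(self):
        self.tracks = {}      # track id -> (x, y)
        self._next_id = 0

    def init_track(self, x, y):
        """'Single click' initialisation: seed a track at a position."""
        self.tracks[self._next_id] = (x, y)
        self._next_id += 1

    def update(self, detections, max_dist=50.0):
        """Move each track to its nearest detection, if close enough."""
        for tid, (tx, ty) in self.tracks.items():
            best = min(detections,
                       key=lambda d: (d[0] - tx)**2 + (d[1] - ty)**2,
                       default=None)
            if best and (best[0] - tx)**2 + (best[1] - ty)**2 <= max_dist**2:
                self.tracks[tid] = best
```

A production tracker would add motion prediction and track management, but the sketch shows why such a design migrates easily between object types: nothing in it depends on what the objects look like.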

Colour vision analysis

Many existing machine-vision solutions in the food industry currently operate using greyscale imaging. Developments at KUKA mean that applications currently being refined will instead operate using colour vision analysis. This could be implemented in a production line consisting of multiple elements, especially where simultaneous object tracking is required.
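The benefit of colour over greyscale is easy to demonstrate: two items with clearly different colours can have almost identical grey levels, so a greyscale system cannot separate them while a simple colour check can. The pixel values below are invented for illustration.

```python
def grey(r, g, b):
    """Standard luma approximation used for greyscale conversion."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def dominant_channel(r, g, b):
    """Classify a pixel by its strongest colour channel."""
    return max((r, 'red'), (g, 'green'), (b, 'blue'))[1]

ripe   = (200, 60, 40)    # reddish pixel (hypothetical values)
unripe = (90, 120, 40)    # greenish pixel with a similar grey level
```

Here the two pixels differ in grey level by only a couple of units out of 255, yet their dominant channels separate them immediately – the kind of distinction that matters when sorting or quality-checking produce.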

In human/robot collaboration, object analysis in both real time and colour can also support quality control. When humans are involved in a process that includes repetitive and/or mundane tasks, the occasional error is likely to occur.

Adapting the grasp trajectories of the robotic arm and hand to changing object poses essentially gives the robot the ability to grasp moving, novel objects for which it has no prior model, i.e. objects it does not know are there. The developments are based on a ‘learned generative grasp model’, which generates a set of possible grasp trajectories for a given unknown object. The object pose is estimated by an adaptive 3D pose tracker and used to transform the grasp trajectories into the new object frame; the robot then identifies a path in order to select or manipulate the object.
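Transforming grasp trajectories into a new object frame is, at its core, a rigid-body transform applied to every waypoint. The sketch below is a 2D illustration of that step (the real system works with 3D poses): waypoints expressed in the object’s own frame are mapped into the workspace using the pose reported by the tracker, so the planned grasp follows the moving object.

```python
import math

def transform_trajectory(waypoints, pose):
    """Map grasp waypoints from the object frame into the workspace.

    waypoints: list of (x, y) points in the object's frame
    pose: (x0, y0, theta) -- object position and orientation
          as reported by a (hypothetical) pose tracker
    """
    x0, y0, theta = pose
    c, s = math.cos(theta), math.sin(theta)
    return [(x0 + c * wx - s * wy, y0 + s * wx + c * wy)
            for wx, wy in waypoints]
```

Each time the tracker reports a new pose, the same stored trajectory is re-mapped, which is why the grasp plan itself never needs to know the object’s motion in advance.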

"Developments currently taking place give rise to entirely new capabilities within the food industry: the ability to visualise, pick and place items in real time."


As consumer demand rises and manufacturers increasingly look at ways to reduce costs and remain competitive in a fierce market, the need for robots within the food industry is likely to increase. Today robots play an essential part in the performance of primary packaging tasks, while the developments currently taking place give rise to entirely new capabilities within the food industry: the ability to visualise, pick and place items in real time.

There has been a gradual uptake of automated solutions within the food industry. Advanced manufacturing techniques are going to be essential to the continual development of automated robotics within the industry. Manual operations will remain for certain products or processes, but speed and agility are key.

There is still a need for human intervention in the production, processing and packaging of food items. Current systems are not sufficiently advanced to identify poor quality food items, whether by texture or smell. Human senses are very good at identifying whether an item is unfit for consumption, as opposed to a cell that operates based on colour, value or size. The potential for developing such sensory intelligence may be further explored in the future.

* The smart factory – automation and data exchange in manufacturing technologies, including cyber-physical systems, the Internet of Things and cloud computing.


Katherine Johnson and Naresh Marturi

Tel: 0121 505 9970


Katherine Johnson is a marketing professional supporting KUKA Robotics in the development of its communications strategy.

Dr Naresh Marturi has experience in developing control algorithms and techniques for a wide variety of robots ranging from nano-scale to large scale industrial robots. His major field of research relates to computer vision, where he is involved in developing object detection, tracking and vision-guided control methodologies. At KUKA, Naresh’s main role is to facilitate the transfer of knowledge and expertise in computer vision and robotics from academia to industry.

