Sestosenso – Physical Cognition for intelligent control and safe human-robot interaction

Project Title: Sestosenso – Physical Cognition for intelligent control and safe human-robot interaction

Project Duration:

Project Description: Sestosenso develops technologies for the next generation of collaborative robots, capable of self-adapting to different time-varying operational conditions and of switching safely and smoothly from autonomous to interactive operation when human intervention is required, either for collaboration or for training/teaching. The project proposes a new sensing technology spanning from the hardware level up to the cognitive perception and control levels, based on networks of proximity and tactile sensors embedded in the robot body. These provide a unified proxy-tactile perception of the environment, which is required to control the robot's actions and interactions safely and autonomously. Within the project, the same technologies are also applied to wearable devices (such as exoskeletons) to give the user better spatial awareness and to enforce safety in critical human-robot interactive tasks.
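For illustration only, the minimal sketch below shows how readings from proximity and tactile skin cells might be fused into a single speed-scaling signal for a robot controller; the names, thresholds, and fusion rule are hypothetical assumptions and are not taken from the SESTOSENSO design.

```python
from dataclasses import dataclass

# Illustrative sketch only: a toy fusion of proximity and tactile readings
# into one "proxy-tactile" safety signal. All names and thresholds here are
# hypothetical and do not come from the SESTOSENSO project.

@dataclass
class SkinCellReading:
    proximity_m: float       # distance to the nearest obstacle seen by this cell (metres)
    contact_force_n: float   # normal force measured by the tactile layer (newtons)

def speed_scale(readings: list[SkinCellReading],
                slow_distance_m: float = 0.5,
                stop_distance_m: float = 0.1,
                max_contact_force_n: float = 5.0) -> float:
    """Return a velocity scaling factor in [0, 1]: 1.0 = full speed, 0.0 = stop."""
    # Any unexpected contact above the force limit forces an immediate stop.
    if any(r.contact_force_n > max_contact_force_n for r in readings):
        return 0.0
    # Otherwise scale speed with the closest proximity reading on the skin.
    closest = min((r.proximity_m for r in readings), default=float("inf"))
    if closest <= stop_distance_m:
        return 0.0
    if closest >= slow_distance_m:
        return 1.0
    # Linear ramp between the stop and slow-down distances.
    return (closest - stop_distance_m) / (slow_distance_m - stop_distance_m)

# Example: one cell sees an obstacle at 30 cm, no contact yet -> roughly half speed.
print(speed_scale([SkinCellReading(proximity_m=0.3, contact_force_n=0.0)]))
```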

  • Use case 1: COBOT – Worker cooperative assembly: Vehicle assembly involves workers performing several tasks characterized by widely different postures, workloads, and complexity, often requiring exoskeletons to reduce the biomechanical load. The process can also be improved by COBOTs that either support heavy parts, such as the roof, or pick components and tools for the workers. In this context, the concomitant use of exoskeletons, which enlarge the workers' bodies, is a possible cause of collisions with other workers, and possibly with the COBOT, inside the cockpit; this limits the workers' movements and slows down the process. Moreover, the COBOTs' assistance in holding the roof in position and their motions for the pick-and-delivery of components create a complex, dynamically changing environment in which COBOTs, exoskeletons, and workers need to cooperate safely and efficiently. Use Case 1 will include innovative artificial intelligence for control strategies and adequate sensorization of both COBOTs and exoskeletons.
  • Use case 2: Dual-arm handling of large objects: Use Case 2 introduces the robotic handling and manipulation of large and bulky objects in a warehouse setting. The challenge is provided by Ocado Group and is inspired by the need to pick and pack products in an online grocery fulfilment centre. Large, bulky, and often heavy objects cannot be manipulated or lifted with a single robot arm, so this use case introduces a bi-manual robotic manipulation setup equipped with a sensing skin developed by the SESTOSENSO consortium. Towards the end of the project, the research outcomes will be demonstrated in the manipulation and handling of large objects, taking into account the items' different geometric and dynamic properties and exploiting the enhanced perception and control capabilities developed by the members of the consortium.
  • Use case 3: Agricultural harvesting via wearable devices and collaborative mobile manipulators: This case study refers to the agriculture industry, where a worker has to pick and store fruit from a plant during harvesting. In particular, it focuses on grape harvesting in (hillside) vineyards, a common task in central Europe. Agriculture is indeed one of the most hazardous sectors in terms of biomechanical overload: musculoskeletal disorders are caused mainly by manual handling, heavy physical work, awkward postures, and repetitive movements, and these conditions are further aggravated in hilly or mountain environments. Automation and assistance systems can support farmers in improving physical ergonomics. Use Case 3 envisions a system in which a worker equipped with a passive, sensorized exoskeleton interacts with an autonomous mobile manipulator that acts as a physical assistance system. The collector (a small basket), usually held by the worker while picking grapes, is instead held by the manipulator, which positions and supports it appropriately in the workspace. The worker's main physical features and work conditions are used as data to reduce biomechanical loads.

Contact person: Arvind Gurusekaran