The iCub received considerable funding through external agencies, primarily from the EU Commission "Robotics and AI" and the Future and Emerging Technologies (FET) programs [https://ec.europa.eu/digital-single-market/en/future-emerging-technologies-fet]. Collaborative projects helped define and shape our large network of scientists working with the iCub on themes related to humanoid robotics, cognition and embodied artificial intelligence.
The proposed project aims at taking a fundamental step in this direction: developing robots able to coordinate their behaviors with those of their human partners. Today this represents a frontier issue, especially for HRI and social robotics, which see human-robot "behavioral coordination" as a sine qua non condition for integrating robots into our social context(s).
ANDY leverages these technologies and strengthens the European leadership by endowing robots with the ability to control physical collaboration through intentional interaction. These advances require progress along three main directions: measuring, modeling and helping humans engaged in intentional collaborative physical tasks.
The RobMoSys project envisions an industry-grade component development and integration ecosystem that goes beyond the current limitations of software frameworks like ROS and YARP, and that is able to support the expectations of evolving robotics technology and applications. Consumer robotics for assistive living represents a promising subfield, one with a relatively short expected time to mass market, and one which could currently benefit the most from the RobMoSys approach.
The CHRIS project will address the fundamental issues which would enable safe Human Robot Interaction (HRI). Specifically, this project addresses the problem of a human and a robot performing co-operative tasks in a co-located space, such as in the kitchen where your service robot stirs the soup as you add the cream. These issues include communication of a shared goal (verbally and through gesture), perception and understanding of intention (from dextrous and gross movements), the cognition necessary for interaction, and active and passive compliance.
CoDyCo aims at advancing the current understanding of control and cognition for robust, goal-directed whole-body motion interaction with multiple contacts. CoDyCo will go beyond traditional approaches by: (1) proposing methodologies for performing coordinated interaction tasks with complex systems; (2) combining planning and compliance to deal with predictable and unpredictable events and contacts; (3) validating theoretical advances in real-world interaction scenarios.
The visually impaired and the elderly, who often suffer from mild speech and/or motor disabilities, experience a significant and increasing barrier in accessing ICT technology and services. Yet, in order to participate in a modern, interconnected society that relies on ICT for handling everyday issues, these user groups too clearly need access to ICT, in particular to mobile platforms such as tablet computers or smartphones. The proposed project aims at developing and exploiting the recently matured and quickly advancing biologically-inspired technology of event-driven, compressive sensing (EDC) of audio-visual information to realize a new generation of low-power multi-modal human-computer interfaces for mobile devices.
The ergoCub project aims at developing the next generation of humanoid robots and wearable technologies to minimize the biomechanical risks to future workers in healthcare and industry environments while maximizing psychosocial acceptability.
The Experimental Functional Android Assistant (EFAA) project will contribute to the development of socially intelligent humanoids by advancing the state of the art in both single human-like social capabilities and in their integration in a consistent architecture. The EFAA project proposes a biomimetic, brain-inspired approach. The central assumption of EFAA is that social robots must develop a sense of self in order to overcome the fundamental problem of social inference. Only by possessing the core aspects of a human-like self can inferences about others be made through analogy.
The EFAA project will apply and benchmark its cognitive architecture on a mobile humanoid assistant based on the iCub platform.
eMorph is a three-year project funded by the Seventh Framework Programme (FP7) of the European Union. It involves 4 research groups in 3 countries. eMorph is one of several projects funded under the FP7 initiative on Embodied Intelligence. The goal of the eMorph project is to design asynchronous vision sensors with non-uniform morphology, using analog VLSI neuromorphic circuits, and to develop a supporting data-driven asynchronous computational paradigm for machine vision that is radically different from conventional image processing.
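To make the contrast with frame-based processing concrete, here is a minimal, hypothetical Python sketch of the event-driven style such sensors enable: each pixel emits timestamped events only when it sees a change, and computation happens per event rather than per frame. The Event fields and the exponentially decaying time surface are illustrative assumptions, not eMorph's actual sensor interface.

```python
# A minimal sketch of data-driven, asynchronous event processing.
# Event fields and the time-surface feature are illustrative assumptions,
# not the eMorph sensor interface.
from collections import namedtuple
import math

Event = namedtuple("Event", ["x", "y", "t", "polarity"])  # t in microseconds

class TimeSurface:
    """Per-pixel map of the most recent event time, decayed exponentially.

    Each incoming event is processed on arrival; there is no fixed frame
    clock, so computation scales with scene activity, not sensor resolution.
    """

    def __init__(self, width, height, tau_us=50_000.0):
        self.last_t = [[None] * width for _ in range(height)]
        self.tau_us = tau_us

    def update(self, ev):
        self.last_t[ev.y][ev.x] = ev.t

    def value(self, x, y, now):
        t = self.last_t[y][x]
        if t is None:
            return 0.0
        return math.exp(-(now - t) / self.tau_us)

# Usage: feed a stream of events as they arrive from the sensor.
surface = TimeSurface(width=128, height=128)
stream = [Event(10, 12, 100, 1), Event(11, 12, 180, -1), Event(10, 12, 90_000, 1)]
for ev in stream:
    surface.update(ev)                    # O(1) work per event
print(surface.value(11, 12, now=90_000))  # the older event has decayed
```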
What is the ethical, social and legal impact of the use of Artificial Intelligence, Robotics or Big Data in the Public Sector, and how should we deal with it? How can we correctly evaluate and govern the ethical implications of the adoption of disruptive technologies (DTs) by Public Administrations and public service providers, taking into account the impact that these technologies can have on citizens' lives? How can we securely manage disruptive technologies and the data they process? Our approach aims at making these tools as practical as possible, supporting PAs in the responsible adoption of DTs through a careful assessment of the trustworthiness of their development and use.
To this end, these tools will be co-designed with public bodies and the involvement of Digital Innovation Hubs (DIHs) and universities.
The HEAP project will provide scientific advancements in benchmarking, object recognition, manipulation and human-robot interaction. We focus on sorting a complex, unstructured heap of unknown objects (resembling nuclear waste consisting of a set of broken, deformed bodies) as an instance of an extremely complex manipulation task. The consortium aims at building an end-to-end benchmarking framework, which includes rigorous scientific methodology and experimental tools for application in realistic scenarios. To improve object recognition and segmentation in cluttered heaps, we will develop new perception algorithms and investigate interactive perception in order to improve the robot's understanding of the scene in terms of object instances, categories and properties.
The HUMOUR project will investigate and develop efficient robot strategies to facilitate the acquisition of motor skills. It will address both the (human) trainee and the (robot) trainer sides, by combining behavioural studies on motor learning and its neural correlates with the design, implementation, and validation of robot agents that behave as optimal trainers, efficiently exploiting the structure and plasticity of the human sensorimotor system. Robot trainers will be validated in the context of motor skill learning and robot-assisted rehabilitation. This will benefit large groups of individuals, by helping professionals, e.g. surgeons, to acquire delicate motor skills; by providing older persons greater access to activities like fitness, sports, and arts, thus ultimately improving their quality of life.
INSTANCE will break new ground in social cognition research by identifying factors that influence the attribution of mental states to others and social attunement with humans or artificial agents. The objectives of INSTANCE are to (1) determine the parameters of others' behaviour that make us attribute mental states to them, (2) explore parameters relevant for social attunement, and (3) elucidate further factors, such as culture and experience, that influence the attribution of mental states to agents and, thereby, social attunement.
The ITALK project aims to develop artificial embodied agents able to acquire complex behavioural, cognitive, and linguistic skills through individual and social learning. This will be achieved through experiments in which the iCub humanoid robot learns to handle and manipulate objects and tools autonomously, to cooperate and communicate with other robots and humans, and to adapt to changing internal, environmental, and social conditions.
State-of-the-art robotic technologies rely on preprogrammed models and largely lack the capacity to adapt to unexpected changes to their bodies or the environment. In order to expand the domain where they can be applied, they need to respond more autonomously, flexibly and robustly. We use the iCub humanoid robot, recently equipped with whole-body tactile sensing, to address these challenges by developing new methods to (i) autonomously acquire and adapt models of the robot's complete body; (ii) move in unknown cluttered environments using whole-body awareness, thus preserving the safety of the robot and of other people.
The goal of the KoroiBot project is to enhance the ability of humanoid robots to walk in a dynamic and versatile fashion in the way humans do. Research and innovation work in KoroiBot will mainly target novel motion control methods for existing hardware, but it will also derive optimized design principles for next-generation robots. By doing so, KoroiBot addresses the ambitious goals set for the humanoid robots of the 21st century, which are supposed to work alongside and replace humans in, e.g., households, disaster sites or space missions, but which still lack the very fundamental ability to walk in a human-like fashion. Compared to all the intelligence that humanoids have to acquire to perform these tasks, the demand for improved walking performance seems simple, but it is in fact very challenging, and the motion abilities of contemporary humanoids are still far behind their human role models.
The NEUROTECH project aims to (1) create a comprehensive online platform that integrates information about all involved parties and the state of the art, and showcases the latest technology advancements, strengthening the NCT community and increasing its impact and visibility; (2) actively reach out to industry stakeholders to get them involved in shaping future research directions and to facilitate uptake of existing technology; (3) promote public interest in NCT by showcasing key technologies; and (4) shape educational resources in this interdisciplinary field targeting students with different backgrounds.
NeuTouch aims at improving artificial tactile perception in robots and prostheses by understanding how to best extract, integrate and exploit tactile information at the system level. To this aim, NeuTouch will train young researchers who will build up a novel multidisciplinary community and tackle fundamental questions about the neural encoding of tactile information, developing computational tools and models capable of explaining the activity in the biological neural pathway and linking it to behavioural decisions.
Reproducing an act with sensorimotor means and using fine natural language to communicate the intentionality behind the act is what Aristotle called "Poetics". POETICON is a research and development project that explores the "poetics of everyday life", i.e. the synthesis of sensorimotor representations and natural language in everyday human interaction. This relates to an old problem in Artificial Intelligence, how meaning emerges, which is approached here in a new way.
Handling novel situations beyond learned schemas or set behaviours remains an open challenge in engineering cognitive and embodied systems. Powerful generalisation mechanisms are necessary for any agent to operate effectively in real-world environments. POETICON++ suggests that natural language can be used as a learning tool for the generalisation of learned behaviours and perceptual experiences and for the generation of new behaviours and experiences (creativity). The main objective of POETICON++ is the development of a computational mechanism for such generalisation of motor programs and visual experiences for robots. To this end, it will integrate natural language and visual action/object recognition tools with motor skills and learning abilities.
RobotCub is our flagship project; IIT became a partner in 2007 and the main actor in the development of the humanoid robot. RobotCub is a 5-year project funded by the European Commission through Unit E5 "Cognitive Systems, Interaction & Robotics" (now simply "Robotics"). Our main goal is to study cognition through the implementation of a humanoid robot the size of a 3.5-year-old child: the iCub. This is an open project in many different ways: we distribute the platform openly, we develop the software open-source, and we are open to including new partners and forming collaborations worldwide.
The ROBOT-DOC proposal aims at the establishment of an open, multi-national doctoral training network for interdisciplinary training in developmental cognitive robotics. Developmental robotics is a novel approach to the design of cognitive robots that takes direct inspiration from developmental mechanisms studied in children. The ROBOT-DOC Fellows will develop a balanced mix of domain-specific robotics research skills and complementary transferable skills for careers in academia and industry.
Roboskin will develop and demonstrate a range of new robot capabilities based on tactile feedback from robot skin covering large areas of the robot body. Until now, investigation of these issues has been limited by the lack of tactile sensing technologies enabling large-scale experimental activities, since skin technologies and embedded tactile sensors have so far mostly been demonstrated only at the prototype stage. The new capabilities will improve the ability of robots to operate effectively and safely in unconstrained environments and also their ability to communicate and co-operate with each other and with humans.
Recently, humanoids have started to be employed in linguistic investigations with the dual aim of endowing robots with the capability to communicate and interact with humans, and of further understanding the mechanisms underlying language development. The proposed research aims at creating a developmental cognitive model for the iCub humanoid robot that grounds the meaning of language in tool affordances. While it is firmly established that language has to be grounded in sensorimotor experience, it has recently become increasingly evident that it is important to go beyond simple sensorimotor grounding.
The goal of SCOPE is to contribute to the RobMoSys ecosystem by proposing methods and tools to enable the assessment of system-wide safety properties at the behavioral level (the “deliberative layer”) where safe autonomy becomes the key challenge. With reference to the RobMoSys meta-model for robotic behavior, the goal of SCOPE is to provide tools that analyze and derive properties of a task by composing the properties that describe its skills and the environment, and, at runtime, ensure the correct execution of a task by monitoring it and propagating anomalies detected at the level of the skills.
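As a rough illustration of these two ideas, the hypothetical Python sketch below derives task-level properties by composing declared skill properties (a worst-case speed bound, a collision-freedom flag) and raises an anomaly when a skill violates its declared bound at runtime. The Skill fields and the composition rules are illustrative assumptions, not the actual RobMoSys/SCOPE meta-model.

```python
# A hypothetical sketch of property composition and runtime monitoring.
# Skill, max_tcp_speed and the conjunction/worst-case rules are
# illustrative assumptions, not the RobMoSys/SCOPE meta-model.
from dataclasses import dataclass

@dataclass
class Skill:
    name: str
    max_tcp_speed: float   # declared bound, m/s
    collision_free: bool   # declared safety property

def compose(skills):
    """Derive task-level properties from the skills composing the task:
    the task's speed bound is the worst (largest) skill bound, and the
    task is collision-free only if every skill is."""
    return {
        "max_tcp_speed": max(s.max_tcp_speed for s in skills),
        "collision_free": all(s.collision_free for s in skills),
    }

def monitor(skill, observed_speed):
    """Runtime check: propagate an anomaly upward if a skill violates
    its own declared property while executing."""
    if observed_speed > skill.max_tcp_speed:
        raise RuntimeError(
            f"anomaly in skill '{skill.name}': "
            f"{observed_speed:.2f} m/s exceeds bound {skill.max_tcp_speed:.2f}"
        )

task = [Skill("approach", 0.5, True), Skill("grasp", 0.1, True)]
print(compose(task))                   # {'max_tcp_speed': 0.5, 'collision_free': True}
monitor(task[1], observed_speed=0.08)  # ok; 0.3 would raise the anomaly
```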
The EU-funded SOFTMANBOT project will provide an innovative and holistic robotic system for the handling of flexible and deformable materials within labour-intensive production processes. Its system will include smart dexterous grippers that will enable grasping and manipulation skills with integrated sensors (mainly tactile). The project will test the technology in industrially relevant environments in four key manufacturing sectors (toy, textile, footwear and tyre).
The SECURE project aims to train a new generation of researchers on safe cognitive robot concepts for human work and living spaces, using the most advanced humanoid robot platforms available in Europe. The fellows will be trained through an innovative concept of project-based and constructivist learning in supervised peer networks, where they will gain experience from an intersectoral programme involving universities, research institutes, and large and SME companies from the public and private sectors. The training domain will integrate multidisciplinary concepts from the fields of cognitive human-robot interaction, computer science and intelligent robotics, where a new approach integrating principles of embodiment, situation and interaction will be pursued to address future challenges for safe human-robot environments.
TACMAN addresses the key problem of developing an information processing and control technology that enables robot hands to exploit tactile sensitivity and thus become as dexterous as human hands. The availability of the required technology now allows us to considerably advance in-hand manipulation. TACMAN's goal is to develop fundamentally new approaches that can replace manual labour under inhumane conditions, by endowing robots with such tactile manipulation abilities through the transfer of insights from human neuroscientific studies into machine learning algorithms.
The European Robotics Research Infrastructures Network (TERRINet) project aims at building a world-class network with harmonised services and complementary capabilities, where talented researchers from academia and industry worldwide will have access and will be able to explore new ideas and establish personal and joint projects; get in contact with and be inspired by leading and creative scientists, technologists, experts and industrial representatives; and share information and gain knowledge to boost their scientific research and potential for technological innovation.
The WALK-MAN project aims to develop a humanoid robot that can operate in buildings that were damaged following natural and man-made disasters. The robot will demonstrate new skills: dextrous, powerful manipulation (e.g. turning a heavy valve or lifting collapsed masonry); robust balanced locomotion (walking, crawling over a debris pile); and physical sturdiness (e.g. operating conventional hand tools such as pneumatic drills or cutters). In addition, the robot will have sufficient cognitive ability to operate autonomously or under tele-operation when severe communication limitations (due to limited channel bandwidth and/or reliability) constrain remote control. The robot will show human levels of locomotion, balance and manipulation and operate outside the laboratory environment. Disaster sites may include buildings such as factories, offices and houses.
The What You Say Is What You Did project (WYSIWYD) will create a new transparency in human robot interaction (HRI) by allowing robots to understand both their own actions and those of humans, and to interpret and communicate these in human-compatible intentional terms, expressed as a language-like communication channel we call WYSIWYD Robotese (WR). WYSIWYD will advance this critical communication channel following a biologically and psychologically grounded developmental perspective, allowing the robot to acquire, retain and express WR dependent on its individual interaction history.
Structural bootstrapping is a method of building generative models, leveraging existing experience to predict unexplored action effects and to focus the hypothesis space for learning novel concepts. Xperience will demonstrate that state-of-the-art enactive systems can be significantly extended by using structural bootstrapping to generate new knowledge. This process is founded on explorative knowledge acquisition and subsequently validated through experience-based generalization. Xperience will implement, adapt, and extend a complete robot system for automating introspective, predictive, and interactive understanding of actions and dynamic situations. Xperience will evaluate and benchmark this approach on existing state-of-the-art humanoid robots, integrating the different components into a complete system that can interact with humans.