Section: Partnerships and Cooperations

European Initiatives

FP7 & H2020 Projects

FP7 Space RemoveDEBRIS

Participants: Eric Marchand, François Chaumette.

  • Instrument: Specific Targeted Research Project

  • Duration: October 2013 - March 2019

  • Coordinator: University of Surrey (United Kingdom)

  • Partners: Surrey Satellite Technology (United Kingdom), Airbus (Toulouse, France and Bremen, Germany), Isis (Delft, The Netherlands), CSEM (Neuchâtel, Switzerland), Stellenbosch University (South Africa).

  • Inria contact: François Chaumette

  • Abstract: A huge amount of debris has progressively been generated since the beginning of the space era. Most of the objects launched into space are still orbiting the Earth, and today these objects and their by-products represent a threat both in space and on Earth. In space, debris leads to collisions and therefore to damage to operational satellites. A credible solution to both issues has emerged over recent years: actively removing heavy debris objects by capturing them and then either disposing of them through destructive re-entry into the Earth's atmosphere or moving them to graveyard orbits. The RemoveDEBRIS project aimed to demonstrate key technologies for active debris removal (ADR) in three main domains by performing in-orbit demonstrations representative of an ADR mission. The specific key technologies demonstrated as part of this project are: (i) capture technologies such as nets and harpoons; (ii) de-orbiting technologies such as electric propulsion and drag augmentation; (iii) proximity rendezvous operations based on vision-based navigation. The technology demonstrations were carried out in orbit using a microsatellite test-bed, a world first. The microsatellite carried the ADR payloads together with two deployable nanosatellites (CubeSats). Through a series of operations, the nanosatellites were ejected, re-captured, inspected and de-orbited, thereby demonstrating the ADR key technologies [16], [8], [7]. Our goal in this long project was to develop and validate model-based tracking algorithms on images acquired during the actual space debris removal mission [47].
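
    To give a concrete flavour of this contribution, the sketch below shows how such a model-based tracker can be set up with ViSP, the group's open-source vision library; the input image, CAD model file, camera intrinsics and initial pose are hypothetical placeholders, not actual mission data:

      #include <visp3/core/vpCameraParameters.h>
      #include <visp3/core/vpHomogeneousMatrix.h>
      #include <visp3/io/vpImageIo.h>
      #include <visp3/mbt/vpMbGenericTracker.h>

      int main()
      {
        vpImage<unsigned char> I;
        vpImageIo::read(I, "debris_frame.png");      // placeholder input frame

        // Placeholder intrinsics (px, py, u0, v0)
        vpCameraParameters cam(800, 800, 320, 240);

        // Edge-based tracking of a known 3D CAD model of the target
        vpMbGenericTracker tracker;
        tracker.setTrackerType(vpMbGenericTracker::EDGE_TRACKER);
        tracker.setCameraParameters(cam);
        tracker.loadModel("target.cao");             // placeholder CAD model

        vpHomogeneousMatrix cMo;                     // camera-to-object pose
        tracker.initFromPose(I, cMo);                // coarse initial pose estimate

        // For each new frame: register the model on the image, update the pose
        tracker.track(I);
        tracker.getPose(cMo);
        return 0;
      }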

H2020 ICT Comanoid

Participants: Fabien Spindler, François Chaumette.

  • Title: Multi-contact Collaborative Humanoids in Aircraft Manufacturing

  • Programme: H2020

  • Duration: January 2015 - February 2019

  • Coordinator: CNRS (LIRMM)

  • Partners: Airbus Group (France), DLR (Germany), Università Degli Studi di Roma La Sapienza (Italy), CNRS (I3S)

  • Inria contact: François Chaumette

  • Abstract: Comanoid investigated the deployment of robotic solutions in well-identified Airbus airliner assembly operations that are laborious or tedious for human workers and for which access is impossible for wheeled or rail-ported robotic platforms. As a solution to these constraints, a humanoid robot was proposed to achieve the described tasks in real use cases provided by Airbus Group. At first glance, a humanoid robotic solution appears extremely risky, since the operations to be conducted are in highly constrained aircraft cavities with non-uniform (cargo) structures. Furthermore, these tight spaces are to be shared with human workers. Recent developments in multi-contact planning and control, however, suggested that this is a much more plausible solution than current alternatives such as a manipulator mounted on a multi-legged base. Indeed, if humanoid robots can efficiently exploit their surroundings in order to support themselves during motion and manipulation, they can ensure balance and stability, move in non-gaited (acyclic) ways through narrow passages, and also increase operational forces by creating closed kinematic chains. Bipedal robots are well suited to narrow environments specifically because they are able to perform manipulation using only small support areas. Moreover, the stability benefits of multi-legged robots that have larger support areas are largely lost when the manipulator must be brought close to, or even beyond, the support borders. Comanoid aimed at assessing clearly how far the state of the art stands from such novel technologies. In particular, the project focused on implementing a real-world humanoid robotics solution using the best of research and innovation. The main challenge was to integrate current scientific and technological advances, including multi-contact planning and control; advanced visual-haptic servoing; perception and localization; human-robot safety; and the operational efficiency of cobotics solutions in airliner manufacturing [21].
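
    As an illustration of the visual servoing building block mentioned above, the classical image-based law v = -lambda L^+ (s - s*) can be written in ViSP along the following lines; the point features and gain are illustrative values, not the controller deployed in the project:

      #include <visp3/core/vpColVector.h>
      #include <visp3/visual_features/vpFeaturePoint.h>
      #include <visp3/vs/vpServo.h>

      int main()
      {
        vpServo task;
        task.setServo(vpServo::EYEINHAND_CAMERA);
        task.setInteractionMatrixType(vpServo::CURRENT);
        task.setLambda(0.4);             // illustrative gain

        vpFeaturePoint s, sd;
        s.buildFrom(0.1, 0.05, 0.8);     // current point feature (x, y, Z), illustrative
        sd.buildFrom(0.0, 0.0, 0.8);     // desired point feature
        task.addFeature(s, sd);

        // 6-DoF camera velocity to send to the robot low-level controller
        vpColVector v = task.computeControlLaw();
        return 0;
      }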

H2020 ICT CrowdBot

Participants: Javad Amirian, Fabien Grzeskowiak, Solenne Fortun, Marie Babel, Julien Pettré, Fabien Spindler.

  • Title: Robot navigation in dense crowds

  • Programme: H2020

  • Duration: Jan 2018 - Jun 2021

  • Coordinator: Inria

  • Partners: UCL (UK), SoftBank Robotics (France), Univ. Aachen (Germany), EPFL (Switzerland), ETHZ (Switzerland), Locomotec (Germany)

  • Inria contact: Julien Pettré

  • Abstract: CROWDBOT will enable mobile robots to navigate autonomously and assist humans in crowded areas. Today's robots are programmed to stop when a human, or any obstacle, is too close, to avoid coming into contact while moving. This prevents robots from entering densely frequented areas and performing effectively in these highly dynamic environments. CROWDBOT aims to fill the gap in knowledge on close interactions between robots and humans during navigation tasks. The project considers three realistic scenarios: 1) a semi-autonomous wheelchair that must adapt its trajectory to unexpected movements of people in its vicinity to ensure that neither its user nor the pedestrians around it are injured; 2) the commercially available Pepper robot, which must navigate in a dense crowd while actively approaching people to assist them; 3) the cuyBot robot, under development, which will adapt to compact crowds, being touched and pushed by people. These scenarios generate numerous ethical and safety concerns, which this project addresses through a dedicated Ethical and Safety Advisory Board that will design guidelines for robots engaging in interaction in crowded environments. CROWDBOT gathers the required expertise to develop new robot capabilities that allow robots to move in a safe and socially acceptable manner. This requires achieving step changes in a) sensing abilities to estimate the crowd motion around the robot, b) cognitive abilities for the robot to predict the short-term evolution of the crowd state, and c) navigation abilities to perform safe motion at close range to people. Through demonstrators and open software components, CROWDBOT will show that safe navigation tasks can be achieved within crowds and will facilitate incorporating its results into mobile robots, with significant scientific and industrial impact. By extending the robot operation field toward crowded environments, we enable possibilities for new applications, such as robot-assisted crowd traffic management.
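
    In its very simplest form, the prediction step (b) can be sketched with a constant-velocity pedestrian model, as below; this is a plain illustration under strong assumptions, not one of the predictors developed in the project:

      #include <vector>

      struct Agent { double x, y, vx, vy; };   // planar position and velocity

      // Naive short-term crowd prediction: propagate each tracked pedestrian
      // with a constant-velocity model over a short horizon.
      std::vector<Agent> predict(const std::vector<Agent> &crowd, double horizon)
      {
        std::vector<Agent> future = crowd;
        for (Agent &a : future) {
          a.x += a.vx * horizon;
          a.y += a.vy * horizon;
        }
        return future;
      }

      // Keep a candidate robot velocity only if it stays clear of every
      // predicted pedestrian position (illustrative safety margin).
      bool isSafe(double rx, double ry, double rvx, double rvy,
                  const std::vector<Agent> &crowd, double horizon, double margin)
      {
        for (const Agent &a : predict(crowd, horizon)) {
          double dx = (rx + rvx * horizon) - a.x;
          double dy = (ry + rvy * horizon) - a.y;
          if (dx * dx + dy * dy < margin * margin)
            return false;
        }
        return true;
      }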

H2020 ICT PRESENT

Participants: Adèle Colas, Alberto Jovane, Claudio Pacchierotti, Julien Pettré.

  • Title: Photoreal REaltime Sentient ENTity

  • Programme: H2020

  • Duration: Sep 2019 - Aug 2022

  • Coordinator: Univ Pompeu Fabra (Spain)

  • Partners: The Framestore Ltd (UK), Cubic Motion Ltd (UK), InfoCert Spa (Italy), Brainstorm Multimedia S.L. (Spain), Creative Workers - Creatieve Werkers VZW (Belgium), Universitaet Augsburg (Germany), Inria (France)

  • Inria contact: Julien Pettré

  • Abstract: PRESENT is a three-year Research and Innovation project to create virtual digital companions (embodied agents) that look entirely naturalistic, demonstrate emotional sensitivity, can establish meaningful dialogue, add sense to the experience, and act as trustworthy guardians and guides in the interfaces for AR, VR and more traditional forms of media.

    There is no higher quality interaction than the human experience when we use all our senses together with language and cognition to understand our surroundings and, above all, to interact with other people. We interact with today's "Intelligent Personal Assistants" primarily by voice; communication is episodic, based on a request-response model. The user does not see the assistant, which does not take advantage of visual and emotional clues or evolve over time. However, advances in the real-time creation of photorealistic computer-generated characters, coupled with emotion recognition, behaviour and natural language technologies, allow us to envisage virtual agents that are realistic in both looks and behaviour; that can interact with users through vision, sound, touch and movement as they navigate rich and complex environments; converse in a natural manner; respond to moods and emotional states; and evolve in response to user behaviour.

    PRESENT will create and demonstrate a set of practical tools, a pipeline and APIs for creating realistic embodied agents and incorporating them in interfaces for a wide range of applications in entertainment, media and advertising.

H2020 FET-OPEN H-Reality

Participants: Claudio Pacchierotti, Paolo Robuffo Giordano, François Chaumette.

  • Title: Mixed Haptic Feedback for Mid-Air Interactions in Virtual and Augmented Realities

  • Programme: H2020

  • Duration: October 2018 - September 2021

  • Coordinator: Univ. Birmingham (UK)

  • Partners: Univ. Birmingham (UK, coordinator), CNRS (France), TU Delft (NL), Ultrahaptics (UK) and Actronika SAS (France)

  • CNRS contact: Claudio Pacchierotti

  • Abstract: Digital content today remains focused on visual and auditory stimulation. Even in the realm of VR and AR, sight and sound remain paramount. In contrast, methods for delivering haptic (sense of touch) feedback in commercial media are significantly less advanced than graphical and auditory feedback. Yet without a sense of touch, experiences ultimately feel hollow, virtual realities feel false, and human-computer interfaces become unintuitive. Our vision is to be the first to imbue virtual objects with a physical presence, providing a revolutionary, untethered, virtual-haptic reality: H-Reality. The ambition of H-Reality will be achieved by integrating the commercial pioneers of ultrasonic "non-contact" haptics, state-of-the-art vibrotactile actuators, novel mathematical and tribological modelling of the skin and mechanics of touch, and experts in the psychophysical rendering of sensation. The result will be a sensory experience where digital 3D shapes and textures are made manifest in real space via modulated, focused ultrasound, ready for the untethered hand to feel; where next-generation wearable haptic rings provide directional vibrotactile stimulation, informing users of an object's dynamics; and where computational renderings of specific materials can be distinguished via their surface properties. The implications of this technology will be far-reaching. The computer touch-screen will be brought into the third dimension, so that swipe gestures will be augmented with instinctive rotational gestures, allowing intuitive manipulation of 3D data sets and strolling about the desktop as a virtual landscape of icons, apps and files. H-Reality will transform online interactions; dangerous machinery will be operated virtually from the safety of the home, and surgeons will hone their skills on thin air. Rainbow is involved in H-Reality in cooperation with Anatole Lécuyer and Maud Marchal from the Hybrid group.
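
    As a rough illustration of the vibrotactile side only, a wearable actuator is typically driven by a carrier near the skin's peak sensitivity whose amplitude encodes the rendered signal; the sketch below uses placeholder values and is not the H-Reality rendering pipeline:

      #include <cmath>
      #include <vector>

      // Generate an amplitude-scaled 250 Hz carrier (a frequency near the
      // peak sensitivity of the skin's vibration receptors) for a wearable
      // actuator; sample rate and duration are placeholders.
      std::vector<double> renderVibration(double durationS, double amplitude)
      {
        const double fs = 8000.0;       // sample rate (Hz)
        const double carrier = 250.0;   // carrier frequency (Hz)
        const double pi = 3.14159265358979323846;
        std::vector<double> samples;
        for (double t = 0.0; t < durationS; t += 1.0 / fs)
          samples.push_back(amplitude * std::sin(2.0 * pi * carrier * t));
        return samples;
      }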

Collaborations in European Programs, Except FP7 & H2020

Interreg Adapt

Participants: Nicolas Le Borgne, Marie Babel.

  • Programme: Interreg VA France (Channel) England

  • Project acronym: Adapt

  • Project title: Assistive Devices for empowering disAbled People through robotic Technologies

  • Duration: Jan 2017 - Jun 2021

  • Coordinator: ESIGELEC/IRSEEM Rouen

  • Other partners: INSA Rennes - IRISA, LGCGM, IETR (France), Université de Picardie Jules Verne - MIS (France), Pôle Saint Hélier (France), CHU Rouen (France), Réseau Breizh PC (France), Pôle TES (France), University College London - Aspire CREATE (UK), University of Kent (UK), East Kent Hospitals University NHS Foundation Trust (UK), Health and Europe Centre (UK), Plymouth Hospitals NHS Trust (UK), Canterbury Christ Church University (UK), Kent Surrey Sussex Academic Health Science Network (UK), Cornwall Mobility Center (UK).

  • Abstract: This project aims to develop innovative assistive technologies to support the autonomy and enhance the mobility of power wheelchair users with severe physical/cognitive disabilities. In particular, the objective is to design and evaluate a power wheelchair simulator as well as a multi-layer driving assistance system.
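
    At its simplest, such a multi-layer assistance can be seen as blending the user's joystick command with a corrective command computed by the assistance layer; the sketch below illustrates this shared-control idea with hypothetical names and gains:

      struct Velocity { double linear, angular; };

      // Shared control: alpha = 1 gives full user authority, alpha = 0 full
      // assistance; in practice the assistance layer would modulate alpha,
      // e.g. with the proximity of obstacles (assumption for illustration).
      Velocity blend(const Velocity &user, const Velocity &assist, double alpha)
      {
        Velocity out;
        out.linear  = alpha * user.linear  + (1.0 - alpha) * assist.linear;
        out.angular = alpha * user.angular + (1.0 - alpha) * assist.angular;
        return out;
      }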

Collaborations with Major European Organizations

ANR Opmops

Participants: Florian Berton, Julen Bruneau, Julien Pettré.

  • Programme: ANR

  • Project acronym: Opmops

  • Project title: Organized Pedestrian Movement in Public Spaces: Preparation and Crisis Management of Urban Parades and Demonstration Marches with High Conflict Potential

  • Duration: June 2017 - June 2020

  • Coordinator: Université de Haute Alsace (for France), Technische Universität Kaiserslautern (for Germany)

  • Other partners: Gendarmerie Nationale, Hochschule München, ONHYS S.A.S, Polizei Rheinland-Pfalz, Universität Koblenz-Landau, VdS GmbH

  • Abstract: This project addresses parades of highly controversial groups and political demonstration marches that are considered a major threat to urban security. Because urban parades and demonstration marches (abbreviated in the following as UPM) move through large parts of cities, with the resulting dynamics in space and time, it is particularly difficult for forces of civil security (abbreviated in the following as FCS) to guarantee safety at these types of urban events without endangering one of the most important indicators of a free society. In this project, partners representing the FCS (police and industry) cooperate with researchers from academic institutions to develop a decision support tool that can help them both in the preparation phase and in crisis management situations of UPMs. Specific technical issues that the French-German consortium has to tackle include the following: optimization methods to plan UPM routes, transportation to and from the UPM, and location and personnel planning of FCS; control of UPMs using stationary and moving cameras; and simulation methods, including their visualization, with specific emphasis on social behavior.

iProcess

Participants: Agniva Sengupta, François Chaumette, Alexandre Krupa, Eric Marchand, Fabien Spindler.

  • Project acronym: i-Process

  • Project title: Innovative and Flexible Food Processing Technology in Norway

  • Duration: January 2016 - December 2019

  • Coordinator: Sintef Ocean (Norway)

  • Other partners: Nofima, Univ. of Stavanger, NMBU, NTNU (Norway), DTU (Denmark), KU Leuven (Belgium), and about 10 Norwegian companies.

  • Abstract: This project was funded by the Norwegian Government. Its main objective was to develop novel concepts and methods for flexible and sustainable food processing in Norway. Within this project, the Rainbow group was involved in visual tracking and visual servoing of generic and potentially deformable objects (see Section 6.1.1 and Section 6.1.2). This year, we published [52], [53] in the scope of this project.

GentleMAN

Participants: Alexandre Krupa, Eric Marchand, François Chaumette, Fabien Spindler.

  • Project acronym: GentleMAN

  • Project title: Gentle and Advanced Robotic Manipulation of 3D Compliant Objects

  • Duration: August 2019 - December 2023

  • Coordinator: Sintef Ocean (Norway)

  • Other partners: NTNU (Norway), NMBU (Norway), MIT (USA) and QUT (Australia).

  • Abstract: This project is funded by the Norwegian Government. Its main objective is to develop a novel learning framework that uses visual, force and tactile sensing to build new multi-modal learning models, interfaced with the underlying robot control, enabling robots to learn new and advanced skills for the manipulation of 3D compliant objects. Within this project, the Rainbow group is involved in developing new approaches for visual tracking of deformable objects, active vision, and visual servoing for deforming soft objects into desired shapes.