Section: Partnerships and Cooperations

European Initiatives

FP7 & H2020 Projects

IMAGINE

Participants: Maud Marchal [contact], Thierry Gaugry, Romain Lagneau, Antonin Bernardin.

  • Title: IMAGINE - Robots Understanding Their Actions by Imagining Their Effects

  • Program: H2020

  • Duration: January 2017 - December 2020

  • Coordinator: Univ. Innsbruck (Austria)

  • Partners:

    • Univ. Innsbruck (Austria)

    • Univ. Göttingen (Germany)

    • Karlsruhe Institute of Technology (Germany)

    • INSA Rennes (France)

    • Institute of Robotics and Industrial Informatics (Spain)

    • Univ. Bogazici (Turkey)

    • Electro Cycling (Germany)

  • Inria contact: Maud Marchal

  • Abstract: Today's robots are good at executing programmed motions, but they do not understand their actions in the sense that they could automatically generalize them to novel situations or recover from failures. IMAGINE seeks to enable robots to understand the structure of their environment and how it is affected by their actions. "Understanding" here means the ability of the robot (a) to determine the applicability of an action along with parameters to achieve the desired effect, and (b) to discern to what extent an action succeeded, and to infer possible causes of failure and generate recovery actions. The core functional element is a generative model based on an association engine and a physics simulator. "Understanding" is given by the robot's ability to predict the effects of its actions, before and during their execution. This allows the robot to choose actions and parameters based on their simulated performance, and to monitor their progress by comparing observed to simulated behavior. This scientific objective is pursued in the context of recycling of electromechanical appliances. Current recycling practices do not automate disassembly, which exposes humans to hazardous materials, encourages illegal disposal, and creates significant threats to the environment and health, often in third countries. IMAGINE will develop a TRL-5 prototype that can autonomously disassemble prototypical classes of devices, generate and execute disassembly actions for unseen instances of similar devices, and recover from certain failures. For robotic disassembly, IMAGINE will develop a multi-functional gripper capable of multiple types of manipulation without tool changes. IMAGINE raises the ability level of robotic systems in core areas of the work programme, including adaptability, manipulation, perception, decisional autonomy, and cognitive ability. Since only one-third of EU e-waste is currently recovered, IMAGINE addresses an area of high economic and ecological impact.

H-REALITY

Participants: Anatole Lécuyer, Maud Marchal [contact], Thomas Howard, Gerard Gallagher.

  • Title: H-REALITY

  • Program: H2020 - FET Open

  • Duration: 2018 - 2021

  • Coordinator: Univ. Birmingham (UK)

  • Partners:

    • Univ. Birmingham (UK)

    • CNRS (France),

    • TU Delft (Netherlands),

    • ACTRONIKA (France),

    • ULTRAHAPTICS (UK)

  • Inria contact: Maud Marchal

  • Abstract: The vision of H-REALITY is to be the first to imbue virtual objects with a physical presence, providing a revolutionary, untethered, virtual-haptic reality: H-Reality. This ambition will be achieved by integrating the commercial pioneers of ultrasonic "non-contact" haptics, state-of-the-art vibrotactile actuators, novel mathematical and tribological modelling of the skin and mechanics of touch, and experts in the psychophysical rendering of sensation. The result will be a sensory experience where digital 3D shapes and textures are made manifest in real space via modulated, focused ultrasound, ready for the untethered hand to feel, where next-generation wearable haptic rings provide directional vibrotactile stimulation, informing users of an object's dynamics, and where computational renderings of specific materials can be distinguished via their surface properties. The implications of this technology will transform online interactions; dangerous machinery will be operated virtually from the safety of the home, and surgeons will hone their skills on thin air.

TACTILITY

Participants: Ferran Argelaguet [contact], Anatole Lécuyer, Maud Marchal, Sebastian Vizcay.

  • Title: TACTILITY

  • Program: H2020 - ICT 25

  • Duration: July 2019 - June 2022

  • Coordinator: Fundación Tecnalia Research and Innovation (Spain)

  • Partners:

    • Aalborg University (Denmark)

    • Università degli Studi di Genova (Italy),

    • Tecnalia Serbia (Serbia),

    • Universitat de Valencia (Spain),

    • Manus Machinae B.V. (Netherlands),

    • Smartex S.r.l. (Italy),

    • Immersion (France)

  • Inria contact: Ferran Argelaguet

  • Abstract: TACTILITY is a multidisciplinary innovation and research action with the overall aim of incorporating rich and meaningful tactile information into novel interaction systems through technology for closed-loop tactile interaction with virtual environments. By mimicking the characteristics of natural tactile feedback, it will substantially increase the quality of immersive VR experiences used locally or remotely (tele-manipulation). The approach is based on transcutaneous electro-tactile stimulation delivered through electrical pulses with high-resolution spatio-temporal distribution. To achieve this, significant development of technologies for transcutaneous stimulation, textile-based multi-pad electrodes, and tactile-sensing electronic skin, coupled with ground-breaking research on the perception of elicited tactile sensations in VR, is needed. The key novelty is in the combination of: 1) ground-breaking research on the perception of electrotactile stimuli to identify the stimulation parameters and methods that evoke natural-like tactile sensations, 2) advanced hardware that will integrate the novel high-resolution electrotactile stimulation system and state-of-the-art artificial electronic skin patches with smart textile technologies and VR control devices in a wearable mobile system, and 3) novel firmware that handles real-time encoding and transmission of tactile information from virtual objects in VR, as well as from distant tactile sensors (artificial skins) placed on robotic or human hands. The proposed research and innovation action will result in a next generation of interactive systems with a higher-quality experience for both local and remote (e.g., tele-manipulation) applications. Ultimately, TACTILITY will enable a high-fidelity experience through low-cost, user-friendly, wearable, and mobile technology.

Interreg ADAPT

Participants: Valérie Gouranton [contact], Bruno Arnaldi, Ronan Gaugne, Florian Nouviale, Yoren Gaffary, Alexandre Audinot.

  • Program: Interreg VA France (Channel) England

  • Project acronym: ADAPT

  • Project title: Assistive Devices for empowering disAbled People through robotic Technologies

  • Duration: 01/2017 - 06/2021

  • Coordinator: ESIGELEC/IRSEEM Rouen

  • Other partners: INSA Rennes - IRISA, LGCGM, IETR (France), Université de Picardie Jules Verne - MIS (France), Pôle Saint Hélier (France), CHU Rouen (France), Réseau Breizh PC (France), Ergovie (France), Pôle TES (France), University College of London - Aspire CREATE (UK), University of Kent (UK), East Kent Hospitals Univ NHS Found. Trust (UK), Health and Europe Centre (UK), Plymouth Hospitals NHS Trust (UK), Canterbury Christ Church University (UK), Kent Surrey Sussex Academic Health Science Network (UK), Cornwall Mobility Center (UK).

  • Inria contact: Valérie Gouranton

  • Abstract: The ADAPT project aims to develop innovative assistive technologies to support the autonomy and enhance the mobility of power wheelchair users with severe physical or cognitive disabilities. In particular, the objective is to design and evaluate a power wheelchair simulator as well as to design a multi-layer driving assistance system.

Collaboration with the Rainbow team.