2021
Activity report
Project-Team
HYBRID
RNSR: 201322122U
In partnership with:
Institut national des sciences appliquées de Rennes, Université Rennes 1
Team name:
3D interaction with virtual environments using body and mind
In collaboration with:
Institut de recherche en informatique et systèmes aléatoires (IRISA)
Domain
Perception, Cognition and Interaction
Theme
Interaction and visualization
Creation of the Project-Team: 2013 July 01

Keywords

Computer Science and Digital Science

  • A2.5. Software engineering
  • A5.1. Human-Computer Interaction
  • A5.1.1. Engineering of interactive systems
  • A5.1.2. Evaluation of interactive systems
  • A5.1.3. Haptic interfaces
  • A5.1.4. Brain-computer interfaces, physiological computing
  • A5.1.5. Body-based interfaces
  • A5.1.6. Tangible interfaces
  • A5.1.7. Multimodal interfaces
  • A5.1.8. 3D User Interfaces
  • A5.1.9. User and perceptual studies
  • A5.2. Data visualization
  • A5.6. Virtual reality, augmented reality
  • A5.6.1. Virtual reality
  • A5.6.2. Augmented reality
  • A5.6.3. Avatar simulation and embodiment
  • A5.6.4. Multisensory feedback and interfaces
  • A5.10.5. Robot interaction (with the environment, humans, other robots)
  • A6. Modeling, simulation and control
  • A6.2. Scientific computing, Numerical Analysis & Optimization
  • A6.3. Computation-data interaction

Other Research Topics and Application Domains

  • B1.2. Neuroscience and cognitive science
  • B2.4. Therapies
  • B2.5. Handicap and personal assistances
  • B2.6. Biological and medical imaging
  • B2.8. Sports, performance, motor skills
  • B5.1. Factory of the future
  • B5.2. Design and manufacturing
  • B5.8. Learning and training
  • B5.9. Industrial maintenance
  • B6.4. Internet of things
  • B8.1. Smart building/home
  • B8.3. Urbanism and urban planning
  • B9.1. Education
  • B9.2. Art
  • B9.2.1. Music, sound
  • B9.2.2. Cinema, Television
  • B9.2.3. Video games
  • B9.4. Sports
  • B9.6.6. Archeology, History

1 Team members, visitors, external collaborators

Research Scientists

  • Anatole Lécuyer [Team leader, Inria, Senior Researcher, HDR]
  • Fernando Argelaguet Sanz [Inria, Researcher, HDR]

Faculty Members

  • Bruno Arnaldi [INSA Rennes, Professor, HDR]
  • Valérie Gouranton [INSA Rennes, Associate Professor]
  • Melanie Villain [Univ de Rennes I, Associate Professor]

Post-Doctoral Fellows

  • Elodie Bouzbib [Inria, from Dec 2021, Co-supervised with Rainbow]
  • Panagiotis Kourtesis [Inria]
  • Justine Saint-Aubert [Inria, until Oct 2021]

PhD Students

  • Hugo Brument [Univ de Rennes I]
  • Antonin Cheymol [Inria, from Nov 2021]
  • Diane Dewez [Inria, until Nov 2021]
  • Gwendal Fouché [Inria]
  • Vincent Goupil [Vinci Construction, CIFRE]
  • Lysa Gramoli [Orange Labs, CIFRE]
  • Gabriela Herrera Altamira [Inria, Co-supervised with Loria]
  • Emilie Hummel [Inria, from Oct 2021]
  • Salome Le Franc [Centre hospitalier régional et universitaire de Rennes]
  • Tiffany Luong [Institut de recherche technologique B-com, CIFRE, until Jan 2021]
  • Mae Mavromatis [Inria, from Oct 2021]
  • Victor Rodrigo Mercado Garcia [Inria]
  • Yann Moullec [Univ de Rennes I, from Oct 2021]
  • Sebastian Vizcay [Inria]

Technical Staff

  • Alexandre Audinot [INSA Rennes, Engineer]
  • Ronan Gaugne [Univ de Rennes I, Research Engineer, Technical Director of Immersia]
  • Florian Nouviale [INSA Rennes, Engineer]
  • Justine Saint-Aubert [Inria, Engineer, from Oct 2021]

Interns and Apprentices

  • Mehdi Aouichat [INSA Rennes, from Jun 2021 until Aug 2021]
  • Cyril Canillas [Inria, from Apr 2021 until Sep 2021]
  • Julien Cauquis [Inria, from Apr 2021 until Sep 2021]
  • Alexandre Chapin [INSA Rennes, from Jun 2021 until Sep 2021]
  • Pierre Duc-Martin [INSA Rennes, from May 2021 until Sep 2021]
  • Mathieu Godineau [INSA Rennes, from Jul 2021 until Aug 2021]
  • Maud Guillen [Univ de Rennes I, until Oct 2021]
  • Mika Inisan [Inria, from Apr 2021 until Aug 2021]
  • Julien Manson [Inria, from May 2021 until Jul 2021]
  • Tangui Marchand-Guerniou [INSA Rennes, from May 2021 until Sep 2021]
  • Mae Mavromatis [Inria, from Feb 2021 until Jul 2021]
  • Yann Moullec [Inria, from Feb 2021 until Jul 2021]

Administrative Assistant

  • Nathalie Denis [Inria]

Visiting Scientists

  • Flavien Lecuyer [Univ de Strasbourg, Jul 2021]
  • Charlotte Sabaux [Ghent University, Jul 2021]

External Collaborators

  • Guillaume Moreau [École centrale de Nantes, until Jun 2021, HDR]
  • Jean Marie Normand [École centrale de Nantes, until Jun 2021, HDR]

2 Overall objectives

Our research project belongs to the scientific field of Virtual Reality (VR) and 3D interaction with virtual environments. VR systems can be used in numerous applications such as industry (virtual prototyping, assembly or maintenance operations, data visualization), entertainment (video games, theme parks), arts and design (interactive sketching or sculpture, CAD, architectural mock-ups), education and science (physical simulations, virtual classrooms), or medicine (surgical training, rehabilitation systems). A major change that we foresee in the next decade in the field of Virtual Reality relates to the emergence of new paradigms of interaction (input/output) with Virtual Environments (VE).

As of today, the most common way to interact with 3D content remains measuring the user's motor activity, i.e., his/her gestures and physical motions when manipulating different kinds of input devices. However, a recent trend consists in soliciting more movements and more physical engagement of the user's body. We can notably stress the emergence of bimanual interaction, natural walking interfaces, and whole-body involvement. These new interaction schemes bring a new level of complexity in terms of generic physical simulation of potential interactions between the virtual body and the virtual surroundings, and a challenging "trade-off" between performance and realism. Moreover, research is also needed to characterize the influence of these new sensory cues on the resulting feelings of "presence" and immersion of the user.

Besides, a novel kind of user input has recently appeared in the field of virtual reality: the user's mental activity, which can be measured by means of a "Brain-Computer Interface" (BCI). Brain-Computer Interfaces are communication systems which measure a user's electrical cerebral activity and translate it, in real time, into an exploitable command. BCIs introduce a new way of interacting "by thought" with virtual environments. However, current BCIs can only extract a small number of mental states and hence a small number of mental commands. Thus, research is still needed to extend the capacities of BCIs, and to better exploit the few available mental states in virtual environments.

Our first motivation is thus to design novel “body-based” and “mind-based” controls of virtual environments, reaching in both cases more immersive and more efficient 3D interaction.

Furthermore, in current VR systems, motor activities and mental activities are always considered separately and exclusively. This echoes the well-known “body-mind dualism” at the heart of historical philosophical debates. In this context, our objective is to introduce novel “hybrid” interaction schemes in virtual reality, by considering motor and mental activities jointly, i.e., in a harmonious, complementary, and optimized way. Thus, we intend to explore novel paradigms of 3D interaction mixing body and mind inputs. Our approach becomes even more challenging when considering and connecting multiple users, which implies multiple bodies and multiple brains collaborating and interacting in virtual reality.

Our second motivation is thus to introduce a “hybrid approach” which mixes the mental and motor activities of one or multiple users in virtual reality.

3 Research program

The scientific objective of the Hybrid team is to improve 3D interaction of one or multiple users with virtual environments, by making full use of the physical engagement of the body, and by incorporating mental states by means of brain-computer interfaces. We intend to improve each component of this framework individually, but we also want to improve the subsequent combinations of these components.

The "hybrid" 3D interaction loop between one or multiple users and a virtual environment is depicted in Figure 1. Different kinds of 3D interaction situations are distinguished (red arrows, bottom): 1) body-based interaction, 2) mind-based interaction, 3) hybrid and/or 4) collaborative interaction (with at least two users). In each case, three scientific challenges arise which correspond to the three successive steps of the 3D interaction loop (blue squares, top): 1) the 3D interaction technique, 2) the modeling and simulation of the 3D scenario, and 3) the design of appropriate sensory feedback.

Figure 1
Figure 1: 3D hybrid interaction loop between one or multiple users and a virtual reality system. Top (in blue) three steps of 3D interaction with a virtual environment: (1-blue) interaction technique, (2-blue) simulation of the virtual environment, (3-blue) sensory feedbacks. Bottom (in red) different cases of interaction: (1-red) body-based, (2-red) mind-based, (3-red) hybrid, and (4-red) collaborative 3D interaction.

The 3D interaction loop involves various possible inputs from the user(s) and different kinds of output (or sensory feedback) from the simulated environment. Each user can involve his/her body and mind by means of corporal and/or brain-computer interfaces. A hybrid 3D interaction technique (1) mixes mental and motor inputs and translates them into a command for the virtual environment. The real-time simulation (2) of the virtual environment takes these commands into account to change and update the state of the virtual world and virtual objects. The state changes are sent back to the user and perceived by means of different sensory feedbacks (e.g., visual, haptic and/or auditory) (3). The sensory feedbacks close the 3D interaction loop. Other users can also interact with the virtual environment using the same procedure, and can possibly “collaborate” by means of “collaborative interaction techniques” (4).

This description stresses three major challenges which correspond to three mandatory steps when designing 3D interaction with virtual environments:

  • 3D interaction techniques: This first step consists in translating the actions or intentions of the user (inputs) into an explicit command for the virtual environment. In virtual reality, the classical tasks that require such user commands were categorized early on into four types 46: navigating the virtual world, selecting a virtual object, manipulating it, or controlling the application (entering text, activating options, etc). However, the addition of a third dimension, the use of stereoscopic rendering, and the use of advanced VR interfaces make many techniques that proved efficient in 2D inappropriate, and make it necessary to design specific interaction techniques and adapted tools. This challenge is renewed by the various kinds of 3D interaction targeted here: motor and/or cerebral inputs, and potentially multiple users.
  • Modeling and simulation of complex 3D scenarios: This second step corresponds to the update of the state of the virtual environment, in real-time, in response to all the potential commands or actions sent by the user. The complexity of the data and phenomena involved in 3D scenarios is constantly increasing. It corresponds for instance to the multiple states of the entities present in the simulation (rigid, articulated, deformable, fluids, which can constitute both the user’s virtual body and the different manipulated objects), and the multiple physical phenomena implied by natural human interactions (squeezing, breaking, melting, etc). The challenge consists here in modeling and simulating these complex 3D scenarios and meeting, at the same time, two strong constraints of virtual reality systems: performance (real-time and interactivity) and genericity (e.g., multi-resolution, multi-modal, multi-platform, etc).
  • Immersive sensory feedbacks: This third step corresponds to the display of the multiple sensory feedbacks (output) coming from the various VR interfaces. These feedbacks enable the user to perceive the changes occurring in the virtual environment. They close the 3D interaction loop, making the user immersed, and potentially generating a subsequent feeling of presence. Among the various VR interfaces developed so far, we can stress two kinds of sensory feedback: visual feedback (3D stereoscopic images using projection-based systems such as CAVE systems or Head-Mounted Displays); and haptic feedback (related to the sense of touch and to tactile or force-feedback devices). The Hybrid team has a strong expertise in haptic feedback, and in the design of haptic and “pseudo-haptic” rendering 47. Note that a major trend in the community, strongly supported by the Hybrid team, relates to a “perception-based” approach, which aims at designing sensory feedbacks well in line with human perceptual capacities.

These three scientific challenges are addressed differently according to the context and the user inputs involved. We propose to consider three different contexts, which correspond to the three different research axes of the Hybrid research team, namely: 1) body-based interaction (motor input only), 2) mind-based interaction (cerebral input only), and then 3) hybrid and collaborative interaction (i.e., the mixing of body and brain inputs from one or multiple users).

3.1 Research Axes

The scientific activity of Hybrid team follows three main axes of research:

  • Body-based interaction in virtual reality. Our first research axis concerns the design of immersive and effective "body-based" 3D interactions, i.e., relying on a physical engagement of the user’s body. This trend is probably the most popular one in VR research at the moment. Most VR setups make use of tracking systems which measure specific positions or actions of the user in order to interact with a virtual environment. However, in recent years, novel options have emerged for measuring “full-body” movements or other, even less conventional, inputs (e.g., body equilibrium). In this first research axis we are thus concerned with the emergence of new kinds of “body-based interaction” with virtual environments. This implies the design of novel 3D user interfaces and 3D interaction techniques, novel simulation models and techniques, and novel sensory feedbacks for body-based interaction with virtual worlds. It involves real-time physical simulation of complex interactive phenomena, and the design of corresponding haptic and pseudo-haptic feedback.
  • Mind-based interaction in virtual reality. Our second research axis concerns the design of immersive and effective “mind-based” 3D interactions in Virtual Reality. Mind-based interaction with virtual environments makes use of Brain-Computer Interface technology, which corresponds to the direct use of brain signals to send “mental commands” to an automated system such as a robot, a prosthesis, or a virtual environment. BCI is a rapidly growing area of research and several impressive prototypes are already available. However, the emergence of such a novel user input also calls for novel and dedicated 3D user interfaces. This implies studying the extension of the mental vocabulary available for 3D interaction with VEs, then designing specific 3D interaction techniques "driven by the mind" and, last, designing immersive sensory feedbacks that could help improve the learning of brain control in VR.
  • Hybrid and collaborative 3D interaction. Our third research axis intends to study the combination of motor and mental inputs in VR, for one or multiple users. This concerns the design of mixed systems, with potentially collaborative scenarios involving multiple users, and thus multiple bodies and multiple brains sharing the same VE. This research axis therefore involves two interdependent topics: 1) collaborative virtual environments, and 2) hybrid interaction. It should result in collaborative virtual environments with multiple users, and shared systems with body and mind inputs.

4 Application domains

4.1 Overview

The research program of the Hybrid team aims at the next generations of virtual reality and 3D user interfaces, which could address both the “body” and the “mind” of the user. Novel interaction schemes are designed, for one or multiple users. We target better integrated systems and more compelling user experiences.

The applications of our research program correspond to the applications of virtual reality technologies which could benefit from the addition of novel body-based or mind-based interaction capabilities:

  • Industry: with training systems, virtual prototyping, or scientific visualization;
  • Medicine: with rehabilitation and reeducation systems, or surgical training simulators;
  • Entertainment: with the movie industry, content customization, video games or attractions in theme parks;
  • Construction: with virtual mock-up design and review, or historical/architectural visits;
  • Cultural Heritage: with acquisition, virtual excavation, virtual reconstruction and visualization.

5 Social and environmental responsibility

5.1 Impact of research results

A salient initiative launched in 2020 and carried out by Hybrid is the Inria Covid-19 project “VERARE”. VERARE is a unique and innovative concept implemented in record time thanks to a close collaboration between the Hybrid research team and the teams from the intensive care and physical and rehabilitation medicine departments of Rennes University Hospital. VERARE consists in using virtual environments and VR technologies for the rehabilitation of Covid-19 patients coming out of coma, weakened, and with strong difficulties in recovering walking. With VERARE, the patient is immersed in different virtual environments using a VR headset. He is represented by an “avatar” carrying out different motor tasks involving his lower limbs, for example: walking, jogging, avoiding obstacles, etc. Our main hypothesis is that the observation of such virtual actions, and the progressive resumption of motor activity in VR, will allow a quicker start to rehabilitation, as soon as the patient leaves the ICU. The patient can then carry out sessions in his room, or even from his hospital bed, in simple and secure conditions, hoping to obtain a final clinical benefit, either in terms of motor and walking recovery or in terms of hospital length of stay. The project started at the end of April 2020, and we deployed a first version of our application at the Rennes hospital in mid-June 2020, only two months later. Covid patients are now using our virtual reality application at the Rennes University Hospital, and the clinical evaluation of VERARE is expected to be completed in 2022.

6 Highlights of the year

6.1 Awards

  • “Société Informatique de France - Gilles Kahn” Best PhD Award, First Accessit - PhD thesis of Rebecca Fribourg
  • “Fondation Rennes 1” Best PhD Award - PhD thesis of Rebecca Fribourg
  • “Séphora Berrebi” Best PhD Award - PhD thesis of Rebecca Fribourg
  • “GDR IGRV” Best PhD Award - PhD thesis of Rebecca Fribourg
  • Best Paper Award at the ICAT-EGVE 2021 conference for the paper “Electrotactile Feedback For Enhancing Contact Information in Virtual Reality” 42.

7 New software and platforms

7.1 New software

7.1.1 #FIVE

  • Name:
    Framework for Interactive Virtual Environments
  • Keywords:
    Virtual reality, 3D, 3D interaction, Behavior modeling
  • Scientific Description:
    #FIVE (Framework for Interactive Virtual Environments) is a framework for the development of interactive and collaborative virtual environments. #FIVE was developed to answer the need for easier and faster design and development of virtual reality applications. #FIVE provides a toolkit that simplifies the declaration of possible actions and behaviours of objects in a VE. It also provides a toolkit that facilitates the setting and management of collaborative interactions in a VE. It supports the distribution of the VE across different setups. It also proposes guidelines to efficiently create a collaborative and interactive VE. The current implementation is in C# and comes with a Unity3D engine integration, compatible with the MiddleVR framework.
  • Functional Description:
    #FIVE contains software modules that can be interconnected and help in building interactive and collaborative virtual environments. The user can focus on the domain-specific aspects of his/her application (industrial training, medical training, etc) thanks to #FIVE's modules. These modules can be used in a vast range of domains using virtual reality applications and requiring interactive environments and collaboration, such as training (see the illustrative sketch after this list).
  • URL:
  • Publication:
  • Contact:
    Valerie Gouranton
  • Participants:
    Florian Nouviale, Valerie Gouranton, Bruno Arnaldi, Vincent Goupil, Carl-Johan Jorgensen, Emeric Goga, Adrien Reuzeau, Alexandre Audinot
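As an illustration of the kind of declaration #FIVE simplifies, the following minimal Python sketch shows one way to model objects that advertise their possible actions so an engine can match them against user interactions. All names and the structure are illustrative assumptions, not the actual #FIVE API (which is in C#):

    from dataclasses import dataclass, field

    @dataclass
    class InteractiveObject:
        # Hypothetical stand-in for a #FIVE-style object declaring its
        # possible actions, so an engine can match them with interactions.
        name: str
        capabilities: set = field(default_factory=set)  # e.g. {"open", "close"}

        def can(self, action: str) -> bool:
            return action in self.capabilities

    door = InteractiveObject("door", {"open", "close"})
    assert door.can("open") and not door.can("grasp")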

7.1.2 #SEVEN

  • Name:
    Sensor Effector Based Scenarios Model for Driving Collaborative Virtual Environments
  • Keywords:
    Virtual reality, Interactive Scenarios, 3D interaction
  • Scientific Description:
    #SEVEN (Sensor Effector Based Scenarios Model for Driving Collaborative Virtual Environments) is a model and an engine based on Petri nets extended with sensors and effectors, enabling the description and execution of complex and interactive scenarios.
  • Functional Description:
    #SEVEN enables the execution of complex scenarios for driving Virtual Reality applications. #SEVEN's scenarios are based on enhanced Petri net and state machine models which are able to describe and resolve intricate event sequences (see the minimal sketch after this list). #SEVEN comes with an editor for creating, editing, remotely controlling and running scenarios. #SEVEN is implemented in C# and can be used as a stand-alone application or as a library. An integration with the Unity3D engine, compatible with MiddleVR, also exists.
  • Release Contributions:
    Added state machine handling for scenario description, in addition to the already existing Petri net format. Improved the scenario editor.
  • URL:
  • Publications:
  • Contact:
    Valerie Gouranton
  • Participants:
    Florian Nouviale, Valerie Gouranton, Bruno Arnaldi, Vincent Goupil, Emeric Goga, Carl-Johan Jorgensen, Adrien Reuzeau, Alexandre Audinot
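To make the sensor/effector Petri net idea concrete, here is a minimal Python sketch of a scenario engine: a transition fires when its input places are marked and its sensor condition holds, and it then triggers its effector action. This is a simplified illustration under our own assumptions, not #SEVEN's C# implementation:

    class Transition:
        def __init__(self, inputs, outputs, sensor, effector):
            self.inputs, self.outputs = inputs, outputs
            self.sensor, self.effector = sensor, effector  # condition / action

    def step(marking, transitions):
        # Fire every enabled transition whose sensor condition holds.
        for t in transitions:
            if all(marking.get(p, 0) > 0 for p in t.inputs) and t.sensor():
                for p in t.inputs:
                    marking[p] -= 1
                for p in t.outputs:
                    marking[p] = marking.get(p, 0) + 1
                t.effector()  # act on the virtual environment
        return marking

    # Example: a scenario step that opens a door once the handle is grabbed.
    handle_grabbed = {"value": False}
    open_door = Transition(["start"], ["door_open"],
                           sensor=lambda: handle_grabbed["value"],
                           effector=lambda: print("door opens"))
    marking = {"start": 1}
    handle_grabbed["value"] = True
    step(marking, [open_door])  # fires: marking becomes {"start": 0, "door_open": 1}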

7.1.3 OpenVIBE

  • Keywords:
    Neurosciences, Interaction, Virtual reality, Health, Real time, Neurofeedback, Brain-Computer Interface, EEG, 3D interaction
  • Functional Description:
    OpenViBE is a free and open-source software platform devoted to the design, test and use of Brain-Computer Interfaces (BCI). The platform consists of a set of software modules that can be integrated easily and efficiently to design BCI applications. The key features of the OpenViBE software are its modularity, its high performance, its portability, its multiple-user facilities and its connection with high-end VR displays. The “Designer” tool of the platform enables users to build complete scenarios based on existing software modules, using a dedicated graphical language and a simple Graphical User Interface (GUI); a generic sketch of the kind of processing pipeline such scenarios chain together is given after this list. This software is available on the Inria Forge under the terms of the AGPL licence, and it was officially released in June 2009. Since then, the OpenViBE software has been downloaded more than 60,000 times, and it is used by numerous laboratories, projects, and individuals worldwide. More information, downloads, tutorials, videos and documentation are available on the OpenViBE website.
  • Release Contributions:

    Python2 support dropped in favour of Python3.

    New feature boxes: Riemannian geometry, Multimodal Graz visualisation, Artefact detection, Features selection, Stimulation validator.

    Support for Ubuntu 18.04 and Fedora 31.

    Contributions: Encephalan driver by Alexey Minin (UrFU); GTec Unicorn driver by Anton Andreev (Gipsa-Lab); pybox-manager box by Jimmy Leblanc & Yannis Bendi-Ouis (Polymont ITS).

  • News of the Year:

    Python2 support dropped in favour of Python3. New feature boxes: Riemannian geometry, Multimodal Graz visualisation, Artefact detection, Features selection, Stimulation validator. Support for Ubuntu 18.04 and Fedora 31.

  • URL:
  • Contact:
    Anatole Lecuyer
  • Participants:
    Cedric Riou, Thierry Gaugry, Anatole Lecuyer, Fabien Lotte, Jussi Tapio Lindgren, Laurent Bougrain, Maureen Clerc, Théodore Papadopoulo
  • Partners:
    INSERM, GIPSA-Lab
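For readers unfamiliar with BCI processing, the following sketch shows, in generic Python (NumPy/SciPy/scikit-learn, not the OpenViBE API), the kind of pipeline an OpenViBE scenario typically chains together graphically: band-pass filtering, band-power feature extraction, and classification. Signal shapes and parameters are illustrative assumptions:

    import numpy as np
    from scipy.signal import butter, filtfilt
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def bandpower_features(eeg, fs, band=(8.0, 12.0)):
        # eeg: (n_epochs, n_channels, n_samples) -> log band power per channel.
        b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, eeg, axis=-1)
        return np.log(np.mean(filtered ** 2, axis=-1))

    rng = np.random.default_rng(0)
    X = bandpower_features(rng.standard_normal((40, 8, 512)), fs=256)  # fake EEG
    y = rng.integers(0, 2, 40)                # two imagined-movement classes
    clf = LinearDiscriminantAnalysis().fit(X, y)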

7.2 New platforms

7.2.1 Immerstar

Participants: Florian Nouviale, Ronan Gaugne.

URL: Immersia website

With the two virtual reality technological platforms Immersia and Immermove, grouped under the name Immerstar, the team has access to high-level scientific facilities. This equipment benefits the research teams of the center and has allowed them to extend their local, national and international collaborations. The Immerstar platform received CPER-Inria funding for the 2015-2019 period, which enabled several important evolutions. In particular, in 2018, a haptic system covering the entire volume of the Immersia platform was installed, allowing various configurations from single haptic device usage to dual haptic device usage with either one or two users. In addition, a motion platform designed to introduce motion feedback for powered wheelchair simulations has also been incorporated (see Figure 2).

We celebrated the twentieth anniversary of the Immersia platform in November 2019 by inaugurating the new haptic equipment, with scientific presentations that gathered 150 participants, and visits for the support services during which we received 50 persons.

Building on this support, in 2020 we participated in a PIA3-Equipex+ proposal, CONTINUUM, involving 22 partners, which was successfully evaluated and obtained funding in 2021. The CONTINUUM project will create a collaborative research infrastructure of 30 platforms located throughout France, to advance interdisciplinary research based on interaction between computer science and the human and social sciences. Thanks to CONTINUUM, 37 research teams will develop cutting-edge research programs focusing on visualization, immersion, interaction and collaboration, as well as on human perception, cognition and behaviour in virtual/augmented reality, with potential impact on societal issues. CONTINUUM enables a paradigm shift in the way we perceive, interact, and collaborate with complex digital data and digital worlds by putting humans at the center of the data processing workflows. The project will empower scientists, engineers and industry users with a highly interconnected network of high-performance visualization and immersive platforms to observe, manipulate, understand and share digital data, real-time multi-scale simulations, and virtual or augmented experiences. All platforms will feature facilities for remote collaboration with other platforms, as well as mobile equipment that can be lent to users to facilitate onboarding.

The Immerstar platform is involved in a new National Research Infrastructure since the end of 2021. This new research infrastructure gathers the main platforms of CONTINUUM.

Immerstar is also involved in EUR Digisport led by University of Rennes 2 and PIA4 DemoES AIR led by University of Rennes 1.

Figure 2: Immersia platform: (Left) “Scale-One” haptic system for one or two users. (Right) Motion platform for a powered wheelchair simulation.

8 New results

8.1 Virtual Reality Tools and Usages

8.1.1 Towards “Avatar-Friendly” 3D Manipulation Techniques: Bridging the Gap Between Sense of Embodiment and Interaction in Virtual Reality

Participants: Diane Dewez, Ferran Argelaguet, Anatole Lécuyer.

Avatars, the users' virtual representations, are becoming ubiquitous in virtual reality applications. In this context, the avatar becomes the medium which enables users to manipulate objects in the virtual environment (see Figure 3). It also becomes the users' main spatial reference, which can not only alter their interaction with the virtual environment, but also their perception of themselves. In this work 34, we review and analyse the current state of the art for 3D object manipulation and the sense of embodiment. Our analysis is twofold. First, we discuss the impact that the avatar can have on object manipulation. Second, we discuss how the different components of a manipulation technique (i.e., input, control and feedback) can influence the user's sense of embodiment. Throughout the analysis, we crystallise our discussion with practical guidelines for VR application designers and we propose several research topics towards “avatar-friendly” manipulation techniques.

This work was done in collaboration with the MimeTIC team.

Figure 3
Figure 3: Action-perception loop: the user can control the avatar thanks to the input devices, so as to interact with the VE. In response to the user's actions, the VE provides feedback perceived by the user thanks to the output devices.

8.1.2 Understanding, Modeling and Simulating Unintended Positional Drift during Repetitive Steering Navigation Tasks in Virtual Reality

Participants: Hugo Brument, Ferran Argelaguet.

Virtual steering techniques enable users to navigate in larger Virtual Environments (VEs) than the physical workspace available. Even though these techniques do not require physical movement of the users (e.g. using a joystick and the head orientation to steer towards a virtual direction), recent work observed that users might unintentionally move in the physical workspace while navigating, resulting in Unintended Positional Drift (UPD). This phenomenon can be a safety issue since users may unintentionally reach the physical boundaries of the workspace while using a steering technique. In this context, as a necessary first step to improve the design of navigation techniques minimizing the UPD, this work aims at analyzing and modeling the UPD during a virtual navigation task 17. In particular, we characterize and analyze the UPD for a dataset containing the positions and orientations of eighteen users performing a virtual slalom task using virtual steering techniques. Participants wore a head-mounted display and had to follow three different sinusoidal-like trajectories (with low, medium and high curvature) using a torso-steering navigation technique. We analyzed the performed motions and proposed two UPD models: the first based on a linear regression analysis and the second based on a Gaussian Mixture Model (GMM) analysis. Then, we assessed both models through a simulation-based evaluation where we reproduced the same navigation task using virtual agents. Our results indicate the feasibility of using simulation-based evaluations to study UPD. The work concludes with a discussion of potential applications of the results in order to gain a better understanding of UPD during steering and therefore improve the design of navigation techniques by compensating for UPD.

This work was done in collaboration with the MimeTIC and Rainbow teams, and the University of Central Florida.
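The following Python sketch illustrates the second modeling approach named above: fitting a Gaussian Mixture Model to observed drift samples and sampling from it to drive virtual agents. The data here are synthetic and the feature choice (2D displacement per trial) is our own assumption, not the paper's exact formulation:

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(1)
    # Hypothetical drift samples: lateral/forward displacement (m) per trial.
    drift = np.vstack([rng.normal([0.05, 0.00], 0.02, (100, 2)),
                       rng.normal([0.15, 0.10], 0.05, (50, 2))])

    gmm = GaussianMixture(n_components=2, random_state=0).fit(drift)
    sampled_drift, _ = gmm.sample(10)  # plausible UPD for simulated agents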

8.1.3 Studying the Influence of Translational and Rotational Motion on the Perception of Rotation Gains in Virtual Environments

Participants: Hugo Brument, Ferran Argelaguet.

Rotation gains in Virtual Reality (VR) enable the exploration of wider Virtual Environments (VEs) compared to the workspace users have in VR setups. The perception of these gains has consequently been explored through multiple experimental conditions in order to improve redirected navigation techniques. While most studies consider rotations in which participants can rotate at the pace they desire but without translational motion, we have no information about the potential impact of translational and rotational motions on the perception of rotation gains. In this work 33, we estimated the influence of these motions and compared the perceptual thresholds of rotation gains through a user study (n=14), in which participants had to perform virtual rotation tasks at a constant rotation speed (see Figure 4). Participants had to determine whether their virtual rotation speed was faster or slower than their real one. We varied the translational optical flow (static or forward motion), the rotational speed (20, 30, or 40 deg/s), and the rotational gain (from 0.5 to 1.5). The main results are that rotation gains are less perceivable at lower rotation speeds and that translational motion makes detection more difficult at lower rotation speeds. Furthermore, the work provides insights into the users' gaze and body motion behaviour when exposed to rotation gains. These results contribute to the understanding of the perception of rotation gains in VEs and are discussed with a view to improving the implementation of rotation gains in redirection techniques.

Figure 4: Left - User wearing the HTC Vive Pro Eye HMD equipped with a wireless module, one HTC Vive tracker located at the pelvis and one HTC Vive controller. Right - User's point of view of the VE during the experiment. The black arrow indicates the direction of the turn. The sight and pink sphere were used for calibration purposes.

This work was done in collaboration with the MimeTIC team.
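As background for how a rotation gain manipulates the mapping between real and virtual rotation, the sketch below accumulates the user's real yaw change scaled by a gain; with a gain of 1.2, a real 90-degree turn yields a 108-degree virtual turn. This is a minimal illustration of the principle, not the experimental code:

    def virtual_yaw(previous_virtual_yaw, real_yaw_delta, gain):
        # Accumulate the user's real yaw change, scaled by the rotation gain.
        return previous_virtual_yaw + gain * real_yaw_delta

    yaw = 0.0
    for _ in range(90):              # user physically turns 90 deg in 1-deg steps
        yaw = virtual_yaw(yaw, 1.0, gain=1.2)
    print(yaw)                       # 108.0 deg perceived in the VE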

8.1.4 VR based Power Wheelchair Simulator: Usability Evaluation through a Clinically Validated Task with Regular Users

Participants: Guillaume Vailland, Florian Nouviale, Valérie Gouranton.

Power wheelchairs are one of the main solutions for people with reduced mobility to maintain or regain autonomy and a comfortable and fulfilling life. However, driving a power wheelchair safely is a difficult task that often requires training methods based on real-life situations. Although these methods are widely used in occupational therapy, they are often too complex to implement and unsuitable for some people with major difficulties. In this context, we collaborated with clinicians to develop a Virtual Reality based power wheelchair simulator 40. This simulator is an innovative training tool adaptable to any type of situation and impairment (see Figure 5). In this work, we present a clinical study in which 29 regular power wheelchair users were asked to complete a clinically validated task designed by clinicians under two conditions: driving in a virtual environment with our simulator and driving in real conditions with a real power wheelchair. The objective of this study is to compare performance between the two conditions and to evaluate the Quality of Experience provided by our simulator in terms of Sense of Presence and Cybersickness. Results show that participants complete the tasks in a similar amount of time in both the real and virtual conditions, using respectively a real power wheelchair and our simulator. Results also show that our simulator provides a high level of Sense of Presence and provokes only slight to moderate cybersickness discomfort, resulting in a valuable Quality of Experience.

Figure 5
Figure 5: Virtual Reality condition setup of our clinical study with our power wheelchair simulator.

This work was done in collaboration with the Rainbow team and the Pôle Saint Hélier.

8.1.5 Cubic Bézier Local Path Planner for Non-holonomic Feasible and Comfortable Path Generation

Participants: Guillaume Vailland, Valérie Gouranton.

In a transport context like power wheelchair navigation, passenger comfort should be a priority and influences the path planning strategy. For non-holonomic robot navigation, path planning algorithms such as the Rapidly-exploring Random Tree (RRT) rarely provide feasible and smooth paths without additional processing. In this work 41, we propose a local path planner which guarantees a bounded curvature value and continuous connections between piecewise cubic Bézier curves. To simulate and test this cubic Bézier local path planner, we developed a new RRT version (CBB-RRT*) which generates, on the fly, comfortable paths adapted to non-holonomic constraints (see Figure 6).

Figure 6
Figure 6: CBB-RRT* run within a complex environment. The red triangle represents the start state and its direction, the red circle represents the goal, and the blue path represents the solution path.

This work was done in collaboration with the Rainbow team.
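The sketch below illustrates the curvature check at the heart of a curvature-bounded cubic Bézier planner: evaluate the curve and its derivatives, compute the curvature, and accept a segment only if the maximum stays below a comfort bound. The control points and bound are illustrative assumptions, not the CBB-RRT* implementation:

    import numpy as np

    def bezier(P, t):
        # Cubic Bezier point and first/second derivatives at t in [0, 1].
        P0, P1, P2, P3 = P
        b = (1-t)**3*P0 + 3*(1-t)**2*t*P1 + 3*(1-t)*t**2*P2 + t**3*P3
        d1 = 3*((1-t)**2*(P1-P0) + 2*(1-t)*t*(P2-P1) + t**2*(P3-P2))
        d2 = 6*((1-t)*(P2-2*P1+P0) + t*(P3-2*P2+P1))
        return b, d1, d2

    def max_curvature(P, samples=200):
        kappas = []
        for t in np.linspace(0.0, 1.0, samples):
            _, d1, d2 = bezier(P, t)
            cross = d1[0]*d2[1] - d1[1]*d2[0]      # scalar 2D cross product
            kappas.append(abs(cross) / max(np.linalg.norm(d1)**3, 1e-9))
        return max(kappas)

    P = np.array([[0, 0], [1, 0], [1, 1], [2, 1]], dtype=float)
    assert max_curvature(P) < 2.5  # accept the segment only under a comfort bound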

8.1.6 A Survey on Affective and Cognitive VR

Participants: Tiffany Luong, Ferran Argelaguet, Anatole Lécuyer.

In Virtual Reality (VR), users can be immersed in emotionally intense and cognitively engaging experiences. Yet, despite strong interest from scholars and a large amount of work associating VR and Affective and Cognitive States (ACS), there is a clear lack of a structured and systematic form in which this research can be classified. We define “Affective and Cognitive VR” to relate to works which (1) induce ACS, (2) recognize ACS, or (3) exploit ACS by adapting virtual environments based on ACS measures. This survey 29 clarifies the different models of ACS, presents the methods for measuring them with their respective advantages and drawbacks in VR, and showcases Affective and Cognitive VR studies done in an immersive virtual environment (IVE) in a non-clinical context. Our article covers the main research lines in Affective and Cognitive VR. We provide a comprehensive list of references with the analysis of 63 research articles and summarize future directions (see Figure 7).

Figure 7
Figure 7: Loop depicting the adaptation of VR based on users' ACS, as well as the categorization of Affective and Cognitive VR studies in the survey. Stimuli in VR first induce AS or CS responses to the users. Then, these responses are measured from the users, and used to recognize the user's AS or CS. Finally, the recognized AS or CS is exploited to adjust the VR stimuli and parameters in real-time.

This work was done in collaboration with the IRT b<>com.

8.1.7 Training situational awareness for scrub nurses: Error recognition in a virtual operating room

Participants: Bruno Arnaldi, Valérie Gouranton.

Virtual reality simulation provides interesting opportunities to train nurses in a safe environment. While the virtual operating room has proven to be a useful training tool for technical skills, it has been less studied for non-technical skills. This study aimed to assess error recognition in a virtual operating room, using a simulation scenario designed to improve situation awareness (see Figure 8). Eighteen scrub-nurse students and eight expert scrub nurses took part in the experiment. They were immersed in a virtual operating room and reported any errors they observed. There were nineteen errors with various degrees of severity. Measures were retrieved from logs (number of errors, time for detection, movements) and from questionnaires (situation awareness, subjective workload, anxiety and user experience). The results showed that the participants who detected most errors had a higher level of situation awareness, detected high-risk errors faster and felt more immersed in the virtual operating room than those detecting fewer errors. They also felt the workload was lighter and experienced more satisfaction. Students explored the operating room more than experts did and detected more errors, especially those with moderate risk. Debriefings confirmed that virtual simulation is acceptable to trainees and motivates them. It also provides useful and original material for debriefings 16.

This work was done in collaboration with LP3C, Rennes and LTSI, Inserm, Rennes.

Figure 8
Figure 8: The operating room during VR interaction.

8.1.8 Visual feedback improves movement illusions induced by tendon vibration after chronic stroke

Participants: Salomé Le Franc, Mélanie Cogné, Anatole Lécuyer.

Illusion of movement induced by tendon vibration is commonly used in rehabilitation and seems valuable for motor rehabilitation after stroke. The aim was to study whether congruent visual cues using Virtual Reality (VR) could enhance the illusion of movement induced by tendon vibration of the wrist among participants with stroke. We included 20 chronic stroke participants. They experienced tendon vibration of their wrist inducing an illusion of movement under 3 VR visual conditions: a congruent moving virtual hand (Moving condition); a static virtual hand (Static condition); or no virtual hand at all (Hidden condition). They evaluated the intensity of the illusory movement and answered a questionnaire about their preferred condition. The Moving condition was significantly superior to the Hidden condition and to the Static condition in terms of illusion of movement (p<0.001) and wrist extension (p<0.001), and was considered the best condition for increasing the illusion of movement by 70% of the participants. This study showed the value of using congruent cues in VR to enhance the consistency of the illusion of movement among participants after stroke, regardless of their clinical severity.

Figure 9: Illustrations of the equipment (example of the positioning of a left arm with a paresis after right stroke). a-b) Installation of the vibrator; the forearm was covered with a cloth. c-d-e) Presentation of the 3 virtual visual conditions (respectively Moving, Hidden, Static condition); the arrow is not visible during the experiment. f) Protractor to measure the sensation of wrist displacement; “-90°” signifies a maximal wrist extension for the left upper limb; the descriptions “values of degree” and “wrist extension, wrist flexion” are not seen by the participant during the trial. g) Chronology of the trial.

8.1.9 Immersive VR neuropsychological assessment battery of everyday cognitive functions

Participants: Panagiotis Kourtesis.

The studies 22, 24 of this work examined the validity, usability, and utility of the VR Everyday Assessment Lab (VR-EAL). Clinical and research tools involving immersive VR may bring several advantages to cognitive neuroscience and neuropsychology. However, there are some technical and methodological pitfalls. The American Academy of Clinical Neuropsychology (AACN) and the National Academy of Neuropsychology (NAN) raised 8 key issues pertaining to Computerized Neuropsychological Assessment Devices: (1) safety and effectiveness; (2) the identity of the end-user; (3) the technical hardware and software features; (4) privacy and data security; (5) the psychometric properties; (6) examinee issues; (7) the use of reporting services; and (8) the reliability of the responses and results. The VR-EAL was found to be the first immersive VR neuropsychological battery with enhanced ecological validity for the assessment of everyday cognitive functions, offering a pleasant testing experience without inducing cybersickness. The VR-EAL met the criteria of the NAN and AACN, addressed the methodological pitfalls, and brought advantages for neuropsychological testing. These studies demonstrated the research and clinical utility of VR methods in cognitive neuroscience and neuropsychology.

8.1.10 An ecologically valid examination of event-based and time-based prospective memory using immersive VR

Participants: Panagiotis Kourtesis.

The studies 23, 21 of this work examined everyday prospective memory ability and functioning. Using the VR-EAL, an immersive VR neuropsychological assessment tool, prospective memory in everyday life was examined by comparing performance on diverse prospective memory tasks (i.e., focal and non-focal event-based, and time-based tasks) and identifying the cognitive functions which predict everyday prospective memory functioning. The length of the delay between encoding and retrieving the prospective memory intention, and not the type of prospective memory task, appears to play a central role in everyday prospective memory. Also, everyday prospective memory functioning is predominantly facilitated by episodic memory, visuospatial attention, and executive functions. These findings further suggest the importance of ecological validity in the study of prospective memory, which may be achieved using immersive VR paradigms (see Figure 10).

Figure 10
Figure 10: VR-EAL Prospective Memory Tasks

8.2 Augmented Reality Tools and Usages

8.2.1 Being an Avatar “for Real”: A Survey on Virtual Embodiment in Augmented Reality

Participants: Adélaïde Genay, Anatole Lécuyer.

Virtual self-avatars have been increasingly used in Augmented Reality (AR), where one can see virtual content embedded into physical space. However, little is known about the perception of self-avatars in such a context. The possibility that their embodiment could be achieved in a similar way as in Virtual Reality opens the door to numerous applications in education, communication, entertainment, or the medical field. This article reviews the literature covering the embodiment of virtual self-avatars in AR. The aim of this work is (i) to guide readers through the different options and challenges linked to the implementation of AR embodiment systems, (ii) to provide a better understanding of AR embodiment perception by classifying the existing knowledge, and (iii) to offer insight on future research topics and trends for AR and avatar research 19. To do so, we introduce a taxonomy of virtual embodiment experiences by defining a “body avatarization” continuum. The presented knowledge suggests that the sense of embodiment evolves in the same way in AR as in other settings, but this possibility has yet to be fully investigated. We suggest that, whilst it is yet to be well understood, the embodiment of avatars has a promising future in AR and conclude by discussing possible directions for research.

This work was done in collaboration with the POTIOC team.

8.2.2 Virtual, Real or Mixed: How Surrounding Objects Influence the Sense of Embodiment in Optical See-Through Experiences?

Participants: Adélaïde Genay, Anatole Lécuyer.

Following the literature analysis, this paper studies the sense of embodiment of virtual avatars in Mixed Reality (MR) environments visualized with an Optical See-Through display 20. We investigated whether the content of the surrounding environment could impact the user's perception of their avatar, when embodied from a first-person perspective. To do so, we conducted a user study comparing the sense of embodiment toward virtual robot hands in three environment contexts which included progressive quantities of virtual content: real content only, mixed virtual/real content, and virtual content only (see Figure 11). Taken together, our results suggest that users tend to accept virtual hands as their own more easily when the environment contains both virtual and real objects (mixed context), allowing them to better merge the two “worlds”. We discuss these results and raise research questions for future work to consider.

Figure 11
Figure 11: The three experimental conditions represented from the participant's point of view (Left) The “VIRTUAL” condition where all the objects were virtual (Middle) The “MIXED” condition where there were both real and virtual objects mixed (Right) The “REAL” condition where all objects were real. The 3D rendering allowed virtual objects to appear having the same 3D volume as their real counterparts.

This work was done in collaboration with the POTIOC team.

8.2.3 Needs Model for an Autonomous Agent during Long-term Simulations

Participants: Lysa Gramoli, Jérémy Lacoche, Anthony Foulonneau, Valérie Gouranton, Bruno Arnaldi.

Simulating human behavior is still a challenge. Yet, this feature is crucial to make virtual humans more believable. Therefore, many approaches have been proposed in order to find ways to faithfully simulate human behavior. One of the key features to make a virtual agent more believable is the simulation of needs such as hunger or tiredness. Unfortunately, most of the existing approaches to needs simulation do not really address long-term simulations, where problems such as time drift may appear. Yet, this kind of simulation is useful in many fields like video games or virtual data generation. In this work 43, we focus on the creation of a needs model designed for long-term simulations. According to this model, needs can evolve over several simulated days without interruption. This model is configured to obtain a proper balance between control and autonomy in order to produce coherent behavior over long periods. This work deals with the key features needed to set up this needs model and introduces some preliminary results to check the coherence of the agent's behavior. To demonstrate our model, a smart house simulator was implemented with the Unity engine (see Figure 12). In this simulator, the agent executes predefined actions after choosing the most appropriate activity judging from the priorities of its needs. The generated virtual humans could be used to enrich augmented reality systems by providing believable virtual humans.

This work was done in collaboration with Orange.

Figure 12
Figure 12: On the left, the Simulator. On the right, the agent moving in it.
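A minimal sketch of such a needs model is given below: each need grows at its own rate, is clamped to avoid unbounded drift over long simulations, and the agent selects the activity answering the most pressing need. Names, rates and the clamping strategy are illustrative assumptions, not the model's actual parameters:

    NEEDS = {"hunger": 0.0, "tiredness": 0.0}
    RATES = {"hunger": 0.8, "tiredness": 0.5}       # increase per simulated hour
    ACTIVITIES = {"hunger": "eat", "tiredness": "sleep"}

    def tick(needs, hours=1.0):
        for k in needs:
            needs[k] = min(needs[k] + RATES[k] * hours, 10.0)  # clamp: no drift

    def choose_activity(needs):
        return ACTIVITIES[max(needs, key=needs.get)]  # most pressing need wins

    for hour in range(24):                            # one simulated day
        tick(NEEDS)
        activity = choose_activity(NEEDS)
        NEEDS[max(NEEDS, key=NEEDS.get)] = 0.0        # performing it satisfies it
        print(hour, activity)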

8.2.4 “Doctor, please”: Educating nurses to speak up with interactive digital simulation tablets

Participants: Bruno Arnaldi, Valérie Gouranton.

Courses are developed to train on open communication. This study focuses on speaking-up for scrub nurses. The scenario is implemented on digital tablets, with vignettes involving problematic behaviours of a colleague with the same or different status. The nurses (N = 33) were asked whether they would point out the error, whether they would be embarrassed, and how they would do it. Nurses expressed greater embarrassment with a colleague of a different status. This is confirmed by their phrasing and the strategies they reported when speaking to the surgeon. The scenario was well accepted and could be used to train other health professionals 15.

This work was done in collaboration with LP3C, Rennes and LTSI, Inserm, Rennes

Figure 13
Figure 13: The virtual reality environment on tablet

8.3 Haptic Feedback

8.3.1 Electrotactile Feedback For Enhancing Contact Information in Virtual Reality

Participants: Sebastian Vizcay, Panagiotis Kourtesis, Ferran Argelaguet.

In the context of the TACTILITY project, we have been exploring the use of electrotactile feedback technology to enrich VR interactions. In this work 42, we explored a wearable electrotactile feedback system to enhance contact information for mid-air interactions with virtual objects (see Figure 14). In particular, we propose the use of electrotactile feedback to render the interpenetration distance between the user's finger and the virtual content being touched. Our approach consists of modulating the perceived intensity (through frequency and pulse width modulation) of the electrotactile stimuli according to the registered interpenetration distance. In a user study (N=21), we assessed the performance of four different interpenetration feedback approaches: electrotactile-only, visual-only, electrotactile and visual, and no interpenetration feedback. First, the results showed that contact precision and accuracy were significantly improved when using interpenetration feedback. Second, and more interestingly, there were no significant differences between visual and electrotactile feedback when the calibration was optimized and the user was familiarized with electrotactile feedback. Taken together, these results suggest that electrotactile feedback could be an efficient replacement for visual feedback for enhancing contact information in virtual reality, avoiding the need for active visual focus and the rendering of additional visual artefacts.

This work was done in collaboration with the Rainbow team.

Figure 14
Figure 14: Electrotactile Feedback for Contact in VR.
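The modulation principle described above can be sketched as a simple mapping from interpenetration depth to stimulation frequency and pulse width; the ranges below are illustrative assumptions, not the calibrated values from the study:

    def electrotactile_params(depth_mm, max_depth_mm=10.0,
                              f_range=(30.0, 200.0), pw_range=(100.0, 500.0)):
        # Linearly modulate frequency (Hz) and pulse width (us) with depth.
        x = min(max(depth_mm / max_depth_mm, 0.0), 1.0)   # normalized depth
        freq = f_range[0] + x * (f_range[1] - f_range[0])
        pulse_width = pw_range[0] + x * (pw_range[1] - pw_range[0])
        return freq, pulse_width

    print(electrotactile_params(2.5))  # shallow contact -> weaker stimulus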

8.3.2 PUMAH: Pan-tilt Ultrasound Mid-Air Haptics

Participants: Anatole Lécuyer.

Focused ultrasound mid-air haptic interfaces are ideal for providing tactile feedback in Virtual Reality (VR), as they do not require the user to be tethered to, hold, or wear any device. Using an array of ultrasound emitters, they generate focused points of oscillating high pressure in mid-air, eliciting vibrotactile sensations when encountering a user's skin. These arrays feature a large vertical workspace, but are not capable of displaying stimuli far beyond their horizontal limits, severely limiting their workspace in the lateral dimensions. This demo presents the PUMAH 38, a low-cost two-degrees-of-freedom robotic system rotating a focused ultrasound array around the pan and tilt axes, enabling multi-directional tactile feedback and increasing the array's workspace volume more than 14-fold.

This work was done in collaboration with the Rainbow team.
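The core computation such a pan-tilt system needs is aiming the array at a tracked target (e.g., the user's palm). The sketch below derives pan and tilt angles from a 3D target position; the axis conventions are illustrative assumptions, not PUMAH's actual control code:

    import math

    def pan_tilt(target, array_origin=(0.0, 0.0, 0.0)):
        # Pan/tilt angles (rad) pointing from the array origin to the target.
        dx, dy, dz = (t - o for t, o in zip(target, array_origin))
        pan = math.atan2(dy, dx)                   # rotation about the vertical axis
        tilt = math.atan2(dz, math.hypot(dx, dy))  # elevation above the array plane
        return pan, tilt

    print(pan_tilt((0.3, 0.2, 0.4)))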

8.3.3 WeATaViX: Wearable Actuated Tangibles for Virtual Reality Experiences

Participants: Thomas Howard, Xavier de Tanguy, Anatole Lécuyer.

This demo 45 presented WeATaViX: a wearable haptic interface for natural manipulation of tangible objects in Virtual Reality (VR). The device uses an interaction concept between encounter-type and tangible haptics, bringing a tangible object in and out of contact with the user's palm, allowing the rendering of making and breaking of contact with virtual objects, as well as grasping and manipulation. The demo places the user in a virtual orchard in which they can experience the availability of tangible counterparts to any grasped interactive object in the virtual environment, as they pick and place apples, play fetch with a virtual dog and his ball, and even actuate the virtual handle of a juice machine preparing fresh virtual organic apple juice (see Figure 15).

This work was done in collaboration with the Rainbow team.

Figure 15
Figure 15: The WeATaViX interface - From Left to Right: CAD assembly, disengaged device (free hand), engaged device (user grasps a virtual object)

8.3.4 Capacitive Sensing for Improving Contact Rendering with Tangible Objects in VR

Participants: Xavier de Tinguy, Anatole Lécuyer.

In order to enhance the coupling between virtual and tangible surfaces, in this work we propose to combine tracking information from a tangible object instrumented with capacitive sensors and an optical tracking system 18. This system aims to improve contact rendering when interacting with tangibles in VR (see Figure 16). A human-subject study shows that combining capacitive sensing with optical tracking significantly improves the visuohaptic synchronization and immersion of the VR experience.

This work was done in collaboration with the Rainbow team.

Figure 16
Figure 16: Representative issues while using standard optical tracking systems (left) vs. our integrated capacitive-based approach (right). By combining tracking information from standard optical tracking systems with proximity information from a capacitive sensor, we are able to re-target the virtual fingertip toward the virtual surface, achieving a better synchronization between tangible and virtual contacts.
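The fusion idea can be sketched as follows: when the capacitive sensor reports near-zero proximity (contact), the virtual fingertip is retargeted onto the tangible surface even if optical tracking still places it slightly off. The threshold and linear blending below are illustrative assumptions, not the system's actual parameters:

    def fused_fingertip(optical_pos, surface_pos, capacitive_distance_mm,
                        snap_below_mm=3.0):
        # Retarget the fingertip toward the surface as capacitive distance -> 0.
        if capacitive_distance_mm >= snap_below_mm:
            return optical_pos                        # trust optical tracking alone
        w = capacitive_distance_mm / snap_below_mm    # 0 at contact, 1 at threshold
        return tuple(w * o + (1 - w) * s
                     for o, s in zip(optical_pos, surface_pos))

    print(fused_fingertip((0.00, 0.10, 0.02), (0.00, 0.10, 0.00), 0.5))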

8.3.5 “Haptics On-Demand”: A Survey on Encountered-Type Haptic Displays

Participants: Victor Rodrigo Mercado, Anatole Lécuyer.

Encountered-Type Haptic Displays (ETHDs) provide haptic feedback by positioning a tangible surface for the user to encounter. This permits users to freely elicit haptic feedback from a surface during a virtual simulation. ETHDs differ from most current haptic devices, which rely on an actuator always in contact with the user. This work 31 describes and analyzes the different research efforts carried out in this field. In addition, we further analyze the ETHD literature concerning definitions, history, hardware, the haptic perception processes involved, interactions and applications. The work finally proposes a formal definition of ETHDs, a taxonomy for classifying hardware types, and an analysis of the haptic feedback used in the literature. Taken together, the overview provided by this survey intends to encourage future work in the ETHD field.

This work was done in collaboration with the Rainbow team.

8.3.6 ENTROPiA: Towards Infinite Surface Haptic Displays in Virtual Reality Using Encountered-Type Rotating Props

Participants: Victor Rodrigo Mercado, Anatole Lécuyer.

In this work 30 we propose an approach towards an infinite-surface haptic display. Our approach, named ENcountered-Type ROtating Prop Approach (ENTROPiA), is based on a spinning cylindrical prop attached to a robot's end-effector and serving as an encountered-type haptic display (ETHD). This type of haptic display permits users to make unconstrained, free-hand contact with a surface that the robotic device positions for them to encounter (see Figure 17). In our approach, the sensation of touching a virtual surface is given by an interaction technique that couples the sliding movement of the prop under the user's finger with their tracked hand location, establishing a path to be explored. This approach enables large motions for rendering larger surfaces, permits rendering multi-textured haptic feedback, and extends the ETHD approach with large motions and sliding/friction sensations. As part of our contribution, a proof of concept was designed to illustrate the approach. A user study assessing its perception showed good performance in rendering the sensation of touching a large flat surface. Our approach could be used to render large haptic surfaces in applications such as rapid prototyping for automobile design.

This work was done in collaboration with the Rainbow team.

Figure 17
Figure 17: Global setup of our system. Our proposed rotating prop (bottom-center) can provide a large haptic surface display together with different textures represented in a VE (top-center).
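
The core coupling can be illustrated with a one-line kinematic relation (our sketch; the actual ENTROPiA control law is described in the paper): for the spinning prop to feel like a flat surface sliding under the finger, its tangential speed must match the hand's lateral speed over the virtual surface.

```python
# Minimal sketch of the surface-speed coupling behind a rotating-prop ETHD
# (our illustration, not ENTROPiA's published controller): the prop's
# tangential velocity at its surface equals angular velocity times radius,
# so matching the hand's lateral speed fixes the required rotation rate.
def prop_angular_velocity(hand_speed: float, prop_radius: float) -> float:
    """Angular velocity (rad/s) so the prop surface slides at hand_speed (m/s)."""
    return hand_speed / prop_radius

# Example: a hand sweeping at 0.15 m/s over a prop of 3 cm radius.
omega = prop_angular_velocity(0.15, 0.03)
print(f"{omega:.1f} rad/s")   # 5.0 rad/s, i.e. about 0.8 revolutions per second
```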

8.3.7 Alfred: the Haptic Butler On-Demand Tangibles for Object Manipulation in Virtual Reality using an ETHD

Participants: Victor Rodrigo Mercado, Ferran Argelaguet, Anatole Lécuyer.

We present “Alfred”, a novel haptic paradigm for object manipulation in 3D immersive virtual reality (VR) 39. It uses a robotic manipulator to move tangible objects in its workspace such that they match the pose of the virtual objects to be interacted with. Users can then naturally touch, grasp and manipulate a virtual object while feeling congruent and realistic haptic feedback from the tangible proxy. The tangible proxies can detach from the robot, allowing natural and unconstrained manipulation in the 3D virtual environment (VE). When a manipulated virtual object comes into contact with the virtual environment, the robotic manipulator acts as an encountered-type haptic display (ETHD), positioning itself so as to render the reaction forces of the environment onto the manipulated physical object. Here, we discuss the concept behind this novel approach and present a simplified prototype using a single detachable tangible proxy supported by a UR5 industrial robot. Through illustrative use cases in VR and a preliminary performance evaluation, we discuss implications for robot control and the design of interaction techniques. We show that Alfred is adaptable to a wide range of virtual environments and interaction scenarios, making it a promising approach for haptic-enabled manipulation in VR, although system latency remains a limitation to be addressed.

This work was done in collaboration with the Rainbow team.
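
The butler behavior can be summarized as a small state machine, sketched below under our own assumptions (state names and transitions are illustrative, not the published controller): the robot delivers the proxy to the virtual object's pose, releases it for free manipulation, and re-engages as an ETHD when the virtual object contacts the environment.

```python
# Hedged sketch of the "haptic butler" logic (our assumption of the states
# and transitions, not Alfred's actual control code).
from enum import Enum, auto

class ButlerState(Enum):
    DELIVER = auto()    # move the proxy to match the virtual object's pose
    DETACHED = auto()   # user manipulates the detached proxy freely
    RENDER = auto()     # robot braces the proxy to render contact forces

def next_state(state, user_grasped, virtual_contact):
    if state is ButlerState.DELIVER and user_grasped:
        return ButlerState.DETACHED
    if state is ButlerState.DETACHED and virtual_contact:
        return ButlerState.RENDER
    if state is ButlerState.RENDER and not virtual_contact:
        return ButlerState.DETACHED
    return state

# Example transition: the user grasps the delivered proxy.
s = next_state(ButlerState.DELIVER, user_grasped=True, virtual_contact=False)
print(s)   # ButlerState.DETACHED
```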

8.4 Brain Computer Interfaces

8.4.1 Influence of the visuo-proprioceptive illusion of movement and motor imagery of the wrist on EEG cortical excitability among healthy participants

Participants: Salomé Le Franc, Mélanie Cogné, Anatole Lécuyer.

Motor Imagery (MI) is a powerful tool to stimulate the sensorimotor areas of the brain. The aim of this study 25 was to evaluate whether an illusion of movement induced by visuo-proprioceptive immersion (VPI), combining tendon vibration (TV) and a virtual moving hand, together with MI tasks could be more efficient than VPI alone or MI alone in modulating cortical excitability assessed using Electroencephalography (EEG). We recorded EEG signals from 20 healthy participants in 3 different conditions: MI tasks involving their non-dominant wrist (MI condition); VPI alone (VPI condition); and VPI with MI tasks (combined condition). Our main judgment criterion was the Event-Related Desynchronization (ERD) threshold in sensorimotor areas in each condition. The combined condition induced a greater change in the ERD percentage than the MI condition alone, but no significant difference was found between the combined and the VPI condition (p=0.07). This study demonstrates the interest of using visuo-proprioceptive immersion with MI, rather than MI alone, to increase excitability in the motor areas of the brain.
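
For readers unfamiliar with the measure, the sketch below shows the classical band-power formulation of ERD (a minimal Pfurtscheller-style illustration; the study's actual pipeline and threshold criterion are described in the paper):

```python
# Sketch of the classical ERD computation (our minimal illustration, not the
# study's full pipeline): band-pass power during the task is compared to a
# resting baseline, per channel; negative values indicate desynchronization.
import numpy as np

def erd_percent(baseline: np.ndarray, task: np.ndarray) -> float:
    """ERD% from two band-pass-filtered EEG segments of the same channel."""
    p_ref = np.mean(baseline ** 2)   # mean band power at rest
    p_act = np.mean(task ** 2)       # mean band power during motor imagery
    return 100.0 * (p_act - p_ref) / p_ref

# Example with synthetic mu-band-like signals: task power halved vs. baseline.
rng = np.random.default_rng(0)
baseline = rng.normal(0, 1.0, 2000)
task = rng.normal(0, np.sqrt(0.5), 2000)
print(f"ERD = {erd_percent(baseline, task):.0f}%")   # about -50%
```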

8.4.2 The impact of Neurofeedback on effective connectivity networks in chronic stroke patients: An exploratory study

Participants: Mathis Fleury, Anatole Lécuyer.

In this work 28 we assessed the impact of EEG-fMRI Neurofeedback (NF) training on connectivity strength and direction in bilateral motor cortices in chronic stroke patients. Most studies using NF or brain-computer interfaces for stroke rehabilitation have assessed treatment effects by focusing on the successful activation of targeted cortical regions. However, given the crucial role of brain network reorganization in stroke recovery, our broader aim was to assess connectivity changes after an NF training protocol targeting localized motor areas. We considered changes in fMRI connectivity after multi-session EEG-fMRI NF training targeting ipsilesional motor areas in nine stroke patients, applying the Dynamic Causal Modeling and Parametric Empirical Bayes frameworks to estimate directed connectivity changes over a motor network including both ipsilesional and contralesional premotor, supplementary, and primary motor areas. Our results indicate that NF upregulation of the targeted areas (ipsilesional supplementary and primary motor areas) not only modulated activation patterns, but also had a more widespread impact on fMRI bilateral motor networks. In particular, inter-hemispheric connectivity between premotor and primary motor regions decreased, and ipsilesional self-inhibitory connections were reduced in strength, indicating an increase in activation during the NF motor task. To the best of our knowledge, this is the first work that investigates fMRI connectivity changes elicited by training of localized motor targets in stroke. Our results open new perspectives in the understanding of the large-scale effects of NF training and in the design of more effective NF strategies based on the pathophysiology underlying stroke-induced deficits.

8.5 Cultural Heritage

8.5.1 Immersive Volumetric Point Cloud Manipulation for Cultural Heritage

Participants: Rafik Drissi, Ronan Gaugne, Valérie Gouranton.

In this work we present a framework for immersive and interactive 3D manipulation of volumetric point clouds in virtual reality 35. The framework was designed to meet the needs of cultural heritage experts, such as archaeologists or curators, working on cultural heritage artifacts. We propose a display infrastructure associated with a set of tools that allows users from the cultural heritage domain to interact directly with the point clouds within their study process (see Figure 18). The resulting framework allows immersive navigation, interaction, and real-time segmentation.

Figure 18
Figure 18: Volumetric point cloud rendering and manipulation for a cat mummy.
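
As an illustration of the kind of direct interaction such a framework affords, the sketch below (our assumption of a typical selection tool, not the framework's actual code) selects the points of a cloud falling within a hand-held spherical brush:

```python
# Minimal sketch of sphere-brush selection over a volumetric point cloud
# (our illustration; the framework's real segmentation tools are richer).
import numpy as np

def sphere_select(points: np.ndarray, center: np.ndarray, radius: float) -> np.ndarray:
    """Boolean mask of the points within `radius` of the controller position."""
    return np.linalg.norm(points - center, axis=1) <= radius

cloud = np.random.rand(100_000, 3)            # stand-in for a CT-derived cloud
mask = sphere_select(cloud, np.array([0.5, 0.5, 0.5]), 0.1)
print(mask.sum(), "points selected")
```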

8.5.2 Interaction with the Cagniard-Latour siren in Virtual Reality

Participants: Ronan Gaugne, Valérie Gouranton.

The siren, by Charles Cagniard de La Tour, is a 19th-century acoustic measuring instrument which provided a method for evaluating the frequency of sounds. Invented in 1820, it was first built in 1845 (see Figure 19). It produces sound by means of a mechanism based on a disc pierced with equidistant holes, actuated by a flow of air. A counter associated with the mechanism makes it possible to measure the number of revolutions of the disc. The University of Rennes 1 has such a siren in its scientific collection (Bernard, 2018). The team in charge of the conservation and promotion of this collection has joined the multidisciplinary research project ANR-FRQSC INTROSPECT to propose a functional digital replica of the siren for the purpose of scientific mediation 36.

Figure 19
Figure 19: Left: the real siren. Right: the siren in VR.
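
The measurement principle admits a simple worked example (our illustration, not part of the replica's code): each revolution of a disc with n holes releases n puffs of air, so the emitted pitch equals the number of holes times the rotation rate read from the counter.

```python
# Worked example of the siren's measurement principle (our illustration):
# frequency = number of holes x revolutions per second, with the revolution
# count read from the siren's mechanical counter over a timed interval.
def siren_frequency(n_holes: int, revolutions: float, duration_s: float) -> float:
    """Pitch in Hz from the counter reading over a timed interval."""
    return n_holes * (revolutions / duration_s)

# Example: a 12-hole disc counted at 2200 revolutions over 60 seconds.
print(f"{siren_frequency(12, 2200, 60):.0f} Hz")   # 440 Hz, concert A
```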

8.5.3 Perishable containers in the context of cremation: contribution of computed tomography

Participants: Ronan Gaugne.

This work reports on experiments focused on the detection and identification of organic vessels used in a cremation context 26. We tested the integration of tomographic images into the excavation and recording protocols of a cinerary deposit in order to assess their contributions and challenges. The two case studies presented are dated to the Middle Bronze Age and are part of the ANR project “Introspection of archaeological material in the digital age”, which investigates interactive digital introspection methods combining computed tomography with 3D visualization technologies: virtual reality, tangible interactions, and 3D printing (see Figure 20).

Figure 20
Figure 20: Reconstitution of a perishable vessel inside a funerary vase from the site of Bono, France

9 Bilateral contracts and grants with industry

9.1 Grants with Industry

InterDigital

Participants: Nicolas Olivier [contact], Ferran Argelaguet [contact].

This grant started in February 2019. It supports Nicolas Olivier's CIFRE PhD program with the InterDigital company on “Avatar Stylization”. This PhD is co-supervised with the MimeTIC team.

Orange Labs

Participants: Lysa Gramoli [contact], Bruno Arnaldi [contact], Valérie Gouranton [contact].

This grant started in October 2020. It supports Lysa Gramoli's CIFRE PhD program with Orange Labs on “Simulation of autonomous agents in connected virtual environments”.

Sogea Bretagne

Participants: Vincent Goupil [contact], Bruno Arnaldi [contact], Valérie Gouranton [contact].

This grant started in October 2020. It supports Vincent Goupil's CIFRE PhD program with Sogea Bretagne on “Hospital 2.0: Generation of Virtual Reality Applications by BIM Extraction”.

10 Partnerships and cooperations

10.1 International initiatives

10.1.1 ANR-FRQSC INTROSPECT

Participants: Valérie Gouranton [contact], Bruno Arnaldi, Ronan Gaugne, Flavien Lécuyer, Adrien Reuzeau.

  • Title:
    INTROSPECTion of archaeological data in the digital age
  • Duration:
    Sep. 2016 - Jun. 2021
  • Coordinator:
    INSA Rennes (France)
  • Partners:
    • INSA Rennes (France)
    • Université Laval (Québec, Canada)
    • Inrap (France)
    • INRS ETE (Québec, Canada)
    • CNRS (France)
    • Université de Rennes 1 (France)
    • Image ET (France)
    • Musée des Abenakis (Odanak, Québec, Canada)
    • Ilôt de Palais (Québec, Canada)
    • ENS Rennes (France)
  • Inria contact:
    Valérie Gouranton
  • Summary:
    INTROSPECT is a multidisciplinary project funded by the French ANR and the “Fonds de Recherche Société et Culture” (FRQSC) of Quebec, Canada. This international collaboration involves researchers in computer science and archeology from France and Canada: Hybrid (Inria-IRISA), CReAAH, Inrap, the company Image ET, Université Laval and INRS-ETE. INTROSPECT aims to develop new uses and tools for archaeologists that facilitate access to knowledge through interactive numerical introspection methods combining computed tomography with 3D visualization technologies, such as virtual reality, tangible interactions and 3D printing. The scientific core of the project is the systematization of the relationship between the artefact, the archaeological context, the digital object, the virtual reconstruction of the archaeological context that represents it, and its tangible double resulting from 3D printing. This axiomatization of innovative methods makes it possible to enhance research on our heritage and to exploit accessible digital means of dissemination. This approach departs from traditional methods and applies to specific archaeological problems. Several case studies are examined in various archaeological contexts on both sides of the Atlantic. Quebec museums are also partners in the project to spread the results among the general public.

10.1.2 Associate Teams in the framework of an Inria International Lab or in the framework of an Inria International Program

  • FRANTIC:
    French-Russian Advanced and Novel TactIle Cyberworlds
  • Partner Institution(s):
    • Inria (Rainbow and Hybrid teams), France
    • Skoltech, Russia
  • Date/Duration:
    2021-2023
  • Additional info/keywords:
    Haptics, Extended Reality, Tactile, 3D Interaction

10.1.3 Informal International Partners

  • Dr. Takuji Narumi and Prof. Michitaka Hirose from University of Tokyo (Japan), on “Virtual Embodiment”
  • Dr. Gerd Bruder from University of Central Florida (USA), on “Virtual Navigation”
  • Prof. Gudrun Klinker from Technical University of Munich (Germany), on “Augmented Reality”

10.2 International research visitors

10.2.1 Visits of international scientists

Yutaro Hirao
  • Status:
    PhD
  • Institution of origin:
    University of Tokyo
  • Country:
    Japan
  • Dates:
    November 2021-June 2023
  • Context of the visit:
    PhD Thesis
  • Mobility program/type of mobility:
    Research stay
Charlotte Sabaux
  • Status:
    PhD
  • Institution of origin:
    University of Ghent
  • Country:
    Belgium
  • Dates:
    July 2021
  • Context of the visit:
    PhD Thesis
  • Mobility program/type of mobility:
    Research stay

10.3 European initiatives

10.3.1 FP7 & H2020 projects

TACTILITY

Participants: Ferran Argelaguet [contact], Anatole Lécuyer, Panagiotis Kourtesis, Sebastian Vizcay.

  • Title:
    TACTIle feedback enriched virtual interaction through virtual realITY and beyond
  • Duration:
    Jul. 2019 - Sept. 2022
  • Coordinator:
    Fundación Tecnalia Research and Innovation (Spain)
  • Partners:
    • Aalborg University (Denmark)
    • Università degli Studi di Genova (Italy)
    • Tecnalia Serbia (Serbia)
    • Universitat de Valencia (Spain)
    • Manus Machinae B.V. (Netherlands)
    • Smartex S.R.L. (Italy)
    • Immersion (France)
  • Inria contact:
    Ferran Argelaguet
  • Summary:
    TACTILITY is a multidisciplinary innovation and research action with the overall aim of bringing rich and meaningful tactile information into novel interaction systems, through technology for closed-loop tactile interaction with virtual environments. By mimicking the characteristics of natural tactile feedback, it will substantially increase the quality of immersive VR experiences, used either locally or remotely (tele-manipulation). The approach is based on transcutaneous electro-tactile stimulation delivered through electrical pulses with high-resolution spatio-temporal distribution. Achieving it requires significant development of technologies for transcutaneous stimulation, textile-based multi-pad electrodes and tactile-sensing electronic skin, coupled with ground-breaking research on the perception of elicited tactile sensations in VR. The key novelty is in the combination of: 1) ground-breaking research on the perception of electrotactile stimuli, to identify the stimulation parameters and methods that evoke natural-like tactile sensations; 2) advanced hardware, integrating the novel high-resolution electrotactile stimulation system and state-of-the-art artificial electronic skin patches with smart textile technologies and VR control devices in a wearable mobile system; and 3) novel firmware, handling real-time encoding and transmission of tactile information from virtual objects in VR, as well as from distant tactile sensors (artificial skins) placed on robotic or human hands. The proposed research and innovation action will result in a next generation of interactive systems with a higher-quality experience for both local and remote (e.g., tele-manipulation) applications. Ultimately, TACTILITY will enable high-fidelity experiences through low-cost, user-friendly, wearable and mobile technology.
H-Reality

Participants: Anatole Lécuyer [contact], Thomas Howard.

  • Title:
    Mixed Haptic Feedback for Mid-Air Interactions in Virtual and Augmented Realities
  • Duration:
    2018 - 2021
  • Coordinator:
    Univ. Birmingham (UK)
  • Partners:
    • Univ. Birmingham (UK)
    • CNRS (France),
    • TU Delft (Netherlands),
    • ACTRONIKA (France),
    • ULTRAHAPTICS (UK)
  • Inria contact:
    Anatole Lécuyer
  • Summary:
    The vision of H-REALITY is to be the first to imbue virtual objects with a physical presence, providing a revolutionary, untethered, virtual-haptic reality: H-Reality. This ambition will be achieved by integrating the commercial pioneers of ultrasonic “non-contact” haptics, state-of-the-art vibrotactile actuators, novel mathematical and tribological modelling of the skin and the mechanics of touch, and experts in the psychophysical rendering of sensation. The result will be a sensory experience where digital 3D shapes and textures are made manifest in real space via modulated, focused ultrasound, ready for the untethered hand to feel; where next-generation wearable haptic rings provide directional vibrotactile stimulation, informing users of an object's dynamics; and where computational renderings of specific materials can be distinguished via their surface properties. The implications of this technology will transform online interactions; dangerous machinery will be operated virtually from the safety of the home, and surgeons will hone their skills on thin air.

10.3.2 Other european programs/initiatives

Interreg ADAPT

Participants: Valérie Gouranton [contact], Bruno Arnaldi, Ronan Gaugne, Florian Nouviale, Alexandre Audinot.

  • Title:
    Assistive Devices for empowering disAbled People through robotic Technologies
  • Duration:
    01/2017 - 06/2021
  • Coordinator:
    ESIGELEC/IRSEEM Rouen
  • Partners:
    • INSA Rennes - IRISA, LGCGM, IETR (France), Université de Picardie Jules Verne - MIS (France), Pôle Saint Hélier (France), CHU Rouen (France), Réseau Breizh PC (France), Ergovie (France), Pôle TES (France), University College of London - Aspire CREATE (UK), University of Kent (UK), East Kent Hospitals Univ NHS Found. Trust (UK), Health and Europe Centre (UK), Plymouth Hospitals NHS Trust (UK), Canterbury Christ Church University (UK), Kent Surrey Sussex Academic Health Science Network (UK), Cornwall Mobility Center (UK).
  • Inria contact:
    Valérie Gouranton
  • Summary:
    The ADAPT project aims to develop innovative assistive technologies in order to support the autonomy and to enhance the mobility of power wheelchair users with severe physical/cognitive disabilities. In particular, the objective is to design and evaluate a power wheelchair simulator as well as to design a multi-layer driving assistance system.

10.4 National initiatives

10.4.1 ANR

ANR LOBBY-BOT

Participants: Anatole Lécuyer [contact], Victor Mercado.

  • Duration:
    2017 - 2021
  • Coordinator:
    CLARTE
  • Partners:
    Inria Rennes (Hybrid), RENAULT, and LS2N
  • Summary:
    The objective of LOBBY-BOT is to address the scientific challenges of encountered-type haptic devices (ETHDs), an alternative category of haptic devices relying on a mobile physical prop, usually actuated by a robot, that constantly follows the user's hand and encounters it only when needed. The project follows two research axes: the first deals with robot control, and the second with interaction techniques adapted to ETHDs. The involvement of Hybrid relates to the second research axis. The final project prototype will be used to assess the benefits of ETHDs in an industrial use case: the perceived quality of an automotive interior.
ANR GRASP-IT

Participants: Anatole Lécuyer [contact], Mélanie Cogné, Salomé Lefranc.

  • Duration:
    2020 - 2024
  • Coordinator:
    LORIA
  • Partners:
    Inria Rennes (Hybrid), Inria Sophia, PErSEUs, CHU Rennes, CHU Toulouse, IRR UGECAM-N, and Alchimies.
  • Summary:
    The GRASP-IT project aims to help post-stroke patients recover upper-limb control by improving their generation of kinesthetic motor imagery (KMI), using a tangible and haptic interface within a gamified Brain-Computer Interface (BCI) training environment. This innovative KMI-based BCI will integrate complementary interaction modalities, such as tangible and haptic interactions, in a 3D-printable flexible orthosis. We propose to design and test the usability (including efficacy towards the stimulation of the motor cortex) and acceptability of this multimodal BCI. The GRASP-IT project also proposes to design and integrate a gamified non-immersive virtual environment to interact with. This multimodal solution should provide a more meaningful, engaging and compelling stroke rehabilitation training program based on KMI production. In the end, the project will integrate and evaluate neurofeedback within the gamified multimodal BCI, in an ambitious clinical evaluation with 75 hemiplegic patients in 3 different rehabilitation centers in France.
PIA4 DemoES AIR

Participants: Valérie Gouranton [contact], Bruno Arnaldi, Ronan Gaugne.

  • Duration:
    08/2021 - 07/2024
  • Coordinator:
    Université de Rennes 1
  • Description:
    The project Augmenter les Interactions à Rennes (AIR) is one of the 17 laureates chosen by the French government as part of the call for expressions of interest “Digital demonstrators in higher education” (DemoES) launched by the Ministry of Higher Education, Research and Innovation.

Designed to overcome the artificial opposition between social learning and digital technology, the AIR project is structured around 3 complementary axes:

  • An augmented campus, to facilitate social interactions across all activities (training, services, exchanges and social relations) and ensure their continuum as an extension of physical campuses, implemented in partnership with Orange Labs, a member of the consortium, with the support of other EdTech players such as Appscho or Jalios.
  • Interactive pedagogies, to increase interactions in training and optimize learning through interactivity, ranging from the development of serious games to the use of immersive technologies (virtual reality, augmented reality, mixed reality), by developing functionalities resulting from research projects carried out within the Hybrid team at IRISA, by intensifying the partnership established since 2018 with Klaxoon, and by relying on Artefacto's immersive solutions.
  • An ecosystem of support for pedagogical and digital transformations, to promote the appropriation of these new large-scale devices by teachers, in particular thanks to dedicated time allocated to these transformations, and to offer teaching teams a recomposed and plural form of local assistance.
PIA4 Equipex+ Continuum

Participants: Ronan Gaugne [contact], Florian Nouviale.

  • Duration:
    06/2021 - 05/2028
  • Coordinator:
    CNRS
  • Description:
    CONTINUUM is an 8-year EquipEx+ project led by the CNRS as part of the 4th Future Investments Program (PIA4). Endowed with €13.6M, the project will create a collaborative research infrastructure of 30 platforms located throughout France, in order to advance interdisciplinary research between computer science and the human and social sciences. Through CONTINUUM, 37 research teams will develop cutting-edge research focused on visualization, immersion, interaction and collaboration, as well as on human perception, cognition and behavior in virtual and augmented reality.

CONTINUUM is organized along two axes:

  1. Interdisciplinary research on interaction, in collaboration between computer science and the human and social sciences, in order to increase knowledge and solutions in human-centered computing;
  2. Deployment of tools and services to meet the needs of many scientific fields in terms of access to big data, simulations and virtual/augmented experiences (mathematics, physics, biology, engineering, computer science, medicine, psychology, didactics, history, archeology, sociology, etc.).

By developing the instrument itself and using it in different fields of application, CONTINUUM will promote interdisciplinary research in order to better understand how we interact with the digital world, and to enable advances in other fields of science and engineering.

10.4.2 Inria projects

Inria Challenge AVATAR

Participants: Anatole Lécuyer [contact], Ferran Argelaguet [contact], Diane Dewez, Maé Mavromatis.

  • Duration:
    2018 - 2022
  • Coordinator:
    MimeTIC
  • Partners:
    Hybrid, Potioc, Loki, Graphdeco, Morpheo
  • External partners:
    Univ. Barcelona, Faurecia and InterDigital
  • Description:
    AVATAR aims at designing avatars (i.e., the user's representation in virtual environments) that are better embodied, more interactive and more social, by improving the entire avatar pipeline, from acquisition and simulation to the design of novel interaction paradigms and multi-sensory feedback.
Inria Challenge NAVISCOPE

Participants: Ferran Argelaguet [contact], Gwendal Fouché.

  • Duration:
    2019 - 2023
  • Coordinator:
    Serpico
  • Partners:
    Aviz, Beagle, Hybrid, Mosaic, Parietal, Morpheme
  • External partners:
    INRA and Institut Curie
  • Description:
    NAVISCOPE aims at improving visualization and machine learning methods in order to provide systems capable of assisting scientists in obtaining a better understanding of massive amounts of information.
Inria Covid-19 Project VERARE

Participants: Mélanie Cogné [contact], Anatole Lécuyer, Justine Saint-Aubert, Valérie Gouranton, Ferran Argelaguet, Florian Nouviale, Ronan Gaugne.

VERARE (Virtual Environments for Rehabilitation After REsuscitation) is a 2-year research project funded by Inria (specific Inria Covid-19 call for projects) to assess the efficacy of using virtual reality for motor rehabilitation (improving walking recovery) after resuscitation. This ambitious clinical project gathers the Hybrid team, federating 18 of its members, and the University Hospital of Rennes (Physical and Rehabilitation Medicine Unit and Intensive Care Units).

10.5 Regional initiatives

CHU Rennes Project HANDS

Participants: Mélanie Cogné [contact], Anatole Lécuyer, Salomé Le Franc.

HANDS (HAptic Neurofeedback Design for Stroke) is a 3-year project funded by the University Hospital of Rennes (CHU Rennes) to assess the influence of combining haptic feedback and virtual reality during neurofeedback on the improvement of upper-limb motor function after stroke. This project covers the PhD grant of Salomé Le Franc.

INCR Project ARIADE

Participants: Mélanie Cogné [contact], Guillaume Moreau, Léa Pillette, Anatole Lécuyer.

ARIADE (Augmented Reality for Improving navigation in Dementia) is a 3-year research project funded by the INCR (Institut des Neurosciences Cliniques de Rennes) and the CORECT of the University Hospital of Rennes to assess the acceptability, efficacy and tolerance of visual cues administered using augmented reality for improving the spatial navigation of participants with Alzheimer's disease who suffer from topographical disorientation. This clinical project involves Hybrid, Empenn, Ecole Centrale de Nantes, and the University Hospital of Rennes (Physical and Rehabilitation Medicine Unit and Neurological Unit).

11 Dissemination

11.1 Promoting scientific activities

Member of the organizing committees
  • Florian Nouviale, Ronan Gaugne and Valérie Gouranton: Organizing Committee of the “Journées Science et Musique 2021” (Rennes, France)
Chair of conference program committees
  • Ferran Argelaguet was Program chair for the conference ICAT-EGVE 2021
  • Guillaume Moreau was Program chair for the journal track of IEEE ISMAR 2021
Member of the conference program committees
  • Anatole Lécuyer was member of the IPC for IEEE VR Journal track 2021.
  • Ferran Argelaguet was member of the IPC for IEEE ISMAR Journal track 2021, EuroXR 2021 and ACM SAP 2021.
  • Valérie Gouranton was member of the IPC for ACM HUCAPP 2021
  • Jean-Marie Normand was member of the IPC for IEEE VR Conference track 2021, IEEE ISMAR 2021 Journal track and SIGGRAPH Asia 2021 XR.
Reviewer
  • Anatole Lécuyer was reviewer for IEEE 3DUI Contest 2021.
  • Ferran Argelaguet was reviewer for IEEE ISMAR 2021, IEEE VR 2021, ACM CHI 2021, SIGGRAPH Asia XR and ACM SUI.
  • Jean-Marie Normand was reviewer for IEEE ISMAR 2021 Conference track, IEEE VR 2021 Conference track and SIGGRAPH Asia 2021 XR.

11.1.1 Journal

Member of the editorial boards
  • Anatole Lécuyer is Associate Editor of the IEEE Transactions on Visualization and Computer Graphics, Frontiers in Virtual Reality, and Presence journals.
  • Valérie Gouranton is Review Editor of Frontiers in Virtual Reality.
  • Guillaume Moreau is Review Editor of Frontiers in Virtual Reality.
  • Jean-Marie Normand is Review Editor of Frontiers in Virtual Reality.
Reviewer - reviewing activities
  • Ferran Argelaguet was reviewer for IEEE Transactions on Visualization and Computer Graphics, IEEE Transactions on Haptics, and Computers & Graphics.
  • Guillaume Moreau was reviewer for IEEE Transactions on Visualization and Computer Graphics.
  • Jean-Marie Normand was reviewer for Computers & Graphics.

11.1.2 Invited talks

  • Anatole Lécuyer was keynote speaker for the "International Symposium on Visual Computing" 2021
  • Ferran Argelaguet was keynote speaker for the User-Embodied Interaction in VR workshop and the “Congreso Español de Informática Gráfica” (Spain).

11.1.3 Leadership within the scientific community

  • Anatole Lécuyer is Member of the Steering Committee of the IEEE VR Conference, and Member of the Scientific Board of INCR ("Institut des Neurosciences Cliniques de Rennes").
  • Ronan Gaugne is Member of the Consortium 3D of TGIR HumaNum.
  • Valérie Gouranton is Member of the Consortium 3D of TGIR HumaNum.
  • Guillaume Moreau is Member of the Steering Committee of the IEEE ISMAR Conference.

11.1.4 Scientific expertise

  • Ronan Gaugne is Member of the Expert Committee of the French cluster “Pôle Images et Réseaux”, and was an expert for the EU I4MS project Change2Twin open calls (Digital Twins Assessment and Deployment Vouchers) in May and November 2021.

11.1.5 Research administration

  • Ferran Argelaguet is a member of the scientific board of the EUR Digisport since 2021.
  • Ronan Gaugne is an elected member of the IRISA laboratory council since 2013.
  • Valérie Gouranton is Head of the cross-cutting axis “Art, Heritage & Culture” at IRISA UMR 6074, an elected member of the IRISA laboratory council, and a member of the Conseil National des Universités, 27th section (computer science).
  • Guillaume Moreau is Deputy Dean for Research and Innovation at IMT Atlantique since Dec. 2020.

11.2 Teaching - Supervision - Juries

11.2.1 Teaching

Anatole Lécuyer:

  • Master AI-ViC: “Haptic Interaction and Brain-Computer Interfaces”, 6h, M2, Ecole Polytechnique, FR
  • Master MNRV: “Haptic Interaction”, 9h, M2, ENSAM, Laval, FR
  • Master SIBM: “Haptic and Brain-Computer Interfaces”, 4.5h, M2, University of Rennes 1, FR
  • Master CN: “Haptic Interaction and Brain-Computer Interfaces”, 9h, M1 and M2, University of Rennes 2, FR
  • Master SIF: “Pseudo-Haptics and Brain-Computer Interfaces”, 6h, M2, INSA Rennes, FR

Bruno Arnaldi:

  • Master SIF: “VRI: Virtual Reality and Multi-Sensory Interaction Course”, 4h, M2, INSA Rennes, FR
  • Master INSA Rennes: “CG: Computer Graphics”, 12h, M2, INSA Rennes, FR
  • Master INSA Rennes: “Virtual Reality”, courses 6h, projects 16h, M1 and M2, INSA Rennes, FR
  • Master INSA Rennes: Projects on “Virtual Reality”, 50h, M1, INSA Rennes, FR

Ferran Argelaguet:

  • Master STS Informatique: “Techniques d'Interaction Avancées”, 26h, M2, ISTIC, University of Rennes 1, FR
  • Master SIF: “Virtual Reality and Multi-Sensory Interaction”, 8h, M2, INSA Rennes, FR
  • Master SIF: “Data Mining and Visualization”, 2h, M2, University of Rennes 1, FR
  • Master AI-ViC: “Virtual Reality and 3D Interaction”, 3h, M2, École Polytechnique, FR

Valérie Gouranton:

  • Licence: Project on “Virtual Reality”, 16h, L3, responsible for this course, INSA Rennes, FR
  • Master INSA Rennes: “Virtual Reality”, 13h, M2, INSA Rennes, FR
  • Master INSA Rennes: Projects on “Virtual Reality”, 50h, M1, INSA Rennes, FR
  • Master CN: “Virtual Reality”, 5h, M1, University of Rennes 2, FR

Ronan Gaugne:

  • INSA Rennes: Projects on “Virtual Reality”, 24h, L3, INSA Rennes, FR
  • Master Digital Creation: “Virtual Reality”, 6h, M1, University of Rennes 2, FR

Guillaume Moreau:

  • Virtual Reality Major, “C++ Programming for VR”, 30h, M1/M2, École Centrale de Nantes, FR
  • Virtual Reality Major, “Fundamentals of Virtual Reality”, 6h, M1/M2, École Centrale de Nantes, FR
  • Virtual Reality Major, “Computer Graphics”, 4h, M1/M2, École Centrale de Nantes, FR
  • Virtual Reality Major, “Advanced Software Development”, 20h, M1/M2, École Centrale de Nantes, FR
  • Computer Science Major, “Discrete Mathematics”, 10h, M1/M2, École Centrale de Nantes, FR

Jean-Marie Normand:

  • Virtual Reality Major, “Computer Graphics”, 24h, M1/M2, École Centrale de Nantes, FR
  • Virtual Reality Major, “Fundamentals of Virtual Reality”, 14h, M1/M2, École Centrale de Nantes, FR
  • Virtual Reality Major, “Computer Vision and Augmented Reality”, 26h, M1/M2, École Centrale de Nantes, FR
  • Virtual Reality Major, “Advanced Concepts in VR/AR”, 24h, M1/M2, École Centrale de Nantes, FR
  • Virtual Reality Major, “Projects on Virtual Reality”, 20h, M1/M2, École Centrale de Nantes, FR
  • Computer Science major, “Object-Oriented Programming in Java”, 30h, M1/M2, École Centrale de Nantes, FR
  • Computer Science major, “Web Programming”, 24h, M1/M2, École Centrale de Nantes, FR

11.2.2 Supervision

  • PhD: Tiffany Luong, "Affective VR: acquisition, modelling, and exploitation of affective states in virtual reality", Defended 10th February 2021, Supervised by Anatole Lécuyer, Nicolas Martin (b<>com), Ferran Argelaguet.
  • PhD: Mathis Fleury, "Neurofeedback based on fMRI and EEG", Defended 26th February 2021, Supervised by Anatole Lécuyer and Christian Barillot (Empenn, Inria)
  • PhD: Victor Rodrigo Mercado Garcia, "Encountered-type haptics", Defended 16th November 2021, Supervised by Anatole Lécuyer
  • PhD: Diane Dewez, "Avatar-Based Interaction in Virtual Reality", Defended 29th November 2021, Supervised by Anatole Lécuyer, Ferran Argelaguet and Ludovic Hoyet (MimeTIC, Inria)
  • PhD: Hugo Brument, "Towards user-adapted interaction techniques based on human locomotion laws for navigating in virtual environments", Defended 3rd December 2021, Supervised by Ferran Argelaguet, Maud Marchal (Rainbow, Inria) and Anne-Hélène Olivier (MimeTIC, Inria)
  • PhD: Guillaume Vailland, “Outdoor wheelchair assisted navigation: reality versus virtuality”, Defended 14th December 2021, Supervised by Valérie Gouranton and Marie Babel (Rainbow, Inria)
  • PhD in progress: Nicolas Olivier, “Avatar stylization”, Started in January 2018, Supervised by Franck Multon (MimeTIC team) and Ferran Argelaguet
  • PhD in progress: Gwendal Fouché, “Immersive Interaction and Visualization of Temporal 3D Data”, Started in October 2019, Supervised by Ferran Argelaguet, Charles Kervrann (Serpico Team) and Emmanuelle Faure (Mosaic Team).
  • PhD in progress: Adélaïde Genay, “Embodiment in Augmented Reality”, Started in October 2019, Supervised by Anatole Lécuyer, Martin Hachet (Potioc, Inria)
  • PhD in progress: Martin Guy, “Physiological markers for characterizing virtual embodiment”, Started in October 2019, Supervised by Guillaume Moreau, Jean-Marie Normand (ECN) and Camille Jeunet (CNRS, CLEE)
  • PhD in progress: Grégoire Richard, “Touching Avatars: The role of haptic feedback in virtual embodiment”, Started in October 2019, Supervised by Géry Casiez (Loki, Inria), Thomas Pietzrak (Loki, Inria), Anatole Lécuyer and Ferran Argelaguet
  • PhD in progress: Sebastian Vizcay, “Dexterous Interaction in Virtual Reality using High-Density Electrotactile Feedback”, Started in November 2019, Supervised by Ferran Argelaguet, Maud Marchal and Claudio Pacchierotti (Rainbow, Inria).
  • PhD in progress: Salomé Le Franc, “Haptic Neurofeedback Design for Stroke”, Started in January 2019, Supervised by Anatole Lécuyer, Isabelle Bonan (CHU Rennes) and Mélanie Cogné.
  • PhD in progress: Lysa Gramoli, “Simulation of autonomous agents in connected virtual environments”, Started in October 2020, Supervised by Valérie Gouranton, Bruno Arnaldi, Jérémy Lacoche (Orange), Anthony Foulonneau (Orange).
  • PhD in progress: Vincent Goupil, “Hospital 2.0: Generation of Virtual Reality Applications by BIM Extraction”, Started in October 2020, Supervised by Valérie Gouranton, Bruno Arnaldi, Anne-Solène Michaud (Vinci Construction).
  • PhD in progress: Gabriela Herrera, “Neurofeedback based on VR and haptics”, Started in January 2021, Supervised by Laurent Bougrain (LORIA), Stéphanie Fleck (Univ. Lorraine), and Anatole Lécuyer.
  • PhD in progress: Maé Mavromatis, “Towards « Avatar-Friendly » Characterization of VR Interaction Methods”, Started in October 2021, Supervised by Anatole Lécuyer, Ferran Argelaguet and Ludovic Hoyet (MimeTIC, Inria)
  • PhD in progress: Antonin Cheymol, “Body-based Interfaces in Mixed Reality for Urban Applications”, Started in November 2021, Supervised by Anatole Lécuyer, Ferran Argelaguet, Jean-Marie Normand (ECN) and Rebecca Fribourg (ECN).
  • PhD in progress: Yann Moullec, “Walking Sensations in VR”, Started in October 2021, Supervised by Anatole Lécuyer and Mélanie Cogné.
  • PhD in progress: Emilie Hummel, “Rehabilitation post-Cancer based on VR”, Started in October 2021, Supervised by Anatole Lécuyer, Valérie Gouranton and Mélanie Cogné.
  • PhD in progress: Antoine Cellier, “Wheelchair Simulation and Design in VR”, Started in October 2021, Supervised by Valérie Gouranton and Marie Babel (Rainbow, Inria).

11.2.3 Juries

  • Anatole Lécuyer was Rapporteur for the PhD Theses of Oscar Ariza (Univ. Hamburg) and Tania Johnston (Univ. Barcelona), Examiner for the PhD Thesis of Fabien Boucaud (UTC) and the HDR Thesis of Marc Macé (IRIT), and President for the PhD Thesis of Charles Mille (ENSAM). Anatole Lécuyer was Member of Selection Committees for a Full Professor position at Univ. Saclay and an Associate Professor position at ENSAM Laval.
  • Ferran Argelaguet was Referee for the PhD Theses of Mickaël Sereno and Yiran Zhang, and external reviewer for the PhD Thesis of Alejandros Rios.
  • Bruno Arnaldi was President for the PhD Theses of Victor Mercado (IRISA/Inria Rennes), Alexis Souchet (Univ. Paris 8) and Antonin Bernardin (IRISA/Inria Rennes), and Rapporteur for the HDR Theses of David Panzoli (IRIT) and Laure Leroy (Univ. Paris 8).
  • Valérie Gouranton was Referee for the PhD Theses of Mehdi Hafsia and Vincent Reynaert.
  • Guillaume Moreau was President and Referee for the PhD Thesis of Hugo Brument (Inria).
  • Jean-Marie Normand was Member of Selection Committee for Associate Professor at ENSAM Laval.

11.3 Popularization

11.3.1 Articles and Media appearances

  • WebTV: “Le Blob - Cité des Sciences”: Anatole Lécuyer, Valérie Gouranton, Ronan Gaugne, Florian Nouviale (June)
  • WebTV: “L'esprit sorcier”: Anatole Lécuyer (October)
  • Journal: “Epsiloon”: Guillaume Moreau, Anatole Lécuyer (November)
  • Radio: “France Culture - La méthode scientifique”: Guillaume Moreau (November)
  • Newspaper: “20 Minutes”: Guillaume Moreau, Anatole Lécuyer (November)
  • Journal: “Bikini” : Guillaume Moreau, Anatole Lécuyer (November)
  • Newspaper: “Ouest-France”: Mélanie Cogné (November)
  • Website: “Infirmiers.com”: Mélanie Cogné (November)
  • Journal: “Sciences et Avenir”: Anatole Lécuyer (December)
  • Radio: “Radio Laser”: Bruno Arnaldi (December)

11.3.2 Education

  • “L'essentiel sur la réalité virtuelle”: on-line series of articles on virtual reality on Inria website, Anatole Lécuyer, Valérie Gouranton, Ronan Gaugne (November)

11.3.3 Interventions

  • Webinar “TACTILITY: Natural Electro-tactile feedback for immersive virtual reality”: Ferran Argelaguet (February)
  • Presentation at “TedX Rennes”: Anatole Lécuyer (September)
  • Presentation and Demos “Journées Européennes du Patrimoine”: Valérie Gouranton, Ronan Gaugne, Florian Nouviale (October)
  • Presentation at “MK2 Bibliothèque Cinema theater”: Anatole Lécuyer (November)
  • Webinar from “Agence Régionale de Santé”: Mélanie Cogné (November)
  • Webinar from “Gref: GIP Relation Emploi-Formation - Training from Real to Virtual”: Bruno Arnaldi, Valérie Gouranton (December)

12 Scientific production

12.1 Major publications

  • 1 F. Argelaguet Sanz, L. Hoyet, M. Trico and A. Lécuyer. The role of interaction in virtual embodiment: Effects of the virtual hand representation. IEEE Virtual Reality, Greenville, United States, March 2016, 3-10.
  • 2 M.-S. Bracq, E. Michinov, B. Arnaldi, B. Caillaud, B. Gibaud, V. Gouranton and P. Jannin. Learning procedural skills with a virtual reality simulator: An acceptability study. Nurse Education Today, 79, August 2019, 153-160.
  • 3 X. De Tinguy, C. Pacchierotti, A. Lécuyer and M. Marchal. Capacitive Sensing for Improving Contact Rendering with Tangible Objects in VR. IEEE Transactions on Visualization and Computer Graphics, January 2021.
  • 4 M. Fleury, G. Lioi, C. Barillot and A. Lécuyer. A Survey on the Use of Haptic Feedback for Brain-Computer Interfaces and Neurofeedback. Frontiers in Neuroscience, June 2020.
  • 5 R. Fribourg, F. Argelaguet Sanz, A. Lécuyer and L. Hoyet. Avatar and Sense of Embodiment: Studying the Relative Preference Between Appearance, Control and Point of View. IEEE Transactions on Visualization and Computer Graphics, 26(5), May 2020, 2062-2072.
  • 6 R. Gaugne, F. Labaune-Jean, D. Fontaine, G. Le Cloirec and V. Gouranton. From the engraved tablet to the digital tablet, history of a fifteenth century music score. Journal on Computing and Cultural Heritage, 13(3), 2020, 1-18.
  • 7 F. Lécuyer, V. Gouranton, A. Lamercerie, A. Reuzeau, B. Arnaldi and B. Caillaud. Unveiling the implicit knowledge, one scenario at a time. The Visual Computer, 2020, 1-12.
  • 8 F. Lécuyer, V. Gouranton, A. Reuzeau, R. Gaugne and B. Arnaldi. Create by doing - Action sequencing in VR. CGI 2019 - Computer Graphics International (Advances in Computer Graphics), Calgary, Canada, Springer International Publishing, June 2019, 329-335.
  • 9 V. Mercado, M. Marchal and A. Lécuyer. ENTROPiA: Towards Infinite Surface Haptic Displays in Virtual Reality Using Encountered-Type Rotating Props. IEEE Transactions on Visualization and Computer Graphics, 27(3), March 2021, 2237-2243.
  • 10 G. Moreau, B. Arnaldi and P. Guitton. Virtual Reality, Augmented Reality: myths and realities. Computer Engineering Series, ISTE, March 2018, 322 p.
  • 11 T. Nicolas, R. Gaugne, C. Tavernier, Q. Petit, V. Gouranton and B. Arnaldi. Touching and interacting with inaccessible cultural heritage. Presence: Teleoperators and Virtual Environments, 24(3), 2015, 265-277.
  • 12 E. Peillard, Y. Itoh, J.-M. Normand, F. Argelaguet Sanz, G. Moreau and A. Lécuyer. Can Retinal Projection Displays Improve Spatial Perception in Augmented Reality? ISMAR 2020 - 19th IEEE International Symposium on Mixed and Augmented Reality, Recife, Brazil, IEEE, November 2020, 124-133.
  • 13 E. Peillard, T. Thebaud, J.-M. Normand, F. Argelaguet Sanz, G. Moreau and A. Lécuyer. Virtual Objects Look Farther on the Sides: The Anisotropy of Distance Perception in Virtual Reality. VR 2019 - 26th IEEE Conference on Virtual Reality and 3D User Interfaces, Osaka, Japan, IEEE, March 2019, 227-236.
  • 14 H. Si-Mohammed, J. Petit, C. Jeunet, F. Argelaguet Sanz, F. Spindler, A. Evain, N. Roussel, G. Casiez and A. Lécuyer. Towards BCI-based Interfaces for Augmented Reality: Feasibility, Design and Evaluation. IEEE Transactions on Visualization and Computer Graphics, 26(3), March 2020, 1608-1621.

12.2 Publications of the year

International journals

  • 15 M.-S. Bracq, E. Michinov, M. Le Duff, B. Arnaldi, V. Gouranton and P. Jannin. “Doctor, please”: Educating nurses to speak up with interactive digital simulation tablets. Clinical Simulation in Nursing, 54, May 2021, 97-104.
  • 16 M.-S. Bracq, E. Michinov, M. Le Duff, B. Arnaldi, V. Gouranton and P. Jannin. Training situational awareness for scrub nurses: Error recognition in a virtual operating room. Nurse Education in Practice, 53, May 2021, 1-10.
  • 17 H. Brument, G. Bruder, M. Marchal, A.-H. Olivier and F. Argelaguet Sanz. Understanding, Modeling and Simulating Unintended Positional Drift during Repetitive Steering Navigation Tasks in Virtual Reality. IEEE Transactions on Visualization and Computer Graphics, 27(11), November 2021, 4300-4310.
  • 18 X. De Tinguy, C. Pacchierotti, A. Lécuyer and M. Marchal. Capacitive Sensing for Improving Contact Rendering with Tangible Objects in VR. IEEE Transactions on Visualization and Computer Graphics, 27(4), April 2021, 2481-2487.
  • 19 A. Genay, A. Lécuyer and M. Hachet. Being an Avatar “for Real”: a Survey on Virtual Embodiment in Augmented Reality. IEEE Transactions on Visualization and Computer Graphics, 2021, 1-20.
  • 20 A. Genay, A. Lécuyer and M. Hachet. Virtual, Real or Mixed: How Surrounding Objects Influence the Sense of Embodiment in Optical See-Through Experiences? Frontiers in Virtual Reality, 2, June 2021, 1-15.
  • 21 P. Kourtesis, S. Collina, L. A. A. Doumas and S. E. MacPherson. An ecologically valid examination of event-based and time-based prospective memory using immersive virtual reality: the effects of delay and task type on everyday prospective memory. Memory, 29(4), April 2021, 486-506.
  • 22 P. Kourtesis, S. Collina, L. A. A. Doumas and S. E. MacPherson. Validation of the Virtual Reality Everyday Assessment Lab (VR-EAL): An immersive virtual reality neuropsychological battery with enhanced ecological validity. Journal of the International Neuropsychological Society, 27(2), February 2021, 181-196.
  • 23 P. Kourtesis and S. E. MacPherson. An ecologically valid examination of event-based and time-based prospective memory using immersive virtual reality: The influence of attention, memory, and executive function processes on real-world prospective memory. Neuropsychological Rehabilitation, December 2021, 1-26.
  • 24 P. Kourtesis and S. E. MacPherson. How immersive virtual reality methods may meet the criteria of the National Academy of Neuropsychology and American Academy of Clinical Neuropsychology: A software review of the Virtual Reality Everyday Assessment Lab (VR-EAL). Computers in Human Behavior Reports, 4, August 2021, 1-14.
  • 25 S. Le Franc, M. Fleury, C. Jeunet, S. Butet, C. Barillot, I. Bonan, M. Cogné and A. Lécuyer. Influence of the visuo-proprioceptive illusion of movement and motor imagery of the wrist on EEG cortical excitability among healthy participants. PLoS ONE, 16(9), 2021, 1-19.
  • 26 I. Le Goff, M. Texier-Le Puil, T. Nicolas, J.-B. Barreau and R. Gaugne. Perishable containers in the context of cremation: contribution of computed tomography. Bulletin de l'Association pour la Promotion des Recherches sur l'Âge du Bronze, 19, January 2021, 1-5.
  • 27 A. Lécuyer, S. Le Franc, I. Bonan, M. Fleury, S. Butet, C. Barillot and M. Cogné. Visual feedback improves movement illusions induced by tendon vibration after chronic stroke. Journal of NeuroEngineering and Rehabilitation, 18(1), December 2021, 156.
  • 28 G. Lioi, A. Veliz, J. Coloigner, Q. Duché, S. Butet, M. Fleury, E. Leveque-Le Bars, E. Bannier, A. Lécuyer, C. Barillot and I. Bonan. The impact of Neurofeedback on effective connectivity networks in chronic stroke patients: an exploratory study. Journal of Neural Engineering, 18(5), September 2021, 056052.
  • 29 T. Luong, A. Lécuyer, N. Martin and F. Argelaguet Sanz. A Survey on Affective and Cognitive VR. IEEE Transactions on Visualization and Computer Graphics, September 2021, 1-20.
  • 30 V. Mercado, M. Marchal and A. Lécuyer. ENTROPiA: Towards Infinite Surface Haptic Displays in Virtual Reality Using Encountered-Type Rotating Props. IEEE Transactions on Visualization and Computer Graphics, 27(3), March 2021, 2237-2243.
  • 31 V. R. Mercado, M. Marchal and A. Lécuyer. “Haptics On-Demand”: A Survey on Encountered-Type Haptic Displays. IEEE Transactions on Haptics, 14(3), July 2021, 449-464.
  • 32 N. Panopoulou, F. Christidi, P. Kourtesis, P. Ferentinos, P. Karampetsou, G. Tsirtsiridis, T. Theodosiou, S. Xirou, V. Zouvelou, I. Evdokimidis, M. Rentzos and I. Zalonis. The association of theory of mind with language and visuospatial abilities in amyotrophic lateral sclerosis: a pilot study. Amyotrophic Lateral Sclerosis and Frontotemporal Degeneration, December 2021, 1-8.

International peer-reviewed conferences

  • 33 H. Brument, M. Marchal, A.-H. Olivier and F. Argelaguet Sanz. Studying the Influence of Translational and Rotational Motion on the Perception of Rotation Gains in Virtual Environments. SUI 2021 - Symposium on Spatial User Interaction, Virtual Event, United States, November 2021, 1-12.
  • 34 D. Dewez, L. Hoyet, A. Lécuyer and F. Argelaguet Sanz. Towards “Avatar-Friendly” 3D Manipulation Techniques: Bridging the Gap Between Sense of Embodiment and Interaction in Virtual Reality. CHI 2021 - Conference on Human Factors in Computing Systems, Yokohama, Japan, ACM, May 2021, 1-14.
  • 35 R. Drissi, R. Gaugne, T. Nicolas and V. Gouranton. Immersive Volumetric Point Cloud Manipulation for Cultural Heritage. ICAT-EGVE 2021 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments, Sankt Augustin, Germany, September 2021, 1-5.
  • 36 R. Gaugne, T. Nicolas, J. Priser, M. Lemaire, D. Bernard, M. Leblanc, J. Pavan, T. Roy, R. Dupont, J. Ayoubi, M.-M. Le Cornec and V. Gouranton. Interaction avec la sirène de Cagniard-Latour en réalité virtuelle. Humanistica 2021 - Colloque de l'Association francophone des humanités numériques, Rennes, France, May 2021, 1-4. https://doi.org/10.5281/zenodo.4745006
  • 37 J. Hong, F. Argelaguet Sanz, A. Trubuil and T. Isenberg. Design and Evaluation of Three Selection Techniques for Tightly Packed 3D Objects in Cell Lineage Specification in Botany. GI 2021 - Graphics Interface, Mississauga, Canada, May 2021, 213-223.
  • 38 T. Howard, G. Gicquel, M. Marchal, A. Lécuyer and C. Pacchierotti. PUMAH: Pan-tilt Ultrasound Mid-Air Haptics. WHC 2021 - IEEE World Haptics Conference, Montréal / Virtual, Canada, July 2021, 1.
  • 39 V. R. Mercado, T. Howard, H. Si-Mohammed, F. Argelaguet Sanz and A. Lécuyer. Alfred: the Haptic Butler On-Demand Tangibles for Object Manipulation in Virtual Reality using an ETHD. WHC 2021 - IEEE World Haptics Conference, Montréal, Canada, IEEE, July 2021, 373-378.
  • 40 G. Vailland, L. Devigne, F. Pasteau, F. Nouviale, B. Fraudet, E. Leblong, M. Babel and V. Gouranton. VR based Power Wheelchair Simulator: Usability Evaluation through a Clinically Validated Task with Regular Users. VR 2021 - IEEE Conference on Virtual Reality and 3D User Interfaces, Lisbon, Portugal, IEEE, March 2021, 1-8.
  • 41 G. Vailland, V. Gouranton and M. Babel. Cubic Bézier Local Path Planner for Non-holonomic Feasible and Comfortable Path Generation. ICRA 2021 - IEEE International Conference on Robotics and Automation, Xi'an, China, IEEE, May 2021, 7894-7900.
  • 42 S. Vizcay, P. Kourtesis, F. Argelaguet Sanz, C. Pacchierotti and M. Marchal. Electrotactile Feedback For Enhancing Contact Information in Virtual Reality. ICAT-EGVE 2021 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments, Sankt Augustin, Germany, September 2021.

Conferences without proceedings

  • 43 L. Gramoli, J. Lacoche, A. Foulonneau, V. Gouranton and B. Arnaldi. Needs Model for an Autonomous Agent during Long-term Simulations. 4th IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR) - workshop MARCH, Taichung, Taiwan, IEEE, 2021, 1-5.

Doctoral dissertations and habilitation theses

  • 44 M. Fleury. Multimodal neurofeedback based on EEG/fMRI imaging techniques and visuo-haptic feedback for stroke rehabilitation. PhD thesis, Université Rennes 1, February 2021.

Other scientific publications

  • 45 T. Howard, X. De Tinguy, G. Gicquel, M. Marchal, A. Lécuyer and C. Pacchierotti. WeATaViX: Wearable Actuated Tangibles for Virtual Reality Experiences. WHC 2021 - IEEE World Haptics Conference, Montréal / Virtual, Canada, July 2021, 1.

12.3 Cited publications

  • 46 D. A. Bowman, E. Kruijff, J. J. LaViola and I. Poupyrev. 3D User Interfaces: Theory and Practice. Addison Wesley, 2004.
  • 47 A. Lécuyer. Simulating Haptic Feedback Using Vision: A Survey of Research and Applications of Pseudo-Haptic Feedback. Presence: Teleoperators and Virtual Environments, 18(1), 2009, 39-53.