

Highlights of the Year

  • In collaboration with several partners, PERCEPTION completed the three-year EU STREP project EARS (2014–2017). PERCEPTION contributed to audio-source localization using microphone arrays and to the disambiguation of audio information using vision, in particular to discriminate between speaking and silent persons.

    Website: https://robot-ears.eu/

  • PERCEPTION started and completed a one-year collaboration (December 2016 – November 2017) with the Samsung Electronics Digital Media and Communications R&D Center, Seoul, Korea. The topic of this collaboration, fully funded by Samsung, was multi-modal methodologies for human-robot interaction (a central topic of the team); it is part of a strategic partnership between Inria and Samsung Electronics. A follow-up collaboration is under preparation and is planned to start in February 2018.

  • As an ERC Advanced Grant holder, Radu Horaud was awarded a Proof of Concept grant for his project Vision and Hearing in Action Laboratory (VHIALab). The project will develop software packages enabling companion robots to interact robustly with multiple users.

    Website: https://team.inria.fr/perception/projects/poc-vhialab/

Awards

  • Israel Dejene Gebru (PhD student) and his co-authors, Christine Evers, Patrick Naylor (both from Imperial College London) and Radu Horaud, received the Best Paper Award at the IEEE Fifth Joint Workshop on Hands-free Speech Communication and Microphone Arrays, San Francisco, USA, 1–3 March 2017, for their paper Audio-visual Tracking by Density Approximation in a Sequential Bayesian Filtering Framework.

  • Yutong Ban (PhD student) and his co-authors, Xavier Alameda-Pineda, Fabien Badeig, and Radu Horaud, were among the five finalists for the “Novel Technology Paper Award for Amusement Culture” at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, Canada, September 2017, for their paper Tracking a Varying Number of People with a Visually-Controlled Robotic Head.

Best Paper Awards:
[41] I. Gebru, C. Evers, P. Naylor, R. Horaud. Audio-visual Tracking by Density Approximation in a Sequential Bayesian Filtering Framework, in: IEEE Workshop on Hands-free Speech Communication and Microphone Arrays, San Francisco, CA, United States, IEEE Signal Processing Society, March 2017. Best Paper Award. [DOI: 10.1109/HSCMA.2017.7895564]
https://hal.inria.fr/hal-01452167

[38] Y. Ban, X. Alameda-Pineda, F. Badeig, S. Ba, R. Horaud. Tracking a Varying Number of People with a Visually-Controlled Robotic Head, in: IEEE/RSJ International Conference on Intelligent Robots and Systems, Vancouver, Canada, September 2017.
https://hal.inria.fr/hal-01542987