Section: New Results

Web Reactive Programming

During the year, we continued our efforts in designing and implementing the HipHop.js programming language, applied it to interactive music composition, and studied the security of reactive systems.

HipHop.js

Web applications react to many sorts of events. Whether they are GUI, multimedia, or network events in client code, or IO and system events on the server, they are all triggered asynchronously. JavaScript, the hegemonic programming language of the Web, handles them with low-level constructs based on listeners, a synonym for callbacks. To improve on the so-called callback hell, recent versions of the language have introduced new constructs that raise the level of programming abstraction (promises and async/await). They enable a programming style, closer to traditional sequential programming, that helps with developing and maintaining applications. However, the improvements they bring rely exclusively on syntactic extensions; they do not change the programming model. For that reason, complex orchestration problems that involve all sorts of synchronization, preemption, and parallelism remain as hard to program as before. We think that orchestration should be reconsidered more globally and from the ground up. The solution we propose consists in embedding a DSL specialized in orchestration inside the traditional Web development environment, in our case Hop.js, the Web programming language that the team develops.

The orchestration DSL we propose is called HipHop.js. It is a reactive synchronous language. More precisely, it is an adaptation of the Esterel programming language to the Web. The motivations for choosing Esterel are diverse. First, and most importantly, Esterel is powerful enough to handle all the orchestration patterns we are considering. Second, the team, via its partnership with the Collège de France, has high expertise in the design and development of Esterel-like languages, which constitutes a highly valuable asset for our development.

Esterel is powerful enough to handle all the orchestration patterns we consider, but it was designed and developed in a context bearing no resemblance to the Web. Esterel assumed static execution models, while the Web assumes permanent evolution and modification of running programs. Esterel targeted sequential imperative languages for its embedding, while the Web relies on dynamic functional languages (i.e., JavaScript). Esterel assumed static execution contexts where a priori validity proofs were enforced beforehand, while the Web assumes highly dynamic runtime executions in which only dynamic verifications are feasible. For all these reasons, adapting Esterel into HipHop.js has required a deep revamping and a deep paradigm shift.
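To give a flavor of the resulting language, here is the classical Esterel "ABRO" behavior written in a HipHop.js-like notation. This is illustrative pseudocode loosely adapted from published HipHop.js examples; the concrete syntax may differ between versions of the language:

```
// Emit O once both A and B have occurred (in any order);
// an occurrence of R resets the whole behavior.
hiphop module ABRO(in A, in B, in R, out O) {
  do {
    fork {
      await(A.now);
    } par {
      await(B.now);
    }
    emit O();
  } every(R.now)
}
```

Parallelism (fork/par), synchronization (await), and preemption (every) are first-class constructs here, whereas equivalent JavaScript code would have to thread this logic through callbacks and shared state.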

During 2018, we finalized and completed the design of the language, which is now almost stabilized. It follows the previous version developed in C. Vidal's PhD studies [13], [18]. Version 0.3.x has been made available at the URL http://hop-dev.inria.fr. It has been used to implement our first orchestration-demanding applications, in particular an interactive music composition application. Our next steps will consist in completing the design and implementation of the language and of a minimal development environment, without which only experts can use the system. We of course also need to publicize the system and describe its design and internals in academic publications.

Interactive music composition: the Skini platform

In the sixties, the philosopher Umberto Eco and musicians such as K. Stockhausen, K. Penderecki, and L. Berio questioned the relationship between composers, musicians, and the way we perceive music. Eco used the term "Open Work" and showed that the vision of the world had evolved from a static one to a more blurred perception. Following this new perception, and in a shift comparable to the evolution of physics from Copernicus to Einstein, some contemporary artists tried to express this complexity through works in which the performer and the audience have a concrete impact on the work. Since the sixties, audience participation in collaborative music production has become a more and more active field. Thanks to the large device market and to Web-based technologies such as the Web Audio API, "Open Work" has acquired a broader meaning, with systems allowing individual interaction. Nevertheless, it is still difficult to find systems offering frameworks dedicated to composing interactive performances with a clear composition scheme and ease of use. This is our motivation for developing Skini, a framework designed for composing, simulating, and executing interactive performances. Skini is based on elementary music patterns whose activation is automatically controlled thanks to Hop and HipHop. Skini was first used for a concert that took place at the very end of 2017 at the contemporary music festival of Nice (MANCA), followed in 2018 by performances during the "Portes ouvertes" of Inria, the "Fête de la Science", and the Synchron conference.
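The pattern-activation principle can be sketched as follows. This is a rough illustration, not Skini's actual code; the class and the `votePattern` and `tick` names are hypothetical:

```javascript
// Hypothetical sketch of audience-driven pattern activation:
// the audience enqueues elementary music patterns, and a
// sequencer activates them one per clock step.
class PatternSequencer {
  constructor() {
    this.queue = [];   // patterns activated by the audience
    this.played = [];  // patterns already performed
  }
  // An audience member activates (votes for) a pattern.
  votePattern(name) {
    this.queue.push(name);
  }
  // Called on each step of the (e.g. MIDI-driven) clock:
  // returns the next pattern to play, or null when idle.
  tick() {
    const next = this.queue.shift() ?? null;
    if (next !== null) this.played.push(next);
    return next;
  }
}

const seq = new PatternSequencer();
seq.votePattern("bass-1");
seq.votePattern("drums-3");
console.log(seq.tick()); // "bass-1"
console.log(seq.tick()); // "drums-3"
console.log(seq.tick()); // null
```

In the real system, the orchestration of which patterns are offered and when they may be activated is what Hop and HipHop control reactively.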

In 2018, Skini's user interface was revamped. We tried several interfaces for pattern activation and settled on a simple one, to make the interface more intuitive and fluid. We added an important feature, the "distributed sequencer", which allows the audience not only to activate patterns but also to create them. We added a new level of interaction, the scrutator, which allows global actions by the audience on the orchestration. The complete system is now synchronized with an external MIDI clock. We developed a first version of standalone MIDI control of the patterns. The system has followed the evolution of the HipHop syntax and now implements the latest version for the control of the orchestration. We also improved the synchronization system and the processes implementing the orchestration.