Project-team Calligramme's aim is the development of tools and methods that stem from proof theory, and in particular, linear logic. Two fields of application are emphasized: in the area of computational linguistics, the modelling of the syntax and semantics of natural languages; in the area of software engineering, the study of the termination and complexity of programs.

Project-team Calligramme's research is conducted at the juncture of
mathematical logic and computer science. The scientific domains that
ground our investigations are proof theory and the

These rules have important logical weight: the weakening rule embodies the fact that some hypotheses may be dropped during a derivation; in a similar fashion the contraction rule specifies that any hypothesis can be used an unlimited number of times; as for the exchange rule it stipulates that no order of priority holds between hypotheses. Thus, the presence of the structural rules in the ordinary sequent calculus strongly conditions the properties of the logic that results. For example, in the Gentzen-style formulations of classical or intuitionistic logic, the contraction rule by itself entails the undecidability of the predicate calculus. In the same manner, the use of the weakening and contraction rules in the right half of the sequent in classical logic is responsible for the latter's non-constructive aspects.
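For reference, the three structural rules discussed here can be written in standard sequent form (left-hand versions shown; this is the usual textbook presentation, not a quotation from the report):

```latex
% The three structural rules of the ordinary sequent calculus.
\[
\frac{\Gamma \vdash \Delta}{\Gamma, A \vdash \Delta}\ \text{(weakening)}
\qquad
\frac{\Gamma, A, A \vdash \Delta}{\Gamma, A \vdash \Delta}\ \text{(contraction)}
\qquad
\frac{\Gamma, A, B, \Gamma' \vdash \Delta}{\Gamma, B, A, \Gamma' \vdash \Delta}\ \text{(exchange)}
\]
```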

According to this analysis, linear logic can be understood as a system that reconciles the constructive aspect of intuitionistic logic with the symmetry of classical logic. As in intuitionistic logic, the constructive character comes from the banning of the weakening and contraction rules in the right-hand part of the sequent. But simultaneously, in order to preserve the symmetry of the system, the same rules are also rejected in the other half.

The resulting system, called rudimentary linear logic

presents many interesting properties. It is endowed with four logical connectors (two conjunctions and two disjunctions) and the four constants that are their corresponding units. It is completely symmetrical, although constructive, and equipped with an involutive negation. As a consequence, rules similar to De Morgan's laws hold in it.
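In standard linear-logic notation (our summary, using the usual symbols; $\parr$ requires e.g. the cmll package), the connectives, their units, and the De Morgan-style dualities read:

```latex
% Multiplicative and additive connectives with their units, and the
% dualities induced by the involutive negation (standard presentation).
\[
\begin{array}{llll}
\text{mult.\ conjunction} & A \otimes B & \text{unit } 1 &
  (A \otimes B)^\perp = A^\perp \parr B^\perp\\
\text{mult.\ disjunction} & A \parr B & \text{unit } \bot &
  (A \parr B)^\perp = A^\perp \otimes B^\perp\\
\text{add.\ conjunction} & A \mathbin{\&} B & \text{unit } \top &
  (A \mathbin{\&} B)^\perp = A^\perp \oplus B^\perp\\
\text{add.\ disjunction} & A \oplus B & \text{unit } 0 &
  (A \oplus B)^\perp = A^\perp \mathbin{\&} B^\perp
\end{array}
\]
```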

In rudimentary linear logic, any hypothesis must be used once and
only once during a derivation. This property, which allows linear
logic to be considered as a resource calculus, is due, as we have
seen, to the rejection of the structural rules. But their total
absence also implies that rudimentary linear logic is a much
weaker system than intuitionistic or classical logic. Therefore,
in order to restore its strength, it is necessary to augment the
system with operators that recover the logical power of the
weakening and contraction rules. This is done via two modalities
that give tightly controlled access to the structural rules.
Thus, linear logic does not question the usefulness of the
structural rules, but instead, emphasizes their logical
importance. In fact, it refuses to treat them as epitheoretical
rules

The finer decomposition that linear logic brings to traditional
logic has another consequence: the Exchange rule, which so far has
been left as is, is now in a quite different position, being
the only one of the traditional structural rules that is left. A
natural extension of Girard's original program is to investigate
its meaning, in other words, to see what happens to the rest of
the logic when Exchange is tampered with. Two standard algebraic laws
are contained in it: commutativity and associativity. Relaxing
these rules entails looking for non-commutative, and
non-associative, variants of linear logic; there are now several
examples of these. The natural outcome of this proliferation is a
questioning of the nature of the structure that binds formulas
together in a sequent: what is the natural general replacement of
the notion of (multi) set, as applied to logic? Such questions are
important for Calligramme and are addressed, for example,
in

The activities of project-team Calligramme are organized around three research actions:

Proof nets, sequent calculus and typed $\lambda$-calculi;

Grammatical formalisms;

Implicit complexity of computations.

The first of these is essentially theoretical; the other two, which have both a theoretical and an applied character, are our privileged fields of application.

The aim of this action is the development of the theoretical tools
that we use in our other research actions. We are interested, in
particular, in the notion of formal proof itself, as much from a
syntactical point of view (sequential derivations, proof nets,

Proof nets are graphical representations (in the sense of graph
theory) of proofs in linear logic. Their role is very similar to
that of lambda terms for more traditional logics; as a matter of fact
there are several back-and-forth translations that relate several
classes of lambda terms with classes of proof nets. In addition to
their strong geometric character, another difference between proof
nets and lambda terms is that a proof net consists of an underlying
proof structure together with a correctness criterion that singles
out those structures which actually represent proofs.

The discovery of new correctness criteria remains an important research problem, as much for Girard's original linear logic as for the field of non-commutative logics. Some criteria are better adapted to some applications than others. In particular, in the case of automatic proof search, correctness criteria can be used as invariants during the inductive process of proof construction.

The theory of proof nets also presents a dynamic character: cut
elimination. This embodies a notion of normalization (or
evaluation) akin to

As we said above, until the invention of proof nets, the principal
tool for representing proofs in constructive logics was the

Although the Curry-Howard isomorphism owes its existence to the
functional character of intuitionistic logic, it can be extended
to fragments of classical logic. It turns out that some
constructions that one meets in functional programming languages,
such as control operators, can presently only be explained by the
use of deduction rules that are related to proof by
contradiction

This extension of the Curry-Howard isomorphism to classical logic, and its applications, has a perennial place as a research field in the project.

Lambek's syntactic calculus, which plays a central part in the
theory of categorial grammars, can be seen a posteriori
as a fragment of linear logic. This observation provides a
mathematical framework that enables extensions of Lambek's
original calculus, as well as extensions of categorial grammars in
general. The aim of this work is the development of a model, in
the sense of computational linguistics, which is more flexible and
efficient than the presently existing categorial models.

The relevance of linear logic for natural language processing is
due to the notion of resource sensitivity. A language (natural or
formal) can indeed be interpreted as a system of resources. For
example a sentence like The man that Mary saw Peter
slept is incorrect because it violates an underlying principle
of natural languages, according to which verbal valencies must be
realized once and only once. Categorial grammars formalize this idea
by specifying that a verb such as saw is a resource which, once
combined with a subject and an object noun phrase, will give
a sentence

where the slashes in its categorial type act as directional
implications. The Lambek calculus includes not only the elimination
rules for the slashes (directional forms of the modus ponens law,
which turn out to be Bar-Hillel's simplification schemes), but also
the introduction rules.

The Lambek calculus does have its own limitations. Among other things it cannot treat syntactical phenomena like medial extraction and crossed dependencies. Thus the question arises: how can we extend the Lambek calculus to treat these and related problems? This is where linear logic comes into play, by offering an adequate mathematical framework for attacking this question. In particular proof nets appear as the best adapted approach to syntactical structure in the categorial framework.
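As a standard textbook illustration (the type assignment below is ours, not quoted from the report), giving saw the categorial type $(np\backslash s)/np$ lets Mary saw Peter be derived by two slash eliminations:

```latex
% A standard Lambek-style derivation of "Mary saw Peter" : s,
% using the directional elimination rules /E and \E (requires amsmath).
\[
\frac{\textit{Mary} : np \qquad
      \dfrac{\textit{saw} : (np\backslash s)/np \qquad \textit{Peter} : np}
            {\textit{saw Peter} : np\backslash s}\ /E}
     {\textit{Mary saw Peter} : s}\ \backslash E
\]
```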

Proof nets offer a geometrical interpretation of proof construction.
Premises are represented by proof net fragments with inputs and outputs
which respectively model needed and offered resources. These fragments
must then be combined by pairing inputs and outputs according to their
types. This process can also be interpreted in a model-theoretical
fashion where fragments are regarded as descriptions of certain
classes of models: the intuitionistic multiplicative fragment of linear
logic can be interpreted on directed acyclic graphs, while for the
implicative fragment, trees suffice

This perspective shift from proof theory to model theory remains founded on the notion of resource sensitivity (e.g. in the form of polarities and their neutralization) but affords us the freedom to interpret these ideas in richer classes of models and leads to the formalism of Interaction Grammars. For example:

Where previously we only considered simple categories with polarities, we can now consider complex categories with polarized features.

We can also adopt more expressive tree description languages that allow us to speak about dominance and precedence relations between nodes. In this fashion we espouse and generalize the monotonic version of Tree Adjoining Grammars (TAG) as proposed by Vijay-Shanker.

Contrary to TAG, where tree fragments can only be inserted, Interaction Grammars admit models where the interpretations of description fragments may overlap.

Another grammatical framework which embraces both the notion of resource sensitivity and the interpretational perspective of model theory is dependency grammar.

Dependency grammar is predicated on the notion of an asymmetrical
relation of (syntactic or semantic) dependency. This analytical
idea has a very long history dating back at least to Panini (450
BC) and the Ancient logicians and philosophers, and made its way
into European medieval linguistics under the spreading influence
of the Arabic linguistic tradition. The modern notion of
dependency grammar is usually attributed to
Tesnière

The main formal notions of dependency grammar that reflect and embody its sensitivity to resources are subcategorization (sensitivity to syntactic resources) and valency (sensitivity to semantic resources). It should be noted that these core DG concepts of head/dependent asymmetry and subcategorization/valency have been adopted by other grammatical formalisms. In the categorial grammar tradition, these notions appear respectively as directional functional application and categorial types. In HPSG, they give rise to the notion of headed structures, head daughters, and SUBCAT lists.

Dependency grammar permits non-projective analyses, i.e. where branches may cross. For this reason, it holds a special appeal for languages with free or freer word-order than French or English, such as German, Russian, Czech... and is certainly one reason for the strong renewal of interest in DG in recent years.

Duchier

Extending this approach, Duchier and
Debusmann

The construction of software that is certified with respect to its specifications is more necessary than ever. It is crucial to ensure, while developing a certified program, the quality of the implementation in terms of efficiency and computational resources. Implicit complexity is an approach to the analysis of the resources that are used by a program. Its tools come essentially from proof theory. The aim is to compile a program while certifying its complexity.

The meta-theory of programming traditionally answers questions
with respect to a specification, like termination. These
properties all happen to be extensional, that is, described
purely in terms of the relation between the input of the program
and its output. However, other properties, like the efficiency of
a program and the resources that are used to effect a computation,
are excluded from this methodology. The reason for this is
inherent to the nature of the questions that are posed. In the
first case we are treating extensional properties, while in the
second case we are inquiring about the manner in which a
computation is effected. Thus, we are
interested in intensional properties of programs.

The complexity of a program is a measure of the resources that are necessary for its execution. The resources taken into account are usually time and space. The theory of complexity studies the problems and the functions that are computable given a certain amount of resources. One should not identify the complexity of functions with the complexity of programs, since a function can be implemented by several programs. Some are efficient, others are not.
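To illustrate the distinction (a toy example of ours, not from the report), here are two programs computing the same function, the n-th Fibonacci number, with very different complexities:

```python
# Toy illustration: one function, two programs of different complexity.
def fib_naive(n):
    # Exponential-time program: recomputes the same subproblems repeatedly.
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

def fib_fast(n):
    # Linear-time program: iterates with two accumulators.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```

Both programs implement the same (extensional) function, yet only the second is efficient; this is exactly the gap between the complexity of a function and the complexity of a program.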

One achievement of complexity theory is the ability to tell the
``programming expert'' the limits of his art, whatever the amount of
gigabytes and megaflops that are available to him. Another achievement
is the development of a mathematical model of algorithmic complexity.
But when facing these models the programming expert is often
flabbergasted. There are several reasons for this; let us illustrate
the problem with two examples. The linear acceleration theorem states
that any program which can be executed in time

The need to reason about programs is a relevant issue in the process of software development. The certification of a program is an essential property, but it is not the only one. Showing the termination of a program is of little practical value if that program has exponential complexity. Thus arises the need to construct tools for reasoning about algorithms. The theory of implicit computational complexity tackles a vast project, namely the analysis of the complexity of algorithms.

Abstract Categorial Grammars (ACGs) are a new categorial formalism based on Girard's linear logic. This formalism, which sticks to the spirit of current type-logical grammars, offers the following features:

Any ACG generates two languages, an abstract language and an object language. The abstract language may be thought of as a set of abstract grammatical structures, and the object language as the set of concrete forms generated from these abstract structures. Consequently, one has direct control over the parse structures of the grammar.

The languages generated by ACGs are sets of linear $\lambda$-terms. This may be seen as a generalization of both string languages and tree languages.

ACGs are based on a small set of mathematical primitives that combine via simple composition rules. Consequently, the ACG framework is rather flexible.
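For instance, with the standard encoding of strings as linear $\lambda$-terms (our illustration, not quoted from the report), a string is a function over a distinguished variable and concatenation is functional composition:

```latex
% Strings as linear lambda-terms: /abc/ is a function over a variable z,
% and concatenation of strings u and v is composition.
\[
/abc/ \;=\; \lambda z.\, a\,(b\,(c\,z))
\qquad
u + v \;=\; \lambda z.\, u\,(v\,z)
\]
```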

Abstract categorial grammars are not intended as yet another grammatical formalism that would compete with other established formalisms. They should rather be seen as the kernel of a grammatical framework in which other existing grammatical models may be encoded.

Interaction Grammars (IGs) are a linguistic formalism that aims at modelling both the syntax and the semantics of natural languages according to the following principles:

An IG is a monotonic system of constraints, as opposed to a derivational/transformational system, and this system is multidimensional: at the syntactic level, basic objects are tree descriptions and at the semantic level, basic objects are Directed Acyclic Graph descriptions.

The synchronization between the syntactic and the semantic levels is realized in a flexible way by a partial function that maps syntactic nodes to semantic nodes.

Much in the spirit of Categorial Grammars, the resource sensitivity of natural language is built into the formalism: syntactic composition is driven by an operation of cancellation between polarized morpho-syntactic features and, in parallel, semantic composition is driven by a similar operation of cancellation between polarized semantic features.

The formalism of IG stems from a reformulation of the proof nets of
Intuitionistic Linear Logic (which have very specific properties)
in a model-theoretical framework

Dependency grammar (DG) is a resource sensitive grammar formalism
related to categorial grammar. An important appeal of DG is that
it naturally permits non-projective analyses and is thus
especially well suited for languages with free or freer word-order
(e.g. Czech, German, Russian, etc.). However, free(r)
word-order poses an especially thorny challenge for processing:
all efficient parsing algorithms rely fundamentally on properties
of adjacency and projectivity. Duchier

Of course, not all languages have free(r) word-order, and even for
those, word-order is not really entirely free. The next challenge
for DG was to permit an account of word-order. While this had
been attempted in the past, no proposal to date was especially
appealing. Duchier and Debusmann

Every lexicalized grammar formalism (tree adjoining grammar, categorial grammar, interaction grammar, dependency grammar) is faced with the problem of organizing and structuring the lexicon. On the one hand, there is the pragmatic consideration that the lexicon should be modularly organized so as to make it easier to develop, maintain and extend. On the other hand it should also be possible to express linguistic generalizations (e.g. passivisation schemata).

Marie-Hélène Candito proposed and developed a meta-grammatical approach for lexicalized TAGs (Tree Adjoining Grammars) that generated a lot of interest. Unfortunately, it also exhibited a number of infelicitous properties, in particular difficulties in managing named entities and implicit crossings.

The theory of implicit complexity is quite new and there are still many things to do. It is thus really important to translate current theoretical tools into real applications; this should allow us to validate and guide our hypotheses. In order to do so, three directions are being explored.

First order functional programming. A first prototype, called Icar, has been developed and should be integrated into Elan (http://elan.loria.fr).

Extracting programs from proofs. Here, one should build logical theories in which programs extracted via the Curry-Howard isomorphism are efficient.

Application to mobile code systems. This work starts in collaboration with the INRIA Crystal and Mimosa project-teams.

Leopar (formerly known as Agir) is a parser for
natural languages which is based on the formalism of Interaction
Grammars (IG)

Parsing a sentence with an interaction grammar consists in first
selecting a lexical entry for each of its words, then in merging
all selected descriptions—tree descriptions à la
Vijay-Shanker—into a unique one which represents a syntactic
description of the sentence. The criterion for success is that
this ultimate description is a neutral tree description. As
IGs are based on under-specified trees, Leopar uses some
specific and non-trivial data-structures and algorithms.

The electrostatic principle has been intensively exploited in
Leopar. The theoretical problem of parsing IGs is NP-complete;
the indeterminism usually associated with NP-completeness is present
at two levels: when a description for each word is selected from
the lexicon, and when a choice of what nodes to merge is made.
Polarities have shown their efficiency in pruning the search tree
for these two steps. They are used in the first, tagging phase of
the analysis by the additional constraint of allowing only
globally neutral selections. In the second, node-merging phase,
polarities are used to cut off parsing branches whose trees
contain too many uncancelled polarities.

Leopar's first release is available on the web at

The XDG software provides parsing and grammar development facilities for eXtendable Dependency Grammar. XDG is a meta-formalism that supports the declarative specification of typed multi-dimensional lexicalized DG formalisms. XDG provides the computational linguist with a simple and convenient way to assemble his own dependency grammar formalism declaratively. As a result of such a specification, he obtains for free (1) a concrete meta-grammar formalism appropriate to his design, and (2) a constraint-based parser and workbench.

XDG was written by Denys Duchier and Ralph Debusmann. The current
version of the software is available at

The metagrammar workbench offers a metagrammar formalism, compiler, and graphical visualizer for lexicalized TAGs. It currently supports a syntactic dimension adapted to Benoît Crabbé's TAG descriptions and a semantic dimension adapted to the hole semantics used by Claire Gardent, and will be extended to support Guy Perrier's Interaction Grammars. The software was written by Denys Duchier and Yannick Parmentier.

As a first stab at building free appropriate lexical resources,
Nicolas Barth, a second year student at École des Mines, during a
training stint in Calligramme, developed an application that
computes the inflected forms of French verbs from their infinitive form
and derivation template. As of now, about 300,000 inflected forms can
be generated automatically, for 6,500 verbs. The program is
accessible at

The

In

Furthermore, Sylvain Salvati has studied the case of pattern
matching in the linear

For some time François Lamarche has been working on a unifying
framework for the proof theory of resource-sensitive logics; the
first realization in this program is the definition of the
concepts of structad and fibrational theory

But several problems he encountered made him realize that this had
to be pushed further: the ``contraction-like'' correctness
criteria discovered by Puite

He realized that not only was there the need for a very general
concept of term-rewriting system, that would include terms,
graphs, related higher dimensional objects (like derivations),
with the possibility of adding equations between these
(``rewriting modulo''), but that the many hints of an underlying
geometric nature could (and therefore had to) be made
manifest. This has led, first, to a scientific
manifesto

Since G. Perrier has shown the interest of Interaction Grammars
(IGs) in

First of all, the theoretical problem of parsing IGs is NP-complete. So, an important problem was to find techniques that prune as early as possible the branches of the search tree that are bound to fail. They developed a new parsing process called ``electrostatic parsing''.

Electrostatic parsing is based on the notion of polarity, one of
the defining features of Interaction Grammars.
In

For the first step, the tagging level, the principle is to keep only taggings which are globally neutral. The use of automata allows them to manipulate sets of taggings and enhances computational efficiency.
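The global-neutrality filter can be sketched as follows (a deliberately naive illustration with a hypothetical toy lexicon; the actual Leopar implementation works on automata over sets of taggings, not by brute-force enumeration):

```python
from itertools import product
from collections import Counter

# Hypothetical toy lexicon: each word has candidate entries, and each entry
# is a tuple of polarized features (+f offers a resource f, -f needs one).
lexicon = {
    "Mary":  [("+np",)],
    "saw":   [("-np", "-np", "+s")],        # transitive verb
    "Peter": [("+np",), ("-np", "+np")],    # second entry is a distractor
}

def polarity_sum(tagging):
    # Net polarity count per feature, dropping features that cancel out.
    total = Counter()
    for entry in tagging:
        for feat in entry:
            total[feat[1:]] += 1 if feat[0] == "+" else -1
    return {f: n for f, n in total.items() if n != 0}

def neutral_taggings(words, goal="s"):
    # Keep only selections whose polarities all cancel, except for a single
    # positive `goal` feature (the sentence being built).
    for tagging in product(*(lexicon[w] for w in words)):
        if polarity_sum(tagging) == {goal: 1}:
            yield tagging

results = list(neutral_taggings(["Mary", "saw", "Peter"]))
# Only the selection using Peter's plain "+np" entry survives the filter.
```

Even this naive version shows how the neutrality constraint prunes the tagging search space before any node merging is attempted.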

For the second step, node merging, they exploit the fact that
when reading a sentence, people use a very small amount of memory
to parse it. From the point of view of IG, this can be translated
as saying that one may force a predetermined bound on the number
of active (polarized) constituents. With Leopar, the tests done
so far show that this number is actually very small, at most 7 for
``natural'' sentences.

Initially, IGs were devoted to the syntax of natural languages but
G. Perrier extended them to semantics by adding a new level to the
syntactic one

With Topological Dependency Grammar (TDG)

Next, in order to tackle semantics, and in addition to the ID and LP trees, it became necessary to add a so-called tectogrammatical dimension to represent the predicate/argument structure. This suggested that we should embrace a more general multi-dimensional architecture where macroscopic linguistic phenomena arise from (1) simple lexical constraints simultaneously constraining all dimensions, and (2) principles stipulating how dimensions should be related. Furthermore, it should be possible to easily add new dimensions and new principles relating them.

This is what eXtendable Dependency Grammar (XDG) is about. XDG is a meta-formalism that supports the declarative specification of typed multi-dimensional lexicalized DG formalisms. XDG provides the computational linguist with a simple and convenient way to assemble his own dependency grammar formalism declaratively. As a result of such a specification, he obtains for free (1) a concrete meta-grammar formalism appropriate to his design, and (2) a constraint-based parser and workbench.

Ralph Debusmann is implementing XDG as part of his doctoral work. XDG has already been used to create a few DG instances: first, TDG was recreated, but this time we were able to take advantage of the meta-grammatical facilities provided by XDG to simplify and linguistically improve the lexicon. Second, we created STDG (Semantic Topological Dependency Grammar) which extends TDG with a syntax/semantics interface using, as additional dimensions, a tectogrammatical DAG for the predicate/argument structure, and a derivation tree for assembling the semantic logical form.

Kruijff and Duchier

Meanwhile, XDG has become one of the components in a new project dedicated to the development of a preference architecture for processing. The goal is to make it possible to combine multiple sources of information to guide processing. There are several research groups involved in this effort. One (CHORUS) is more particularly interested in semantic preferences. Another (NEGRA) is more concerned with wide-coverage grammars automatically induced from corpora. For the latter, the integration of multiple sources of information is especially critical. XDG's constraint-based approach makes it particularly well-suited for integrating fragmentary bits of information and getting the most out of them by propagation.

The study and development of ACGs currently has two main topics:
their expressive power and the computational complexity of parsing.
Philippe de Groote and Sylvain Pogodalla proved that ACGs can
encode

The result obtained in

For learning regular tree languages as a model of natural language
acquisition, Jérôme Besombes and Jean-Yves Marion studied a
paradigm of exact learning from positive data and interactions
with an Oracle. The Oracle is a person in contact with the child,
who knows the language (a parent) and is able to answer
membership queries about the target language submitted by the
learner. A polynomial algorithm based on this paradigm has been
constructed (

Jérôme Besombes and Jean-Yves Marion defined a class of categorial
grammars in the spirit of Angluin

Denys Duchier organized a workgroup on the topic of metagrammars
involving both Calligramme and Langue et Dialogue:

This resulted in a collaboration between the two projects to
develop a new meta-grammar approach: Benoît Crabbé presented a new
way to decompose syntactic paraphrases into tree description
fragments decorated with resource annotations, and Claire Gardent
described the requirements of the syntax/semantics interface for
her application to hole semantics. Denys Duchier designed a new
(multi-dimensional) formalism combining both inheritance and
disjunction and where paraphrases are assembled by a
constraint-based solver inspired by the dominance constraint
solver of Duchier and Gardent

We are now working on extending this approach for Guy Perrier's Interaction Grammars. The goal is to arrive at a meta-formalism which can be suitably instantiated for various grammar formalisms, in particular for TAGs and Interaction Grammars.

Dominance constraints are logical descriptions of trees.
Satisfiability of dominance constraints was proved NP-complete, but
Mehlhorn et al. [2001] distinguished the subclass of ``normal
dominance constraints'' and presented a polynomial-time algorithm
for them; this has now been extended to the larger class of
weakly normal dominance constraints

This new result has considerable practical import because dominance
constraints are widely used in computational linguistics, e.g. in
underspecified semantics

Jean-Yves Marion and Jean-Yves Moyen have used Petri nets in order to analyse the termination of assembler-like programs. They translate the program into a Petri net and show that the termination of the Petri net, which can be decided via linear programming techniques, implies the program's termination. This method allows one to prove, in a fully automated way, the termination of some not-so-trivial programs such as Euclid's algorithm for computing the greatest common divisor or Quicksort (and thus, potentially, all algorithms of the divide-and-conquer variety).
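The linear-programming flavour of the termination test can be sketched as follows (our toy illustration with hypothetical transition effects, brute-forcing small integer weights instead of calling an LP solver): we look for strictly positive place weights that make every transition strictly decrease the weighted token count.

```python
from itertools import product

# Hypothetical Petri net: each transition is its effect vector on the
# places (x, y).  t1 models "x -= 1; y += 2", t2 models "y -= 1".
transitions = [(-1, 2), (0, -1)]

def find_ranking_weights(transitions, bound=10):
    # Search for strictly positive place weights w such that every
    # transition strictly decreases the weighted token count; a real
    # implementation would pose this as a linear program.
    n = len(transitions[0])
    for w in product(range(1, bound + 1), repeat=n):
        if all(sum(wi * ei for wi, ei in zip(w, e)) < 0 for e in transitions):
            return w
    return None  # no witness found within the bound

w = find_ranking_weights(transitions)
# With w = (3, 1): t1 changes the weighted count by -3 + 2 = -1 and t2
# by -1, so every firing strictly decreases it.
```

Since the weighted token count is a natural number that strictly decreases at every firing, the net cannot fire forever, and the source program terminates.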

Moreover, this method can also prove the non-size-increasingness
property of some algorithms, that is, show that these programs will
not use more memory than initially allocated and could be written
in C without using the malloc instruction.

This method of analysis seems to have some other good properties, such as offering the possibility of doing some basic program transformations (e.g., Deforestation) at the same time as proving the termination of both the initial and the transformed programs. It is still under study and the main research direction is a way to add a few features to the fundamental language, in particular a real stack to store intermediate values.

Calligramme is part of the ``Ingénierie des langues, du document, de l'information scientifique et culturelle'' theme of the ``contrat de plan État-Région''. Calligramme's contributions range over the syntactic and semantic analysis of natural language, the building of wide coverage lexical resources, and the development of software specialized for those tasks;

Calligramme, through Denys Duchier in particular, is a partner in the series of joint scientific meetings organized through the Lorraine-Saarland partnership;

Calligramme is part of the ``Qualité et sûreté des Logiciels'' (QSL) theme of the ``contrat de plan État-Région''. As a matter of fact, the head of the whole QSL theme is Jean-Yves Marion;

Jean-Yves Marion is head of the ``Synthèse de code pour robots mobiles'' (Sycomor) project, granted by the scientific council of the INPL;

Calligramme is involved in the ACI Demonat, in section
``Nouvelles interfaces des mathématiques'', together with the
``Logique'' group of ``Université de Savoie'' and the ``TALaNa''
team of ``Université Paris 7''. The project concerns the parsing
and the checking of mathematical proofs written in natural
language.

Web page at

Calligramme is involved in the ACI CRISS, in section
``Sécurité informatique''. Its purpose, which can be read from the
full title, is ``Contrôle de ressources et d'interfaces pour les
systèmes synchrones''. It is headed by Roberto Amadio at the
University of Marseilles, and the co-ordinator on Calligramme's
side is Jean-Yves Marion.

Web page at

This ``nouvelles interfaces des mathématiques'' ACI brings together
several research teams in both mathematics and computer science
and is concerned, as its name implies, with the application to
computer science of techniques developed for modern geometry. It
is headed by Thomas Ehrhard at the CNRS in Marseilles.

Web page at

This is a smaller CNRS action whose full title is
``Topologie algébrique pour l'étude des structures de calcul et
notamment de la concurrence''. It is headed by Éric Goubault at
the Commissariat à l'Énergie Atomique, and many of its members
are also part of the previous action.

Web page at

Calligramme, through Jean-Yves Marion, is a participant in the Ministry of Industry RNTL project Averroes.

Calligramme is one half of a joint France-Germany (Procope) cooperation entitled ``Structures and Deductions'' with the Technische Universität Dresden, Germany.

Calligramme is involved in the IST-2001-38957 European project, in the FET-Open program, entitled ``Applied Semantics II''.

Calligramme is involved in the european network CoLogNET (Computational Logic Network) on the themes: logic methodology and foundational tools, logic and natural language processing.

Simonetta Ronchi della Rocca of the university of Turin spent a week in April, being the guest of both the Calligramme and Miró teams.

Lutz Straßburger, PhD student, University of Dresden, visited the group for one week in January and gave a talk on ``Linear Logic and Noncommutativity in the Calculus of Structures'';

Alessio Guglielmi, Charles Stewart and Phiniki Stouppa, University of Dresden, visited the group for one week in October and gave talks on ``A New Proof Theory with Deep Inference and Symmetry'', ``Partial Sharing Diagrams'', and ``Modal Logics in the Calculus of Structures'', respectively.

Glyn Morrill, of the Universitat Politècnica de Catalunya, made a short visit on December 12.

Guillaume Bonfante has been a member of the hiring committee, section 27, of the INPL since April 2003;

Guillaume Bonfante was elected to the scientific council of the INPL in July 2003;

Adam Cichon is a substitute member of the hiring committee, section 27, of the UHO–Nancy 2;

Adam Cichon was elected a member of the ``Conseil National des Universités'' (CNU), section 27;

Jean-Yves Marion is a member of the steering committee of the International Workshop on Implicit Computational Complexity (ICC).

Philippe de Groote is:

elected permanent member of INRIA's evaluation board,

president of the INRIA-Lorraine interview committee for Junior Researchers (2nd class),

member of the INRIA admission jury for Junior Researchers (2nd class) and Senior Researchers (2nd class),

vice president of the LORIA Projects Committee,

member of the LORIA management board,

member of the scientific orientation council of the LORIA,

named member of the LORIA laboratory council,

permanent member of the hiring committee of the INPL (section 27),

member of the program committee of ESSLLI-04,

member of the program committee of the 8th Conference on Formal Grammar (Vienna, Austria, August 16–17, 2003),

member of the editorial board of Papers in Formal Linguistics and Logic, Bulzoni, Roma,

member of the editorial board of Cahiers du Centre de Logique, Academia-Bruylant, Louvain-la-Neuve.

Since 2002 François Lamarche has been organizing the weekly Calligramme seminar. A particular event in 2003 was the organization of a ``structures week'', between the 6th and 10th of October.

Web page: http://www.loria.fr/~lamarche/seminars.html

François Lamarche is a member of the board of the Département de Formation Doctorale of the Computer Science section of the Doctoral School.

François Lamarche heads the Grants and Fellowships section of the Projects Committee of the LORIA.

Jean-Yves Marion has been a member of the hiring committee of the École des Mines (Professors and Lecturers), section 27, since February 2002.

Jean-Yves Marion was elected to the scientific council of INPL in July 2003.

Jean-Yves Marion organized the GEM-Stic Seminar on the safety and quality of software systems on January 23, 2003, at the École des Mines de Nancy.

Web page: http://www.loria.fr/~marionjy/GemStic230103/gemstic230103.html

Jean-Yves Marion initiated and organized the monthly ``Journées QSL'' (http://qsl.loria.fr).

Guy Perrier organized the International Workshop on Parsing Technologies (IWPT 2003), which took place in Nancy from April 22 to 24. This event, held every two years, is the main international workshop on the parsing of natural languages, and this was the first time it was held in France.

Guy Perrier was a member of the Program Committee of the Lorraine-Saarland Workshop on Prospects and Advances in the Syntax/Semantics Interface, which was held in Nancy on the 20th and 21st of October.

Denys Duchier organized Prospects and Advances in the Syntax/Semantics Interface (October 20–21, 2003, Nancy, http://www.loria.fr/~duchier/Lorraine-Saarland/). This was the 6th workshop in the Lorraine-Saarland Workshop Series, a joint initiative of LORIA (Nancy) and MPI (Saarbrücken) whose primary purpose is to foster interaction and collaborative research between Lorraine and Saarland. This two-day event attracted participants from France, Germany, and the Netherlands.

Denys Duchier presented the metagrammar workbench prototype at the Lorraine-Saarland Workshop on Prospects and Advances in the Syntax/Semantics Interface, October 20–21, 2003, Nancy.

Denys Duchier and Ralph Debusmann presented the XDG workbench at the Lorraine-Saarland Workshop on Prospects and Advances in the Syntax/Semantics Interface, October 20–21, 2003, Nancy.

Adam Cichon, with Claude Kirchner, is in charge of the DEA course on ``logique et démonstration automatique'';

Adam Cichon was in charge of the DESS ``Compétences Complémentaires'' course on logic and artificial intelligence;

Adam Cichon heads the ``licence informatique'';

Guy Perrier is in charge of the course on algorithmics for the parsing of natural languages, which he teaches with Bertrand Gaiffe in the computer science DEA of Nancy.

Jean-Yves Marion heads the option ``Maîtrise d'oeuvres et Maîtrise d'ouvrages (MOSI)'' of the computer science department of the ENSMN.

Jean-Yves Marion is also in charge of the DEA course on complexity.

Denys Duchier was invited by Robin Cooper and Torbjörn Lager to give a series of lectures at the Graduate School of Language Technology (GSLT) in Göteborg, Sweden (March 2003). The topics covered were data management and concurrency in Oz, followed by constraint programming and its application to natural language processing.

Guillaume Bonfante and Sylvain Pogodalla advised a second-year student at the École des Mines (Nicolas Barth) during a two-month internship devoted to building inflected forms for 6,500 French verbs.

Philippe de Groote is supervising the thesis work of Sylvain Salvati.

Jean-Yves Marion and Guy Perrier are co-supervising the thesis work of J. Leroux.

Jean-Yves Marion and Olivier Bournez (Project-team Protheo) are co-supervising the thesis work of Paulin Jacobé du Naurois and Emmanuel Hainry.

Jean-Yves Marion supervised the thesis work of Jérôme Besombes and Jean-Yves Moyen.

Denys Duchier is supervising Ralph Debusmann's Ph.D. thesis (Saarbrücken). Ralph is developing the new meta-formalism called eXtensible Dependency Grammar (XDG), with special attention to the syntax/semantics interface.

We should note that three members of Calligramme saw jury duties from the other end: Jérôme Besombes defended his thesis on November 14 (with J.-Y. Marion and A. Cichon in the jury), Guy Perrier defended his Habilitation on November 12, and Jean-Yves Moyen defended his thesis on December 17 (with J.-Y. Marion in the jury).

Jean-Yves Marion was an external referee for the thesis of Y. Lenir (Université de Rennes, December 15).

François Lamarche was an external referee for Lutz Straßburger's thesis (TU Dresden, July 24).

François Lamarche was an internal referee and jury member for Fairouz Chakkour's thesis (Université Henri Poincaré, December 8).

Jérôme Besombes defended his thesis on November 14, 2003 (jury: A. Cichon, F. Denis, R. Gilleron, C. de la Higuera, M. Margenstern, J.-Y. Marion).

Jean-Yves Moyen defended his thesis on December 17, 2003 (jury: R. Amadio, B. Gaujal, N. Jones, C. Kirchner, J.-Y. Marion, J. Souquières).

Guy Perrier defended his ``Habilitation'' thesis on November 12, 2003 (jury: A. Abeillé, P. Blache, P. Blackburn, A. Dikovsky, G. Huet, J. Souquières, M. Steedman).

Seven members of Calligramme (J. Besombes, Ph. de Groote, F. Lamarche, J.-Y. Marion, G. Perrier, S. Pogodalla, S. Salvati) attended the GRACQ workshop at the LaBRI, Bordeaux, March 17–18, 2003.

Bruno Guillaume attended the CSL03 conference (Vienna, Austria, August 25–30), where he presented the paper ;

Bruno Guillaume demonstrated the Leopar prototype at the Lorraine-Saarland Workshop on Prospects and Advances in the Syntax/Semantics Interface;

Bruno Guillaume and Philippe de Groote visited the ``Logique'' group of the Université de Savoie in Chambéry (February 19 and 20);

Denys Duchier and Guy Perrier attended the Metagrammar meeting organized by Éric de la Clergerie (Université Paris VII, September 18).

Philippe de Groote and Sylvain Salvati attended the 14th International Conference on Rewriting Techniques and Applications (RTA'03), Valencia, Spain, June 9–11, 2003, where they presented .

François Lamarche attended the following meetings and seminars, and gave a talk at each of them:

The 78th edition of the Peripatetic Seminar on Sheaves and Logic (Strasbourg, February 15–16).

The Séminaire CATIA (Université Montpellier II, April 2–5).

The ``Journées Squier et tout ça'' (Université Paris VII, May 14–16).

The Fields Institute Workshop on Mathematical Linguistics (University of Ottawa, June 18–19).

The 2003 European Meeting on Category Theory (Haute-Bodeux, Belgium, September 8–12).

The ``Topologie Algébrique'' meeting (Paris, November 26–27).

In addition to the papers , Jean-Yves Marion gave the following invited talks:

Workshop ``AS Mobilités'' (March 10), title: Extracting Feasible Programs.

Workshop on Linear Logic (Verona, December 17–18), title: Extracting Feasible Programs.

Jean-Yves Moyen gave a talk at the internal LACL (Laboratoire d'Algorithmique, Complexité et Logique) seminar in Créteil on November 24 (http://www.univ-paris12.fr/lacl/).

Lutz Straßburger visited the Proof Theory Group in Dresden for two weeks in September in order to work with Alessio Guglielmi, Paola Bruscoli, and Ozan Kahramanogullari;

François Lamarche, Sylvain Pogodalla and Lutz Straßburger attended the ``Structures Workshop'' at the University of Dresden, November 20–21;

Guy Perrier gave a talk at the Lorraine-Saarland Workshop on Prospects and Advances in the Syntax/Semantics Interface, which was held in Nancy on the 20th and 21st of October 2003;

Most of the members of the team attended the International Workshop on Parsing Technologies (IWPT03, http://iwpt03.loria.fr/user/www/Nancy_en.html), which was held in Nancy.

Sylvain Pogodalla attended the North American Summer School in Logic, Language and Information (Bloomington, Indiana, June 17–21);

Sylvain Pogodalla attended the Mathematics of Language conference (Bloomington, Indiana, June 19–22) and presented ;

Denys Duchier gave the following invited talk at the GSLT conference in Lökeberg, Sweden: The Conspiracy Theory of Language.