AMI is concerned with new multimodal technologies to support human
interaction, in the context of instrumented meeting rooms and remote
meeting assistants. The project aims to enhance the value of multimodal
meeting recordings and to make human interaction more effective in real
time. These goals are being pursued by developing new tools for
computer-supported cooperative work and by designing new ways to search
and browse meetings as part of integrated multimodal group
communication captured from a wide range of devices.
This Integrated Project addresses a wide range of
critical multi-disciplinary activities and applications, including:
multimodal input interfaces (primarily speech and visual input);
integration and coordination of modalities, e.g.
(asynchronous) multi-channel processing; meeting dynamics and
human-human interaction modelling; content abstraction, including
multimodal information indexing, summarising, and retrieval; technology
transfer; and training activities, including an international exchange
programme.
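To make the content-abstraction activities more concrete, the following is a minimal, hypothetical sketch of time-aligned multimodal indexing and retrieval of meeting segments. None of the names (Segment, MeetingIndex) come from the AMI project; this is only an illustration of the general idea of indexing keyword-annotated segments from several modalities and retrieving them by query.

```python
from dataclasses import dataclass, field

@dataclass
class Segment:
    # Hypothetical segment of a recorded meeting, time-aligned to the recording.
    start: float                # seconds from meeting start
    end: float
    modality: str               # e.g. "speech", "slides", "whiteboard"
    keywords: set = field(default_factory=set)

class MeetingIndex:
    """Toy inverted index over time-aligned multimodal meeting segments."""

    def __init__(self):
        self.segments = []      # all segments, in insertion order
        self.inverted = {}      # keyword -> list of segment ids

    def add(self, segment):
        sid = len(self.segments)
        self.segments.append(segment)
        for kw in segment.keywords:
            self.inverted.setdefault(kw, []).append(sid)

    def search(self, keyword):
        """Return (start, end, modality) for segments mentioning keyword."""
        return [(self.segments[i].start, self.segments[i].end,
                 self.segments[i].modality)
                for i in self.inverted.get(keyword, [])]

# Example: index two overlapping segments from different modalities,
# then retrieve all time ranges where "budget" occurs.
idx = MeetingIndex()
idx.add(Segment(0.0, 30.5, "speech", {"budget", "deadline"}))
idx.add(Segment(12.0, 45.0, "slides", {"budget"}))
print(idx.search("budget"))
```

In a real system the keywords would come from automatic speech recognition and visual analysis rather than manual annotation, and the index would support ranked rather than exact-match retrieval; the sketch only shows the time-aligned, multi-modality shape of the data.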
Relevance to HYDRA:
Modelling approaches addressed by the AMI project can provide useful
knowledge for the design of multimodal interfaces in the HYDRA project,
mainly in the areas of multi-channel signal processing, multimodal
information modelling, information retrieval, and content abstraction.