Section: Overall Objectives
AVIZ's research on Visual Analytics is organized around five main Research Themes:
- Methods to visualize and smoothly navigate through large data sets:
Large data sets challenge current visualization and analysis methods. Understanding the structure of a graph with one million vertices is not just a matter of displaying the vertices on a screen and connecting them with lines: a typical screen has only a few million pixels. Understanding a large graph requires both data reduction, to visualize the whole, and navigation techniques coupled with suitable representations, to see the details. These representations, aggregation functions, and navigation and interaction techniques must be designed as a coordinated whole to be effective and to fit the user's mental map.
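As a minimal sketch of this kind of data reduction, the following collapses a node-level graph into a cluster-level overview that fits on screen. The cluster assignment is assumed to come from elsewhere (e.g. a community-detection algorithm); the function name and toy labels are hypothetical, not AVIZ's actual implementation.

```python
from collections import defaultdict

def aggregate_graph(edges, cluster_of):
    """Collapse a node-level graph into a cluster-level graph.

    edges: iterable of (u, v) node pairs
    cluster_of: dict mapping each node to a cluster id
    Returns a dict mapping (cluster_a, cluster_b) -> edge count,
    with the pair sorted so intra-cluster edges appear as (c, c).
    """
    agg = defaultdict(int)
    for u, v in edges:
        a, b = sorted((cluster_of[u], cluster_of[v]))
        agg[(a, b)] += 1
    return dict(agg)

# Toy example: 6 nodes grouped into 2 clusters
edges = [(0, 1), (1, 2), (3, 4), (4, 5), (2, 3)]
cluster_of = {0: "A", 1: "A", 2: "A", 3: "B", 4: "B", 5: "B"}
print(aggregate_graph(edges, cluster_of))
# {('A', 'A'): 2, ('B', 'B'): 2, ('A', 'B'): 1}
```

The edge counts between clusters can then drive the thickness of aggregate links in the overview, while the original edges are only fetched when the user drills into a cluster.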
- Efficient analysis methods to reduce huge data sets to visualizable size:
Designing analysis components with interaction in mind has strong implications for both the algorithms and the processes they use. Some data reduction algorithms lend themselves to a progressive scheme: sample the data, extrapolate a first result, assess its quality, and incrementally refine the computation. Examples include linear reductions such as PCA, Factorial Analysis, and SVD, as well as general MDS and Self-Organizing Maps. AVIZ investigates which analysis processes are possible for each type of analyzed data.
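The sample-then-refine principle can be sketched on the simplest possible statistic, a mean: the data is shuffled and processed in chunks, and a refined estimate is yielded after each chunk so a visualization can show a first answer immediately and update it as the computation proceeds. This is an illustrative sketch of progressive computation, not one of AVIZ's algorithms; the function name and chunk size are made up.

```python
import random

def progressive_mean(data, chunk_size=1000):
    """Yield successively refined estimates of the mean.

    Shuffling first makes each prefix an unbiased random sample,
    so every partial estimate is a usable approximation.
    """
    items = list(data)
    random.shuffle(items)              # sample without replacement
    total, count = 0.0, 0
    for i in range(0, len(items), chunk_size):
        chunk = items[i:i + chunk_size]
        total += sum(chunk)
        count += len(chunk)
        yield total / count            # estimate after this chunk

estimates = list(progressive_mean(range(10_000), chunk_size=2500))
print(estimates[-1])  # exact mean once all chunks are processed: 4999.5
```

The same structure, with a quality measure attached to each intermediate result, applies to the linear reductions named above whenever they can operate on a growing sample.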
- Visualization interaction using novel capabilities and modalities:
The importance of interaction to information visualization, and in particular the interplay between interactivity and cognition, is widely recognized. However, information visualization has yet to take full advantage of new interaction technologies: it largely still relies on the traditional desktop, mouse, and keyboard setup of WIMP (Windows, Icons, Menus, and a Pointer) interfaces. At AVIZ we investigate in particular tangible and touch-based interfaces to data.
- Evaluation methods to assess the effectiveness and usability of visual analytics solutions:
Appropriate evaluation of visual analytics solutions is not trivial, for several reasons. First, visual analytics tools are often designed to apply across disciplines, data sources, and data characteristics, and this variety makes general statements hard. Second, the specificities of the humans involved, their work environment, and their data analysis tasks form a multi-faceted evaluation context that is difficult to control and to generalize. Recommendations for visual analytics solutions are therefore never absolute, but depend on their context.
In our work we systematically connect evaluation approaches to visual analytics research: we strive to develop and use both novel and established mixed-methods evaluation approaches to derive recommendations on the use of visual analytics tools and techniques. AVIZ regularly publishes user studies of visual analytics and interaction techniques and takes part in dedicated workshops on evaluation.
- Engineering tools:
Currently, databases, data analysis, and visualization all rely on the concept of data tables made of tuples and linked by relations. However, databases are storage-oriented and do not describe data types precisely. Analytical systems describe data types precisely, but their storage and computation models are not suited to interactive visualization. Visualization systems use in-memory data tables tailored for fast display and filtering, but their interaction with external analysis programs and databases is often slow.
AVIZ seeks to merge three fields: databases, data analysis and visualization. Part of this merging involves using common abstractions and interoperable components. This is a long-term challenge, but it is a necessity because generic, loosely-coupled combinations will not achieve interactive performance.
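The in-memory, display-oriented table mentioned above can be sketched as a column store: columns are parallel arrays, and filters return row indices rather than copies, so selections can be combined and redrawn quickly. This is a minimal illustrative sketch, not AVIZ's actual engine; the class and method names are invented for the example.

```python
class ColumnTable:
    """Minimal in-memory, column-oriented data table.

    Columns are stored as parallel lists; filtering returns row
    indices so that selections stay cheap to combine and to redraw.
    """

    def __init__(self, **columns):
        lengths = {len(values) for values in columns.values()}
        if len(lengths) != 1:
            raise ValueError("all columns must have the same length")
        self.columns = {name: list(values) for name, values in columns.items()}

    def filter(self, column, predicate):
        """Return the indices of rows whose value satisfies predicate."""
        return [i for i, v in enumerate(self.columns[column]) if predicate(v)]

    def select(self, column, rows):
        """Materialize one column's values for the given row indices."""
        col = self.columns[column]
        return [col[i] for i in rows]

t = ColumnTable(city=["Paris", "Lyon", "Lille"], pop=[2.1, 0.5, 0.2])
rows = t.filter("pop", lambda p: p > 0.4)
print(t.select("city", rows))  # ['Paris', 'Lyon']
```

A shared abstraction of this kind, understood by the database, the analysis components, and the renderer alike, is the sort of common ground the merging described above requires.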
AVIZ's approach is holistic: these five themes are facets of building an analysis process optimized for discovery. All the systems and techniques AVIZ designs support the process of understanding data and forming insights while minimizing disruptions during navigation and interaction.