How do you visualize complex data for people... who cannot see? Researchers at Bielefeld University (Germany) propose a sophisticated solution [uni-bielefeld.de]: they combine a set of physical objects that can move autonomously with sonification, the generation of data-driven sounds. This non-visual visualization method allows visually impaired people to explore multivariate data through an alternative representation of scatterplots. Building on earlier insights from multi-touch visual displays, the approach works around the obvious visualization and interaction barriers such displays present to non-sighted users.
How does it work? The researchers created a 2D transformation of spatially distributed data into the audio-haptic domain. First, a set of cube-shaped objects physically move across a horizontal screen to the locations of the most salient data clusters. These constellations can then be perceived (i.e. felt) by users. As a user slides an object over the screen, the system emits sounds that convey the local characteristics of the data distribution. In other words, the pitch of a continuously emitted sonic stream corresponds to the local density of the data. When an object is released, a local data sonogram is created: an audible spherical sweep through the data space, centered at the object's location. Still sounds too complex? Then watch the demonstration video below.
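To make the two mappings concrete, here is a minimal sketch of the underlying idea: local point density around a probe position drives a pitch, and releasing the probe triggers a sweep outward through the data by distance. All function names, the density radius, and the frequency range are illustrative assumptions, not the researchers' actual implementation:

```python
import numpy as np

def density_to_frequency(points, probe, radius=0.1,
                         f_min=200.0, f_max=2000.0):
    """Map the local data density around a probe position to a pitch.

    Hypothetical mapping: count the fraction of points within `radius`
    of the probe and scale it linearly between f_min and f_max (Hz).
    """
    dist = np.linalg.norm(points - probe, axis=1)
    density = np.count_nonzero(dist < radius) / len(points)
    return f_min + density * (f_max - f_min)

def data_sonogram(points, probe):
    """Order points by distance from the probe.

    Playing the points back in this order produces a spherical sweep
    outward through the data space from the release location.
    """
    dist = np.linalg.norm(points - probe, axis=1)
    return points[np.argsort(dist)]
```

Sliding the object would re-evaluate `density_to_frequency` continuously, so dense clusters sound higher-pitched; releasing it would schedule one sound per point in the `data_sonogram` order, nearest first.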