Multiparameter imaging techniques generate large amounts of high-dimensional image data in modern biomedical research. Besides algorithms for image registration, normalization, and segmentation, new methods for interactive data exploration must be developed and evaluated. We propose a new approach to auditory data representation based on sonification. The approach is applied to a multiparameter image data set generated with immunofluorescence techniques and is compared to a conventional visualization approach and to a combination of both. For this comparison, a psychophysical experiment was conducted in which a standard evaluation procedure was modeled. Our results show that all three approaches lead to comparable evaluation accuracies for all subjects. We conclude that acoustic and visual approaches can be combined to display data sets of high dimensionality.
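
To illustrate the general idea of sonification for multiparameter image data, the following minimal sketch maps one pixel's parameter vector to a short tone; the image shape, the additive-synthesis mapping, and all parameter choices are illustrative assumptions, not the method evaluated in the paper.

```python
# Minimal sonification sketch, assuming a hypothetical multiparameter image of
# shape (height, width, n_params) with values normalized to [0, 1]; the mapping
# of parameters to harmonics is an illustrative choice.
import numpy as np


def sonify_pixel(params, sample_rate=44100, duration=0.5, base_freq=220.0):
    """Render one pixel's parameter vector as a short additive-synthesis tone.

    Each parameter value scales the amplitude of one harmonic of the base
    frequency, so stronger harmonics indicate higher channel intensities.
    """
    t = np.linspace(0.0, duration, int(sample_rate * duration), endpoint=False)
    signal = np.zeros_like(t)
    for k, value in enumerate(params, start=1):
        signal += value * np.sin(2.0 * np.pi * base_freq * k * t)
    # Normalize to avoid clipping when several parameters are non-zero.
    peak = np.max(np.abs(signal))
    return signal / peak if peak > 0 else signal


# Usage: sonify one pixel of a synthetic 3-channel image standing in for
# immunofluorescence data.
image = np.random.rand(64, 64, 3)
tone = sonify_pixel(image[32, 32])
```

In such a mapping, the listener compares timbres rather than colors, which is one way an auditory display can complement a conventional visualization of the same data.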