This paper describes the motivation, data and sonification technique for three sound examples on the auditory display of human brain activity, selected and formatted as a contribution to the ICAD aural submission category. The human brain generates complex temporal and spatial signal patterns whose dynamics correspond to normal (e.g. cognitive) processes as well as abnormal conditions, i.e. disease. Our sonification technique, Event-based Sonification, renders multi-channel representations of the multivariate data so that temporal, spectral and spatial patterns can be discerned. As a scientific approach, the sonifications are reproducible and systematic, and the mapping is made transparent. Control parameters help to increase the saliency of specific features in the auditory display. This is demonstrated using data with sleep spindles, a photic response and epileptic discharges.
Since all 'sonic pictures' are rendered with the same technique, a variety of dynamic phenomena related to different brain states are demonstrated as auditory Gestalts. Sonification of the EEG offers a meaningful complement to the prevailing visual displays.
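To illustrate the general idea of event-based sonification, the following is a minimal sketch, not the authors' implementation: each suprathreshold event detected in a multi-channel signal triggers a short decaying sine grain, with the channel index mapped to pitch and the event amplitude mapped to loudness. All names, the threshold detector, and the channel-to-pitch mapping are illustrative assumptions.

```python
# Hypothetical event-based sonification sketch (not the paper's code):
# threshold crossings in a (samples, channels) array become short
# sine grains; channel -> pitch, event amplitude -> loudness.
import numpy as np

SR = 8000  # audio sample rate in Hz (assumed)

def detect_events(data, threshold):
    """Return (sample_index, channel, amplitude) for each upward
    threshold crossing in a (samples, channels) array."""
    events = []
    for ch in range(data.shape[1]):
        x = data[:, ch]
        above = x >= threshold
        # index of each False -> True transition
        crossings = np.flatnonzero(~above[:-1] & above[1:]) + 1
        for i in crossings:
            events.append((int(i), ch, float(x[i])))
    return events

def sonify(events, n_samples, n_channels, grain_ms=50.0):
    """Render each event as a decaying sine grain into one mono buffer."""
    out = np.zeros(n_samples)
    glen = int(SR * grain_ms / 1000)
    t = np.arange(glen) / SR
    env = np.exp(-t * 40)  # fast exponential decay envelope
    for i, ch, amp in events:
        # spread channels over one octave above 220 Hz (illustrative mapping)
        freq = 220.0 * 2 ** (ch / n_channels)
        grain = amp * env * np.sin(2 * np.pi * freq * t)
        end = min(i + glen, n_samples)
        out[i:end] += grain[: end - i]
    return out

# Toy two-channel signal: one suprathreshold burst on channel 0.
n = SR  # one second of signal
sig = np.zeros((n, 2))
sig[1000:1100, 0] = 1.0
events = detect_events(sig, threshold=0.5)
audio = sonify(events, n, n_channels=2)
```

In this simplification, signal time maps one-to-one onto audio time; a practical display would also expose control parameters (event threshold, grain duration, pitch range) to adjust the saliency of specific features.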