In this paper we develop a sonification model, following the Model-based Sonification approach, that allows the user to scan high-dimensional data distributions by means of a physical object held in the hand. In the sonification model, the user is immersed in a 3D space of invisible but acoustically active objects which he or she can excite. Tangible computing makes it possible to identify the excitation object (e.g. a geometric surface) with a physical object used as a controller, thus creating a strong metaphor for understanding and relating feedback sounds to the user's own activity, position, and orientation. We explain the technique and our current implementation in detail, and give examples using synthetic and real-world data sets.