This paper presents and evaluates auditory representations of object interactions as support for cooperating users of an Augmented Reality (AR) system. Head-mounted AR displays in particular limit the field of view and thus cause users to miss relevant activities of their interaction partner, such as object interactions or deictic references that would normally be effective in establishing joint attention. We start from an analysis of the differences between face-to-face interaction and interaction via the AR system, using interactional-linguistic conversation analysis. From this analysis we derive a set of features that are relevant for interaction partners to coordinate their activities. We then present five different interactive sonifications that make the object manipulations of interaction partners audible and convey information about the kind of activity. We furthermore evaluate our designs in a study in which participants observe an interaction episode and rate features of the sonification in questionnaires. From the results we draw insights into the factors that make sonifications acceptable as support for dyadic interaction.