This paper presents and evaluates interactive sonifications to support periphery sensing and joint attention in situations with a limited field of view. In particular, head-mounted AR displays restrict the field of view and thus cause users to miss relevant activities of their interaction partner, such as object interactions or deictic references that would normally be effective in establishing joint attention. We give some insight into the differences between face-to-face interaction and interaction via the AR system, and introduce five interactive sonifications that make object manipulations of interaction partners audible while conveying information about the kind of activity. Finally, we present an evaluation of our designs in a study in which participants observed an interaction episode and rated features of the sonifications in questionnaires. From the results we derive factors for acceptable sonifications to support dyadic interaction.