We introduce a method for detecting whether two users are engaged in focused interaction, based on a windowed correlation measure over their acoustic signals; the underlying assumption is that a sustained exchange of verbal turns produces anticorrelated acoustic activity. We evaluated the method on manually annotated transitions between focused and unfocused interaction, recorded in experiments on AR-based cooperation within a research project on alignment in communication. The results show that a high degree and extended duration of speech-activity anticorrelation reliably indicates focused interaction, and may thus be a valuable asset for situation-aware technical systems.
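The core idea of a windowed anticorrelation measure on speech activity can be sketched as follows. This is a minimal illustration, not the authors' exact implementation: the function name `windowed_correlation`, the use of non-overlapping windows, the window length, and the decision threshold of -0.8 are all illustrative assumptions.

```python
import numpy as np

def windowed_correlation(a, b, win):
    """Pearson correlation of two activity signals in sliding windows.

    a, b : 1-D arrays of per-frame speech activity (e.g. 0/1 voice-activity
           flags or frame energies); win : window length in frames.
    Returns one value per non-overlapping window; windows in which either
    signal is constant have undefined correlation and yield NaN.
    """
    out = []
    for start in range(0, len(a) - win + 1, win):
        wa, wb = a[start:start + win], b[start:start + win]
        if wa.std() == 0 or wb.std() == 0:
            out.append(np.nan)  # correlation undefined for a constant window
        else:
            out.append(np.corrcoef(wa, wb)[0, 1])
    return np.array(out)

# Two speakers in clean turn alternation: speech activity is anticorrelated.
t = np.arange(200)
speaker_a = (t % 40 < 20).astype(float)  # active in the first half of each cycle
speaker_b = 1.0 - speaker_a              # active exactly when A is silent
corr = windowed_correlation(speaker_a, speaker_b, win=40)
focused = bool((corr < -0.8).all())      # strong, sustained anticorrelation
```

On this synthetic signal every window yields a correlation of -1, so `focused` is true; in real recordings the correlation values would be noisier, and the degree and duration of anticorrelation would be tested against thresholds tuned on annotated data.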