We study a common-interest signaling game in which stochastic noise perturbs the communication between an informed sender and an uninformed receiver. Despite this impediment, efficient languages exist. In equilibrium, the sender partitions the type space into a tessellation of convex cells, while the receiver converts posterior beliefs into Bayesian estimators that serve as interpretations. Shannon entropy measures the noise level and describes the extent to which communication is possible. A limiting case of errors that respect the distance between words leads to concise interpretations in the decoding process. Comparative statics across noise levels reveal which grammatical structures are more robust to noise. As the error increases, separation between the most distinct types becomes more important than precision about each single type. Furthermore, distinct words are reserved for the description of opposite domains of the type space. Evolutionary modeling approaches converge to equilibria, but not every equilibrium is stable.
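The ingredients named above can be sketched numerically. In this minimal illustration, the type space, the two-cell tessellation, and the error probability `eps` are all assumptions chosen for concreteness, not the model's actual parameters: the sender maps convex cells of types to words, a symmetric channel corrupts the word, and the receiver's interpretation is the Bayesian estimator (posterior mean), while the Shannon entropy of the channel quantifies the noise level.

```python
import math

# Hypothetical sketch of a noisy signaling game on a discrete type space.
# Types t in {0, 1, 2, 3} are uniformly distributed (assumed prior).
types = [0, 1, 2, 3]
prior = {t: 0.25 for t in types}

# Sender's tessellation: two convex (interval) cells, one word per cell.
cell = {0: "low", 1: "low", 2: "high", 3: "high"}
eps = 0.1  # channel error probability (illustrative assumption)

def channel(word_sent, word_received):
    """P(received | sent) for a symmetric binary noise channel."""
    return 1 - eps if word_sent == word_received else eps

def posterior(word_received):
    """Receiver's Bayesian posterior over types given the received word."""
    joint = {t: prior[t] * channel(cell[t], word_received) for t in types}
    z = sum(joint.values())
    return {t: p / z for t, p in joint.items()}

def interpretation(word_received):
    """Bayesian estimator: the posterior mean serves as the interpretation."""
    post = posterior(word_received)
    return sum(t * p for t, p in post.items())

# Shannon entropy of the channel noise (in bits) measures the noise level;
# eps = 0 gives entropy 0 (perfect communication), eps = 0.5 gives 1 bit.
noise_entropy = -(eps * math.log2(eps) + (1 - eps) * math.log2(1 - eps))
```

With `eps = 0.1`, a received "low" still places most posterior mass on the low cell, so its interpretation sits near the low end of the type space; as `eps` grows, the two interpretations drift toward the prior mean and communication degrades, consistent with the entropy measure.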