Visual attention can be a viable source of information for assessing human behavior in many different contexts, ranging from human-computer interaction and sports to social interactions and complex working environments, such as those found in the context of Industry 4.0. In such scenarios, where the user is able to walk around freely, mobile eye-tracking systems are used to record eye movements, which are then mapped onto an ego-perspective video. Analyzing such recordings requires substantial manual effort to annotate the videos frame by frame, labeling each fixation according to its location relative to the target objects visible in the video. First, we present a method to record eye movements in 3D scenarios and annotate fixations with corresponding labels for the objects of interest in real time [2]. For this purpose, we rely on computer-vision methods to detect the position and orientation of the camera in the world. Based on a coarse 3D model of the environment representing the 3D areas of interest, fixations are mapped to those areas. As a result, we can identify the position of each fixation in terms of local object coordinates for every relevant object of interest. Second, we present a method for the real-time creation and visualization of heatmaps for 3D objects [1]. Based on a live stream of the recorded and analyzed eye movements, our solution renders heatmaps on top of the object surfaces. The resulting visualizations are more realistic than standard 2D heatmaps, in that they account for occlusions, depth of focus, and dynamically moving objects. Third, we present a new method that aggregates fixations on a per-object basis, analogous to classical regions/areas of interest. This allows us to transfer existing methods of analysis to 3D environments. We present examples from a virtual supermarket, a study on social interactions between two humans, real-time gaze mapping on the body parts of a moving human, and the study of 3D prototypes in a virtual reality environment.
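
To make the fixation-to-object mapping concrete, the sketch below shows one plausible realization of the core step: casting the gaze ray from the estimated camera pose against coarse triangle meshes of the areas of interest and reporting the nearest hit in local object coordinates. This is a minimal illustration under assumed data layouts; the names `cam_pose`, `gaze_dir_cam`, and `aois`, and the choice of the Möller–Trumbore intersection test, are hypothetical and not taken from the implementation described above.

```python
import numpy as np

def intersect_triangle(orig, dirn, v0, v1, v2, eps=1e-9):
    """Möller–Trumbore ray/triangle test; returns hit distance t or None."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(dirn, e2)
    det = e1 @ p
    if abs(det) < eps:                 # ray parallel to triangle plane
        return None
    inv = 1.0 / det
    t0 = orig - v0
    u = (t0 @ p) * inv
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(t0, e1)
    v = (dirn @ q) * inv
    if v < 0.0 or u + v > 1.0:
        return None
    t = (e2 @ q) * inv
    return t if t > eps else None      # only hits in front of the camera

def map_fixation(cam_pose, gaze_dir_cam, aois):
    """Map one fixation to the nearest 3D area of interest.

    cam_pose      -- 4x4 camera-to-world matrix (assumed output of the
                     vision-based camera pose estimation)
    gaze_dir_cam  -- unit gaze direction in camera coordinates
    aois          -- list of (label, world_to_local 4x4, triangles), where
                     triangles is an (n, 3, 3) array in world coordinates
    Returns (label, hit point in local object coordinates) or None.
    """
    origin = cam_pose[:3, 3]
    dirn = cam_pose[:3, :3] @ gaze_dir_cam          # gaze ray in world space
    best = None
    for label, world_to_local, tris in aois:
        for v0, v1, v2 in tris:
            t = intersect_triangle(origin, dirn, v0, v1, v2)
            if t is not None and (best is None or t < best[0]):
                best = (t, label, world_to_local)
    if best is None:
        return None                                  # fixation hit no AOI
    t, label, world_to_local = best
    hit_world = np.append(origin + t * dirn, 1.0)   # homogeneous point
    return label, (world_to_local @ hit_world)[:3]  # local object coordinates
```

Keeping only the nearest intersection along the gaze ray is what makes such a mapping occlusion-aware: a fixation landing where one object partially hides another is attributed to the occluding surface, consistent with the occlusion handling described for the heatmap visualization.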