For decades, honeybees have been used as an insect model system for answering scientific questions in a variety of areas, owing to their enormous behavioural repertoire paired with their learning capabilities. Similar learning capabilities are also evident in bumblebees, which are closely related to honeybees. Like honeybees, they are central place foragers that commute between a reliable food source and their nest and therefore need to remember particular facets of their environment to reliably find their way back to these places.
Their flight style, which consists of fast head and body rotations (saccades) interspersed with flight segments with almost no rotational head movement (intersaccades), allows them to acquire distance information about objects in the environment.
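As a minimal illustration of the underlying motion-parallax geometry (a sketch under simplifying assumptions, not a method from the thesis), the distance of an object can be recovered from flight speed, bearing and retinal image motion during a purely translational intersaccade:

import math

def object_distance(flight_speed, bearing_deg, angular_velocity):
    # During pure translation, an object at bearing theta moves across the
    # retina at omega = v * sin(theta) / d, so d = v * sin(theta) / omega.
    return flight_speed * math.sin(math.radians(bearing_deg)) / angular_velocity

# Hypothetical values: an object 90 deg to the side that drifts at 2 rad/s
# while the bee translates at 0.5 m/s lies roughly 0.25 m away.
print(object_distance(0.5, 90.0, 2.0))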
Depending on the structure of the environment, bumblebees as well as honeybees can use these objects as landmarks to guide their way between the nest and a particular food source. Landmark learning, being a visual task, of course depends on the visual input perceived by the animal's eyes. As this visual input changes rapidly during head saccades, in my first project we recorded bumblebees with high-speed cameras in an indoor flight arena while they were solving a navigation task that required them to orient according to landmarks. First of all, we tracked head orientation throughout the flight periods that served to learn the spatial arrangement of the landmarks. In this way we acquired detailed data on the fine structure of the head saccades that shape the visual input the bees perceive. Head saccades of bumblebees exhibit a consistent relationship between their duration, peak velocity and amplitude, resembling the so-called "saccadic main sequence" of human eye movements in its main characteristics. We also found the bumblebees' saccadic sequence to be highly stereotyped, similar to that of many other animals. This hints at a common principle of reliably reducing the time during which the eye is moved, achieved by fast and precise motor control.
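Such a main-sequence relationship is often summarised by a power-law fit of peak velocity against amplitude; the following is a minimal sketch with made-up numbers (the power-law form and variable names are assumptions, not the analysis of the thesis):

import numpy as np

# Hypothetical per-saccade measurements: amplitude (deg) and peak velocity (deg/s).
amplitude = np.array([5.0, 10.0, 20.0, 30.0, 40.0])
peak_velocity = np.array([300.0, 550.0, 900.0, 1200.0, 1400.0])

# Fit peak_velocity ~ a * amplitude**b as a straight line in log-log space.
b, log_a = np.polyfit(np.log(amplitude), np.log(peak_velocity), 1)
print(f"peak velocity ~ {np.exp(log_a):.0f} * amplitude^{b:.2f}")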
In my first project, I tested bumblebees with salient landmarks in front of a background covered with a random-dot pattern. In a previous study, honeybees had been trained with the same landmark arrangement and were additionally tested with landmarks that were camouflaged against the background. As the texture of the landmarks did not seem to affect their performance in finding the goal location, it had been assumed that the way they acquire information about the spatial relationship between objects is independent of the objects' texture.
Our aim for the second project of my dissertation was therefore to record the activity of motion-sensitive neurons in the bumblebee to analyse to what extent object information is contained in a navigation-related visual stimulus movie. We also wanted to clarify whether object texture is represented in the neuronal responses. As recording from neurons in free-flying bumblebees is not possible, we used one of the recorded bumblebee trajectories to reconstruct a three-dimensional flight path including data on the head orientation. We could thus reconstruct ego-perspective movies of a bumblebee while it was solving a navigational task. These movies were presented to motion-sensitive neurons in the bumblebee lobula. For two different classes of neurons, we found that object information was contained in the neuronal response traces. Furthermore, during the intersaccadic parts of flight, the object's texture did not change the general response profile of these neurons, which nicely matches the behavioural findings. However, slight changes in the response profiles acquired for the saccadic parts of flight might allow texture information to be extracted from these neurons at later processing stages.
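As a rough sketch of how such a comparison might be quantified (hypothetical data and a generic Pearson-correlation measure, not the analysis used in the thesis), one could correlate a neuron's response traces for textured versus camouflaged objects separately within intersaccadic and saccadic segments:

import numpy as np

def segment_correlation(resp_a, resp_b, mask):
    # Pearson correlation of two response traces, restricted to samples
    # where the boolean mask is True (e.g. intersaccadic time points).
    return np.corrcoef(resp_a[mask], resp_b[mask])[0, 1]

# Hypothetical response traces (spikes/s) and an intersaccade mask.
rng = np.random.default_rng(0)
resp_textured = rng.normal(50.0, 10.0, 1000)
resp_camouflaged = resp_textured + rng.normal(0.0, 2.0, 1000)
intersaccadic = rng.random(1000) > 0.3

print("intersaccadic r:", segment_correlation(resp_textured, resp_camouflaged, intersaccadic))
print("saccadic r:", segment_correlation(resp_textured, resp_camouflaged, ~intersaccadic))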
In the final project of my dissertation, I switched from exploring the coding of visual information to the coding of olfactory signals. For honeybees and bumblebees, olfaction is roughly as important for their behaviour as vision. But whereas there is a solid knowledge base on honeybee olfaction, with detailed studies on the single stages of olfactory information processing, this knowledge was missing for the bumblebee. In a first step, we conducted staining experiments and confocal microscopy to identify the input tracts conveying information from the antennae to the first processing stage of olfactory information, the antennal lobe (AL). Using three-dimensional reconstructions of the AL, we could further determine the typical number of its spheroidal subunits, which are called glomeruli. Odour molecules that the bumblebee perceives induce activation patterns characteristic of particular odours. By retrogradely staining the output tracts that connect the AL to higher-order processing stages with a calcium indicator, we were able to record the odour-dependent activation patterns of the AL glomeruli and to describe their basic coding principles. As in honeybees, we could show that the odours' carbon chain length as well as their functional groups are dimensions that the antennal lobe glomeruli encode in their spatial response patterns. Correlation analyses underlined the strong similarity of the glomerular activity patterns between honeybees and bumblebees.
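As a minimal sketch of such a cross-species comparison (hypothetical numbers and a plain Pearson correlation; the thesis may have used different measures), the spatial activity patterns can be compared as response vectors over matched glomeruli:

import numpy as np

# Hypothetical mean calcium responses of five matched glomeruli to one odour.
honeybee_pattern = np.array([0.8, 0.1, 0.4, 0.9, 0.2])
bumblebee_pattern = np.array([0.7, 0.2, 0.5, 0.8, 0.1])

# Pearson correlation quantifies the similarity of the two spatial patterns.
r = np.corrcoef(honeybee_pattern, bumblebee_pattern)[0, 1]
print(f"cross-species pattern correlation: r = {r:.2f}")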