Personal computers have arrived in almost every part of our lives, helping us to do work faster and better. They are used for writing texts, creating music or drawings, or simply organizing and guiding everyday tasks. Nearly all of these tasks are performed on computers operated via screen, keyboard, and mouse, even though this form of interaction can be cumbersome or even unsuitable for some tasks. Human-Computer Interaction (HCI) analyzes the way people use computers and proposes new methods of interaction. One area of this research field is called 'Tangible Interaction'. Tangible Interaction uses everyday objects as tangible representations of digital data. The hope is that by pulling data into the tangible real world (in contrast to the virtual world), it becomes more vivid and graspable and thereby easier to understand. These real-world representations are called Tangible User Interface Objects (TUIOs), and the systems in which they are used are called Tangible User Interfaces (TUIs).
The main goal of this work is to create active objects as a new kind of TUIO.
These active objects extend the concept of TUIOs in that they can be manipulated not only by the user but also by the computer. Many forms of manipulation are possible, e.g. adding LEDs or liquid crystal displays, sound output, or tactile and haptic feedback through vibration. One of the most challenging manipulation possibilities is computer-controlled planar movement, for instance on a desk surface, which is developed in this work. The objects are constructed as modularly as possible to remain open for future extensions and modifications. A software structure for the coordination of the objects is implemented. Furthermore, several applications are presented to demonstrate the potential of this novel technique.
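To make the distinction between passive and active objects concrete, the following minimal Python sketch models an active object that both reports its pose (as set by the user) and accepts actuation commands from the computer. All names here (ActiveTUIO, move_to, set_led) are hypothetical illustrations of the concept, not part of the software structure developed later in this work.

```python
from dataclasses import dataclass


@dataclass
class Pose:
    """Planar pose of an object on the interaction surface (hypothetical units)."""
    x: float      # position in mm
    y: float      # position in mm
    angle: float  # orientation in degrees


class ActiveTUIO:
    """Sketch of an active tangible object.

    A passive TUIO only has a pose that the user changes by hand; an active
    TUIO can additionally be actuated by the computer, e.g. moved across the
    table surface or given simple visual feedback.
    """

    def __init__(self, object_id: int):
        self.object_id = object_id
        self.pose = Pose(0.0, 0.0, 0.0)

    # Sensed side: the tracking system reports user manipulation.
    def update_pose(self, pose: Pose) -> None:
        self.pose = pose

    # Actuated side: the computer manipulates the object (stubbed here).
    def move_to(self, target: Pose) -> None:
        """Request computer-controlled planar movement to a target pose."""
        print(f"object {self.object_id}: drive to "
              f"({target.x}, {target.y}, {target.angle})")

    def set_led(self, on: bool) -> None:
        """Toggle an on-board LED as simple visual feedback."""
        print(f"object {self.object_id}: LED {'on' if on else 'off'}")


if __name__ == "__main__":
    obj = ActiveTUIO(object_id=1)
    obj.update_pose(Pose(120.0, 80.0, 45.0))  # user places the object
    obj.move_to(Pose(200.0, 150.0, 0.0))      # computer moves it elsewhere
    obj.set_led(True)                          # computer gives feedback
```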