Since their inception, machines, computers, and robots have steadily grown in complexity
to solve ever more complicated problems. For these increasingly complex systems to be
usable by the largest number of people, they need to be made affordant through tests that
evaluate how users interact with them. These tests, however, have shortcomings,
sometimes leaving users lost and frustrated with these systems.
In this work, we improve these interface evaluation tests by relying on a Brain-Computer
Interface combining Electroencephalography with Eyetracking. We use this bimodal
setup to provide complementary insights into a user's perception that can be gathered
from any interaction scenario. To achieve this, we have created a set of methods
that make our system applicable and informative in a variety of situations. For
scenario transposability, we developed the Fixation-based Component Synchronization
method, which allows synchronous recordings to be re-established even when markers are lacking.
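As a minimal sketch of the general idea behind such marker-free synchronization, and not of the exact procedure developed in this work, one could estimate the clock offset between the two recordings by cross-correlating binned fixation-event trains from each modality. The function name and parameters below are hypothetical and assume each stream exposes fixation-onset timestamps in its own clock.

```python
import numpy as np

def estimate_clock_offset(eeg_event_times, gaze_fixation_times,
                          bin_width=0.01, max_lag=5.0):
    """Estimate the offset (in seconds) to add to eye-tracker timestamps
    so that they align with the EEG clock, using only fixation events."""
    eeg_event_times = np.asarray(eeg_event_times, dtype=float)
    gaze_fixation_times = np.asarray(gaze_fixation_times, dtype=float)

    # Bin both event streams onto a common time grid.
    t_max = max(eeg_event_times.max(), gaze_fixation_times.max()) + max_lag
    bins = np.arange(0.0, t_max + bin_width, bin_width)
    eeg_train, _ = np.histogram(eeg_event_times, bins=bins)
    gaze_train, _ = np.histogram(gaze_fixation_times, bins=bins)

    # Cross-correlate the mean-centred event trains over all lags.
    corr = np.correlate(eeg_train - eeg_train.mean(),
                        gaze_train - gaze_train.mean(), mode="full")
    lags = (np.arange(corr.size) - (gaze_train.size - 1)) * bin_width

    # Keep only plausible offsets and return the lag with maximal correlation.
    mask = np.abs(lags) <= max_lag
    return lags[mask][np.argmax(corr[mask])]
```

In such a scheme, the estimated offset would be applied to the eye-tracker timestamps before epoching the EEG around fixation onsets.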
Using both recording modalities and the Fixation-related Potentials they make observable,
we propose four different methods that provide insight into how users perceive
the considered interaction. These four methods are the General Difficulty via Eyetracking
(GDET) method, the Steady Peak Property Quantification (SPPQ) method, the Segment
Frequency Bands Analysis (SFBA) method, and the User-dependent Potential Variation
(UdPV) method. They respectively provide information about difficulties
relating to the explored environment as a whole, to specific elements in the environment,
to the task with which the environment is explored, and to the specific strategy with
which the user explores it. We discuss and test the extent of all four
methods in a series of three laboratory studies presenting artificial and natural interaction
scenarios. The three scenarios present different tasks and levels of difficulty,
allowing us to establish the utility of these methods and to verify their transposability
between situations. All proposed methods are simple to implement and offer a new way
to approach the analysis of interaction, both in a design environment and as a promising
way to create adaptive interfaces.