Background
Interaction with earables – earphones equipped with additional sensors – has been identified as one of the four major areas of earable research (Röddiger et al., 2022, Hu et al., 2025). Worn naturally and positioned close to key physiological signals, earables support a wide range of interaction modalities and have been shown to detect multiple inputs simultaneously. A significant portion of current research in the earable domain focuses either on detecting novel interaction types or on supporting a wide variety of simultaneous inputs (Hummel et al., 2025). However, in professional contexts – such as operating machinery, piloting, or medical environments – the reliability and robustness of input methods often outweigh novelty or variety. In such high-stakes settings, fail-safe input mechanisms are crucial to prevent accidental commands and ensure system trustworthiness. Moreover, these contexts often involve cognitive and physical multitasking, where the user’s hands and eyes are already occupied. Earables have the potential to serve as an additional, non-intrusive input channel, acting almost like a “third hand” for reliable control. To be viable in such settings, however, the interaction must fit naturally into professional workflows.
The goal of this thesis is therefore to design and implement fail-safe gestures for OpenEarable 2.0 that (1) require no use of hands or visual focus, (2) fit naturally into a professional working environment, and (3) can be reliably and robustly detected in real time using OpenEarable 2.0’s available sensors (Röddiger et al., 2025) and computing resources.
Your Tasks
- Identifying candidate gestures that meet the above constraints
- Developing a real-time detection algorithm with built-in safeguards to minimize false activations while remaining practical for everyday use (one possible safeguard is sketched below)
- Evaluating the robustness and fail-safety of the proposed interaction techniques across different usage scenarios and noise conditions
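To give an idea of what such a safeguard could look like, the following is a minimal, purely illustrative Python sketch. It assumes a per-frame gesture classifier that outputs a confidence value; the class and parameter names (GestureDebouncer, threshold, dwell_frames, refractory_s) are hypothetical and not part of OpenEarable 2.0 or its software stack. The idea is to only fire a gesture event when the confidence stays above a threshold for a dwell window of consecutive frames, and to suppress repeated triggers during a refractory period afterwards.

    from collections import deque
    import time

    class GestureDebouncer:
        """Illustrative safeguard against false activations (hypothetical example):
        fire a gesture event only when the per-frame classifier confidence stays
        above a threshold for a full dwell window, then lock out further events
        for a refractory period."""

        def __init__(self, threshold=0.9, dwell_frames=10, refractory_s=1.0):
            self.threshold = threshold          # minimum per-frame confidence
            self.dwell_frames = dwell_frames    # consecutive frames required
            self.refractory_s = refractory_s    # lock-out time after a trigger
            self._history = deque(maxlen=dwell_frames)
            self._last_trigger = -float("inf")

        def update(self, confidence, now=None):
            """Feed one frame's confidence; return True only on a confirmed trigger."""
            now = time.monotonic() if now is None else now
            self._history.append(confidence >= self.threshold)
            in_refractory = (now - self._last_trigger) < self.refractory_s
            confirmed = (len(self._history) == self.dwell_frames
                         and all(self._history))
            if confirmed and not in_refractory:
                self._last_trigger = now
                self._history.clear()  # require a fresh dwell window next time
                return True
            return False

For instance, assuming a 50 Hz classification rate, dwell_frames=10 and refractory_s=1.0 would require the gesture to be sustained for roughly 200 ms and would suppress repeated triggers for one second afterwards; the actual safeguard design is part of the thesis work.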
Requirements
- Ability to translate abstract interaction concepts and requirements into a functional, working system
- Interest in Human-Computer Interaction (HCI) and the real-world application of new devices
- Good Python skills
Application Documents
- A paragraph explaining your motivation.
- Your study program (Bachelor/Master), current semester, and field of study.
- A transcript of records (courses and grades).
- Your programming experience.
- Any areas of interest relevant to the topic.
- Your CV (if available).


