Human-Computer Interaction

Human-Computer Interaction Group – Prof. Dr. Harald Reiterer
Master
Multi-User Navigation on Large-Scale Interactive Surfaces
Scenario
Interactive surfaces like tabletops and walls often implement a panning-and-zooming navigation style whenever only a limited amount of physical display space is available for navigating a larger virtual canvas of spatially distributed information. Mapping and planning applications, which show vast amounts of visual data, are one example of such panning-and-zooming user interfaces.
Current interaction techniques for navigation in spatial user interfaces are designed for single-user interaction. However, these techniques, such as pinching, are applied to large-scale interactive surfaces as well. This leads to problems during multi-user interaction, for instance when one user wants to zoom the map while another user simultaneously wants to pan it.
Project Goal
As a first step towards solving this problem, we implemented a body-driven interaction technique called Body Panning, which lets a user pan through a spatial user interface by adjusting her position at the table. We received very good feedback at ITS 2013, along with many suggestions and ideas for future development.
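The core mapping behind Body Panning can be sketched as follows (the actual prototype is written in C#; the function name, units, and the linear mapping are illustrative assumptions, not the implemented technique):

```python
def body_pan_offset(user_x, table_width, canvas_width, viewport_width):
    """Map the user's tracked position along the table edge (0..table_width,
    e.g. in cm) to the horizontal pan offset of the viewport on the virtual
    canvas (in px). A simple linear mapping is assumed for illustration."""
    # Normalize the body position to 0..1 along the table edge,
    # clamping positions tracked slightly outside the table.
    t = max(0.0, min(1.0, user_x / table_width))
    # The viewport can pan across canvas_width - viewport_width pixels.
    return t * (canvas_width - viewport_width)

# A user standing at the centre of a 200 cm table edge pans a 10000 px
# canvas, seen through a 2000 px viewport, to offset 4000 px.
print(body_pan_offset(100, 200, 10000, 2000))  # 4000.0
```

Because the pan offset is driven by body position rather than touch gestures, a second user's hands remain free for zooming, which is the conflict the scenario above describes.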
Task
- Literature research (seminar thesis)
- Design and discussion of several interaction concepts (project work)
- Implementation of a prototype in C# (project work)
- Evaluation of different concepts
- Documentation and presentation
Contact
Daniel Klinkhammer
Room: PZ 905
[email protected]
Announcement April 15th 2015
Bachelor / Master
Eating Well by Color
Scenario
Wellness applications help users record information about their health behaviors, such as physical activity and diet. Tracking ranges from counting steps to tracking the stages of sleep. Calorie counters, which make use of large calorie databases, are also part of self-monitoring. They are based on the assumption that if we are aware of what we are eating, we have the information we need to eat healthier. Although this may be true, it is very tedious to keep track of everything we eat. This information cannot be gathered automatically but has to be recorded manually, which leads to missing data and imprecision. Furthermore, to eat healthier, a balanced diet should receive greater attention than counting calories.
Project Goal
Different nutrients impart different colors to the foods they are in. A common principle in food science is therefore that every meal should include colorful foods like fruits and vegetables. Based on this knowledge, an app can be developed that analyzes meals with respect to their colors. An automatic image-processing approach based on a photo of the meal is intended. Furthermore, a visualization has to be developed that gives feedback about how colorful, and thus how balanced, the meal is.
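One possible starting point for the color analysis is binning meal pixels by hue and scoring how many coarse color groups are present. This is only a sketch under assumed bins and thresholds, not the project's actual classifier (which would work on real camera images, e.g. on Android):

```python
import colorsys
from collections import Counter

# Coarse hue ranges (degrees) for illustrative food-color bins.
BINS = {"red": (345, 360), "orange/yellow": (15, 75), "green": (75, 165)}

def color_profile(pixels):
    """Count pixels per coarse color bin; pixels are (r, g, b) in 0..255."""
    counts = Counter()
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        if s < 0.2 or v < 0.2:      # skip greyish/dark pixels (plate, shadow)
            continue
        deg = h * 360
        for name, (lo, hi) in BINS.items():
            # Red wraps around 0 degrees, so treat hues below 15 as red too.
            if lo <= deg < hi or (name == "red" and deg < 15):
                counts[name] += 1
    return counts

def balance_score(counts):
    """Fraction of the coarse bins present in the meal: 1.0 = very colorful."""
    return len([c for c in counts.values() if c > 0]) / len(BINS)

meal = [(255, 0, 0), (0, 200, 0), (255, 165, 0), (128, 128, 128)]
print(color_profile(meal), balance_score(color_profile(meal)))
```

The feedback visualization could then map the balance score and the per-bin distribution to an easily readable display.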
Task
- Literature research and state-of-the-art analysis (seminar thesis)
- Design and discussion of several interaction and visualization concepts (project work)
- Implementation of a prototype for Android (project work)
- Evaluation of the image processing approach and the visualization (thesis)
Contact
Simon Butscher
Room: PZ 906
[email protected]
Bachelor / Master
Providing Self-Monitoring Feedback
Scenario
Wellness applications help users record information about their health behaviors, such as physical activity and diet. Tracking ranges from counting steps to tracking the stages of sleep. Calorie counters, which make use of large calorie databases, are also part of self-monitoring. To increase the effectiveness of self-monitoring in changing user behavior, we must not only track the data but also provide feedback. Most applications provide only simple forms of feedback, such as showing the number of steps one still has to take to meet one's goal. However, more detailed insights into behavior can be provided by unveiling relations within the gathered data (e.g., that meals at work are less healthy than meals at home).
Project Goal
Based on data about the nutrients of meals, tracked activities, and additional data such as eating motives or the time and location at which meals took place, visualizations and mechanisms for analyzing the data should be developed. The concepts should address the needs of end users, not those of expert data analysts. They therefore do not need to allow a detailed analysis, but have to provide simple access to the gathered data.
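The kind of relation mentioned above (meals at work vs. at home) amounts to grouping tracked records by a context attribute and comparing aggregates. A minimal sketch with hypothetical records and scores:

```python
from collections import defaultdict

# Hypothetical tracked meals: (location, healthiness score in 0..1).
meals = [
    ("work", 0.4), ("work", 0.5), ("home", 0.8),
    ("home", 0.7), ("work", 0.3), ("home", 0.9),
]

def average_by(records):
    """Average the healthiness score per context key (here: location)."""
    groups = defaultdict(list)
    for key, score in records:
        groups[key].append(score)
    return {key: sum(v) / len(v) for key, v in groups.items()}

# Averages per location: work is around 0.4, home around 0.8 here.
print(average_by(meals))
```

An end-user visualization would present such aggregates directly ("your meals at work are less healthy than at home") rather than exposing the raw grouped data.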
Task
- Literature research and state-of-the-art analysis (seminar thesis)
- Design and discussion of several interaction and visualization concepts (project work)
- Implementation of a prototype for Android (project work)
- Evaluation of the visualization and data analysis concept (thesis)
Contact
Simon Butscher
Room: PZ 906
[email protected]
Master
Investigating co-located egocentric collaboration
Scenario
Egocentric navigation is a technique that allows the user to explore off-screen content by moving in physical space rather than panning screen content. In co-located (same-place) collaborative scenarios, egocentric navigation could be particularly beneficial, as it allows (explicit and implicit) interaction between the collaborators in the real world.
Studies on this technique have shown cognitive advantages on an individual level. Little, however, is known about its suitability in collaborative scenarios.
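The viewport logic behind egocentric navigation can be sketched as a peephole window whose position on the virtual canvas follows the user's tracked physical position (the names, units, and the linear mapping are illustrative assumptions, not a specific study prototype):

```python
def peephole_window(pos_cm, room_cm, canvas_px, window_px):
    """Map a tracked physical position (x, y in cm) within the room to the
    top-left corner of the visible window on the virtual canvas (in px).
    The canvas stays fixed in space; moving the body moves the window."""
    x = pos_cm[0] / room_cm[0] * (canvas_px[0] - window_px[0])
    y = pos_cm[1] / room_cm[1] * (canvas_px[1] - window_px[1])
    return (x, y)

# Standing at the centre of a 400 x 200 cm tracking area shows the centre
# of an 8000 x 4000 px canvas through a 1920 x 1080 px window.
print(peephole_window((200, 100), (400, 200), (8000, 4000), (1920, 1080)))
```

In a collaborative setting each user would carry their own window, so the collaborators' physical positions directly reflect which part of the content each is viewing.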
Project Goal
Implement a study prototype that addresses the following research question: Which effects do
egocentric input styles have on collaborative tasks when compared to non-egocentric input styles?
Successful participation in „Usability Engineering: Evaluation“ is mandatory.
Task
- Research on related studies on co-located collaboration and on egocentric navigation; refinement of the research question (seminar)
- Design of a study plan and a research prototype; implementation of the prototype (project)
- Answering the research question and discussing the results through an experimental study (thesis)
Contact
Jens Müller
Room: PZ 906
[email protected]
Bachelor
Pfahlbauten/Stilt house
Scenario
„The visitor puts on the virtual reality glasses and the earplugs provided by the museum guide and dives into the ancient life of stilt-house settlers around 3910–3909 BC. The experience begins as immersion takes over. She then witnesses a change of seasons, watches snowflakes falling onto the rooftops of the wooden settlements, and listens to the raindrops splashing on the surface of Lake Constance...“
Project Goal
Design and implement a virtual reality concept that tells the story of the stilt-house (Pfahlbauten) settlers at Lake Constance. The project is undertaken in close cooperation with the HTWG (University of Applied Sciences).
Programming skills (C# / Unity), successful participation in „Usability Engineering: Design“, and good time management are highly recommended.
Task
- Literature research on public interactive installations and virtual reality (seminar)
- Concept development of an interactive VR installation (project)
- Evaluation and reflection of your concept (thesis & thesis defence)
Contact
Jens Müller
Room: PZ 906
[email protected]
Bachelor / Master
Food Tracking on Mobile Devices
[Mock-up: a list of tracked food entries, e.g. "tomato; 30 g", "turkey; 100 g", "cucumber; 30 g"]
Scenario
Food tracking is a common and important feature of mobile wellness applications. Currently, automated approaches (e.g. based on image processing of the camera image) are not yet sufficiently accurate. Thus, in terms of user experience, food tracking raises an important question: how can the burden of data entry on the user be reduced?
Project Goal
Design, implement, and evaluate an interaction concept that is capable of accurately documenting food in real-life situations. Food tracking refers to both the classification (what have I eaten?) and the quantification (how much have I eaten?) of meals.
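A minimal data model for such tracking, assuming entries in a simple "name; amount" text format like "tomato; 30 g" (the parsing and aggregation are illustrative; Python 3.9+ is assumed for `removesuffix`):

```python
def parse_entry(line):
    """Parse a tracked entry like 'tomato; 30 g' into (name, grams):
    classification (what) and quantification (how much)."""
    name, amount = line.split(";")
    return name.strip(), float(amount.strip().removesuffix("g").strip())

entries = ["tomato; 30 g", "turkey; 100 g", "cucumber; 30 g"]
parsed = [parse_entry(e) for e in entries]
total_grams = sum(g for _, g in parsed)

print(parsed)       # [('tomato', 30.0), ('turkey', 100.0), ('cucumber', 30.0)]
print(total_grams)  # 160.0
```

The design challenge of the project lies in how users produce these entries with minimal effort, not in the representation itself.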
Successful participation in „Usability Engineering: Design“ and programming skills in Android are
highly recommended.
Task
- State-of-the-art analysis of food tracking approaches in the mobile context (seminar)
- Concept development and implementation (project)
- Evaluation and reflection of your concept (thesis & thesis defence)
Contact
Jens Müller
Room: PZ 906
[email protected]
Bachelor / Master
Interactive classical music
Scenario
Composing classical music is a complex matter: a clear structure has to be shaped, and different instruments have to be combined to form an overarching impression that conveys certain moods and feelings. Understanding and visualising the arrangement of the movements is a challenging task for laymen. However, presenting the underlying structure of the music is essential if amateurs are to be integrated into the composition process.
The envisioned system will visualise classical music as several building blocks that can be listened to and, in part, arranged separately.
Project Goal
The outcome of this project is an interactive prototype that enables users to contribute to composing a piece of classical music in a playful manner, combining different input modalities (such as tangibles or touch) with visual and audio output.
Task
- State-of-the-art analysis of existing systems and technological possibilities (seminar presentation & paper)
- Implementation of an interactive prototype (project presentation & paper)
- In-the-wild evaluation of the system (thesis & thesis defence)
Contact
Jens Müller, Svenja Leifert
Rooms: PZ 906, PZ 904
[email protected]
[email protected]
Bachelor / Master
Automated Evaluation with Squidy
Scenario
The data-processing tool Squidy is used to integrate input devices into natural user interfaces. By building pipelines of filters, data is manipulated in a visually appealing way, detached from the technical background processes. Although Squidy offers a wide variety of settings, a live evaluation framework has not yet been integrated.
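The pipeline-of-filters idea can be sketched as a chain of processing stages, each of which transforms or drops an input event (a generic illustration, not Squidy's actual API):

```python
class Pipeline:
    """Chain of filters; each filter transforms an event dict or drops it."""
    def __init__(self, *filters):
        self.filters = filters

    def process(self, event):
        for f in self.filters:
            event = f(event)
            if event is None:       # a filter may consume/drop the event
                return None
        return event

# Two example filters: scale raw input coordinates, then clamp to the display.
def scale(factor):
    return lambda e: {**e, "x": e["x"] * factor, "y": e["y"] * factor}

def clamp(w, h):
    return lambda e: {**e, "x": min(max(e["x"], 0), w), "y": min(max(e["y"], 0), h)}

pipe = Pipeline(scale(2.0), clamp(1920, 1080))
print(pipe.process({"x": 500, "y": 700}))  # {'x': 1000.0, 'y': 1080}
```

An evaluation framework of the kind proposed here could be realised as one more stage in such a chain, logging the events that flow past it (e.g. taps and their timestamps) for later analysis.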
Project Goal
The goal of this project is to design a framework that uses Squidy to automatically analyse and plot data gathered in usability tests, e.g. a tapping test. The project should be grounded in findings from the related literature and be evaluated in user studies accordingly.
Task
- Literature research, state-of-the-art analysis (seminar presentation & paper)
- Development of the test setting & tasks, implementation of the test framework (project presentation & paper)
- Conducting user studies & analysing the results (thesis & thesis defence)
Contact
Svenja Leifert
Room: PZ 904
[email protected]
Bachelor / Master
Evaluation of Motor Memory in Static Peephole Environments
Scenario
Many evaluations lead to the conclusion that body movement has a strong impact on how well we memorize spatial locations. The following problems are among those yet to be analysed in detail when working with static peepholes (e.g. ZUIs, zoomable user interfaces):
How does the size of touch displays (and, accordingly, the extent of body movement) impact spatial memory? To what extent is it influenced by the directness of interaction (remote vs. direct)? What other factors might have an influence, and how can they be used to support user interaction and navigation?
Project Goal
As a first step, related literature is to be reviewed as a theoretical foundation. Based on the research question, a test task is to be developed. An interactive prototype is to be implemented or adjusted in order to conduct a usability study answering one of the above-mentioned research questions. The results need to be statistically analysed to add to the body of knowledge.
Task
- Literature research, state-of-the-art analysis (seminar presentation & paper)
- Development of the test setting & tasks, implementation of the test framework (project presentation & paper)
- Conducting user studies & analysing the results (thesis & thesis defence)
Contact
Svenja Leifert
Room: PZ 904
[email protected]
Bachelor / Master
Using Eye Tracking in Spatial Memory User Studies
Scenario
Our eyes are continually moving, adapting to our point of focus or adjusting the line of sight. Gaze motion is not only extremely fast but also works subconsciously most of the time, allowing interesting insights to be gained during human-computer interaction.
In cognitive psychology, recent study results hint at a correlation between (spatial) memory and eye movement towards relevant screen positions. Due to the subconscious nature of gaze motion, this connection could offer fast and faithful feedback on users' spontaneous reactions when asked about past events and locations.
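Whether a participant's gaze returns to a remembered screen position is typically analysed via fixations. A simple dispersion-based detector in the style of the well-known I-DT algorithm could look like this (thresholds and the sample format are illustrative assumptions):

```python
def detect_fixations(samples, max_dispersion=30.0, min_length=5):
    """Group consecutive gaze samples (x, y in px) into fixations using a
    dispersion threshold (I-DT style). Returns the fixation centroids."""
    fixations, window = [], []
    for p in samples:
        window.append(p)
        xs, ys = zip(*window)
        dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
        if dispersion > max_dispersion:
            # The new sample broke the fixation; emit the previous window
            # if it was long enough, then start a new window.
            if len(window) - 1 >= min_length:
                prev = window[:-1]
                fixations.append((sum(x for x, _ in prev) / len(prev),
                                  sum(y for _, y in prev) / len(prev)))
            window = [p]
    if len(window) >= min_length:
        xs, ys = zip(*window)
        fixations.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return fixations

# Two stable gaze clusters yield two fixations at their centroids.
print(detect_fixations([(100, 100)] * 6 + [(500, 500)] * 6))
```

In a spatial memory study, the detected fixation centroids could then be compared against the remembered item positions to quantify gaze returns.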
Project Goal
As a first step, related literature is to be evaluated to establish a theoretical foundation of how HCI
could make use of eye tracking in memory experiments. According to the research question(s), this
also requires the development of one or several test tasks. An interactive prototype is to be
implemented or adjusted in order to conduct a usability study for answering the research question(s).
The results need to be statistically analysed and discussed accordingly.
Task
- Literature research, state-of-the-art analysis (seminar presentation & paper)
- Development of the test setting & tasks, implementation of the test framework (project presentation & paper)
- Conducting user studies & analysing the results (thesis & thesis defence)
Contact
Svenja Leifert
Room: PZ 904
[email protected]
Master
Gaze Control as Future Automotive Application
Scenario
It has long been a dream in the automotive industry to lessen the workload on car drivers by replacing some of the manual input (e.g. when operating the centre console) with touchless interaction such as gaze control. Nowadays, a multitude of eye-tracking systems offer the possibility of realising what seemed impossible only a few years ago. Mobile or head-mounted trackers might be used either across a wide field to operate several components, or in a narrower field to control only a single component such as the head-up display. Menu selection, adjusting the air conditioning, and the like will then be possible while the hands remain free to control the steering wheel.
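Gaze-based menu selection is commonly triggered by dwell time, so that merely glancing at a control does not activate it (the so-called Midas-touch problem). A minimal sketch with assumed data shapes and thresholds:

```python
def dwell_select(gaze_log, targets, dwell_ms=800):
    """Return the first target the gaze rests on for at least dwell_ms.
    gaze_log: list of (timestamp_ms, target_id or None) gaze samples;
    targets: the set of selectable component ids. Names are illustrative."""
    current, since = None, None
    for t, target in gaze_log:
        if target != current:
            current, since = target, t      # gaze moved to a new target
        elif current in targets and t - since >= dwell_ms:
            return current                  # dwelled long enough: select
    return None

# Gaze passes over the AC control briefly, then rests on the radio.
log = [(0, "AC"), (200, "AC"), (500, "radio"), (700, "radio"), (1600, "radio")]
print(dwell_select(log, {"AC", "radio"}))  # 'radio'
```

In the demonstrator cockpit, the dwell threshold would need tuning so that selections are fast enough to be useful but do not fire while the driver is merely scanning the console.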
Project Goal
This project has two main goals, to be worked on both theoretically and practically: 1) to research available eye-tracking solutions and evaluate their advantages and disadvantages, including the integration of the most suitable solution into a simple demonstrator cockpit by finding the optimal tracker position; 2) to develop the two usage scenarios mentioned above (wide and narrow fields of interaction), showing how eye tracking and gaze control may be used in automotive applications.
Task
- Internet, industry, and literature research on eye-tracking systems (seminar presentation & paper)
- Integration of the eye tracker into a prototypical automotive environment, evaluation of the optimal position & usage scenarios, development of a test application (project presentation & paper)
- Evaluation of the test set-up and test application, analysis and implications for future work (thesis & thesis defence)
Contact
Thomas Kälberer, Svenja Leifert
Rooms: PZ 908, PZ 904
[email protected]
[email protected]