UX Studies Support

We have knowledge and experience in conducting and designing usability studies.

Usability study demonstration.

Mobile UX Studies

To conduct usability studies of mobile applications, we have developed a proprietary solution that supports both qualitative and quantitative analysis.

Using a glasses-mounted eye-tracker and an electrodermal activity (EDA; also known as "galvanic skin response", GSR) sensor, we can examine application use in near-natural conditions. We simultaneously record the point of gaze and the galvanic skin response, which allows us to determine the subject's emotional arousal during task execution.

The result of the study is footage showing the focus point of the subject's gaze at each moment, overlaid on a screencast of the smartphone screen and synchronized with the EDA signal. The EDA signal is presented as a z-score: an easy-to-read graph showing the strength of a given person's galvanic skin response throughout the experiment.
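For illustration, the z-score presentation mentioned above amounts to standardizing the recorded signal against its own mean and standard deviation. A minimal sketch (real EDA pipelines typically also filter the raw signal first; this only shows the normalization step):

```python
import numpy as np

def eda_z_score(signal):
    """Standardize a recorded EDA signal to z-scores: (x - mean) / std.

    Values above 0 indicate stronger-than-average skin response for
    this person, values below 0 weaker-than-average, which makes
    different subjects comparable on one graph.
    """
    signal = np.asarray(signal, dtype=float)
    return (signal - signal.mean()) / signal.std()
```

Because each subject's signal is normalized against their own baseline, the resulting curve is directly comparable across participants.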

Presentation of a usability study of a mobile application using eye-tracking and galvanic skin response.

Unlike other solutions, we do not mount the phone on a stand, so the subject can use the device freely. We can carry out the study in any location: a public place, a commercial establishment, or a selected facility visited by customers.


Together with scientific partners, we conduct research in which, in addition to recording data from biosensors (eye-tracker, EDA, EEG, etc.), we develop proprietary software for analyzing research results using image processing.

Recording an eye-tracking study outdoors.

An example of such software is a system for automating eye-tracking data mapping, which enables quantitative analysis of studies conducted in natural environments.

Mapping video frames to a static reference image.
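The frame-to-reference mapping shown above is commonly modelled as a homography between points matched in a video frame and the static reference image. A minimal sketch of the idea, using the classic DLT algorithm in plain NumPy (a production system would find and match feature points automatically, e.g. with OpenCV; the four hand-picked correspondences here are an assumption for illustration):

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 homography mapping src points to dst points
    via the Direct Linear Transform (DLT): build two equations per
    correspondence and take the null space of the stacked system."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the right singular vector for the smallest
    # singular value, reshaped to 3x3 and normalized.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def map_point(H, point):
    """Project a 2D point (e.g. a gaze coordinate in a video frame)
    through homography H into reference-image coordinates."""
    x, y = point
    p = H @ np.array([x, y, 1.0])
    return p[:2] / p[2]
```

Once the homography for a frame is known, every recorded gaze point in that frame can be projected onto the reference image, which is what makes quantitative aggregation across frames possible.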

Client Behaviour Analysis

We have experience in automatic processing of video data using "classical" image-processing methods as well as neural networks for detecting and tracking moving objects.

Thanks to advanced predictive models of neural networks, we can profile customers by their demographic structure (i.e., gender and age) and verify their interest in a selected object (e.g., a commercial stand).

Person detection with prediction of age, gender, and focus of attention on MOTS20 material.

Person detection with heat map on CAVIAR material.

Social Networks Analysis

Based on modern machine learning methods, such as deep neural networks, we analyze the content of photos posted on social networks. We cross-reference the information obtained from predictive models with the content of post descriptions and hashtags.

An example of object detection in a photograph (source: Instagram)
Graph of tags related to #rowery (the Polish equivalent of #bikes)

Mask Detection

Using neural networks, we can detect people wearing a face mask, whether the data are sets of photos or video recordings. This allows us to statistically analyze what proportion of visitors wear masks at a given time, when characteristic changes in the trend occur, and so on.
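The trend statistics described above reduce to aggregating per-frame detections into time intervals. A hypothetical sketch of that aggregation step (the `(timestamp, has_mask)` input format is an assumption for illustration; in practice it would come from the detector's output):

```python
from collections import Counter

def mask_proportion_by_interval(detections, interval_s=60):
    """Aggregate per-person mask detections into time buckets.

    detections: iterable of (timestamp_seconds, has_mask) pairs,
    one per detected person. Returns a dict mapping the bucket
    index (timestamp // interval_s) to the fraction of people
    wearing a mask in that interval.
    """
    totals, masked = Counter(), Counter()
    for t, has_mask in detections:
        bucket = int(t // interval_s)
        totals[bucket] += 1
        if has_mask:
            masked[bucket] += 1
    return {b: masked[b] / totals[b] for b in sorted(totals)}
```

Plotting these per-interval fractions over time is what reveals the characteristic changes in the trend.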

Face detection with and without masks (in parentheses, the algorithm's prediction confidence on a scale of 0 to 1).
Mask detection in the recording. (Additional Footage Provided By Videezy!)

Object Detection

Using neural networks, we can detect various kinds of objects in photos, videos, and even satellite imagery.

PosEmo – AI engagement and emotional valence

PosEmo is an attempt to better understand what people think and how they feel – implemented on a larger scale than ever before, using state-of-the-art artificial intelligence methods. The solution is based on technology that is already widely available – a simple web camera. It allows you to track the interest and assess the attitude of the person in front of the camera in real time!