QMB Morning Dataset
The proliferation of wearable cameras has fuelled a surge of research on the analysis of egocentric videos. This relatively new field still has few research datasets, and most publicly available egocentric datasets were acquired for activity recognition, video summarization, object detection, or behavioural understanding. Here we share a dataset focused instead on personal localization and mapping.
We collected our dataset on the university campus, documenting a user’s typical morning. Each recording session begins when the user enters at the building entrance. To avoid capturing bystanders, and the privacy issues this would raise, recording took place exclusively indoors and before working hours. The five sessions span a one-month period. During recording, the user followed a loosely worded script to ensure multiple visits to a range of locations. In total, the sessions cover nine distinct stations: Entrance, 3D-lab, Kitchen 1, Kitchen 2, Cafe-area, Lab, Printer 1, Printer 2 and Office.
The dataset includes:
- RGB 1080p video (without audio) at 25 Hz
- Visit segmentation with annotated stations in ELAN (*.eaf) format
The QMB Morning dataset is licensed under a Creative Commons Attribution 4.0 International licence (CC BY 4.0). In addition, any academic publication or research report describing research that uses this dataset must cite the following publication:
Suveges, T. and McKenna, S.J., 2020. Egomap: Hierarchical First-person Semantic Mapping. 8pp. Oral presentation at the 2nd Workshop on Applications of Egocentric Vision (EgoApp), in conjunction with the International Conference on Pattern Recognition (ICPR).
- Install the ELAN annotation tool to open *.eaf files.
- Dataset videos and the ground-truth annotations: 01, 02, 03, 04, 05
- Scripts for recording: scripts.pdf
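The visit annotations above are stored as ELAN *.eaf files, which are plain XML: a `TIME_ORDER` block of time slots (in milliseconds) and tiers of alignable annotations that reference those slots. If you prefer to process the ground truth programmatically rather than through the ELAN GUI, a minimal sketch like the following can extract the station visit intervals. (The function name and the assumption that visits appear as `ALIGNABLE_ANNOTATION` elements with station names in `ANNOTATION_VALUE` are ours; check the tier layout of the actual files and adapt as needed.)

```python
# Minimal sketch: read station visit intervals from an ELAN .eaf file.
# Assumes visits are stored as ALIGNABLE_ANNOTATIONs whose ANNOTATION_VALUE
# holds the station label; adapt to the tiers used in the real annotations.
import xml.etree.ElementTree as ET


def read_visits(eaf_path):
    """Return a list of (station, start_s, end_s), sorted by start time."""
    root = ET.parse(eaf_path).getroot()
    # Map TIME_SLOT ids to times in seconds (EAF stores milliseconds).
    slots = {ts.get("TIME_SLOT_ID"): int(ts.get("TIME_VALUE", "0")) / 1000.0
             for ts in root.iter("TIME_SLOT")}
    visits = []
    for tier in root.iter("TIER"):
        for ann in tier.iter("ALIGNABLE_ANNOTATION"):
            label = ann.findtext("ANNOTATION_VALUE", default="")
            start = slots[ann.get("TIME_SLOT_REF1")]
            end = slots[ann.get("TIME_SLOT_REF2")]
            visits.append((label, start, end))
    return sorted(visits, key=lambda v: v[1])
```

Summing the durations per label should then reproduce the per-station visit times tabulated below.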
| Station | Entrance | 3D-lab | Kitchen 1 | Cafe-area | Lab | Printer 1 | Kitchen 2 | Printer 2 | Office |
|---|---|---|---|---|---|---|---|---|---|
| Total visit time (mm:ss) | 01:00 | 37:20 | 38:32 | 28:58 | 47:26 | 03:00 | 01:23 | 04:04 | 02:37 |
- Suveges, T. and McKenna, S.J., 2020. Egomap: Hierarchical First-person Semantic Mapping. 8pp. Oral presentation at the 2nd Workshop on Applications of Egocentric Vision (EgoApp), in conjunction with the International Conference on Pattern Recognition (ICPR).
This research received funding from the UK EPSRC project (EP/N014278/1) titled ACE-LP: Augmenting Communication using Environmental Data to drive Language Prediction.
Special thanks to Max Wilson for scripting and recording the videos.
For comments, suggestions or feedback, or if you experience any problems with this website or the dataset, please contact Stephen McKenna.