L-CAS Thermal Physiological Monitoring Dataset

Collected by the Optris PI-450 thermal imager

Description

This dataset was recorded to evaluate thermal-based physiological monitoring algorithms that measure respiration and heart rate. It contains thermal images of the faces of different people, acquired in the Lincoln Centre for Autonomous Systems (L-CAS) at the University of Lincoln, UK. Data were recorded into separate rosbag files, one per person. The thermal camera recorded each person for two minutes at 27 Hz. Participants were asked to remain still for the first minute, then to move their head up and down, forward and back, and turning right and left, holding each action for 10 seconds.

Contributions

This dataset provides:

  1. Robot Operating System (ROS) rosbags. Each rosbag contains about 3,000 continuous thermal images (see the reading sketch below).
  2. Ground truth for respiration and heart rate.
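
The rosbags can be read offline with the standard ROS C++ rosbag API. The following is only a minimal sketch, not part of the dataset distribution: the bag file name and the thermal image topic name (/optris/thermal_image) are assumptions, since the actual topic names are not listed here.

// Minimal sketch: iterate the thermal image messages stored in one rosbag.
// Assumptions: bag file name and topic "/optris/thermal_image" are hypothetical.
#include <rosbag/bag.h>
#include <rosbag/view.h>
#include <sensor_msgs/Image.h>

int main(int argc, char** argv)
{
  rosbag::Bag bag;
  bag.open("person_01.bag", rosbag::bagmode::Read);  // hypothetical file name

  rosbag::View view(bag, rosbag::TopicQuery("/optris/thermal_image"));
  for (const rosbag::MessageInstance& m : view)
  {
    sensor_msgs::Image::ConstPtr img = m.instantiate<sensor_msgs::Image>();
    if (img)
    {
      // img->data holds the raw 16-bit pixel values; see the temperature
      // conversion under "Recording platform" below.
    }
  }
  bag.close();
  return 0;
}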

 

Recording platform

 

The Optris PI-450 thermal imager was mounted on top of a Kompaï robot, at 1.3 m from the floor; a Kinect was mounted at 1.1 m. The distance between the robot and the face was about 1.3 m. Thermal images were recorded using the official Optris driver. The raw 16-bit pixel values encode temperature (in °C) as:

float t = (float)(data - 1000) / 10.f;
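
As a minimal sketch, assuming data is the raw 16-bit pixel value read from an image message, the conversion can be wrapped in a small helper (the function name is illustrative):

#include <cstdint>

// Convert a raw 16-bit Optris pixel value to a temperature in degrees Celsius,
// following the encoding above: raw = T * 10 + 1000.
float raw_to_celsius(uint16_t data)
{
  return (float)(data - 1000) / 10.f;
}

// Example: a raw value of 1365 corresponds to 36.5 degrees Celsius.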

Optris PI-450 thermal imager parameters
Optical resolution: 382 x 288 pixels
Frame rate: 10 Hz
Measurement range: 20 – 100 °C
Optics field of view: 38° x 29° (f = 15 mm)

Download

Since the dataset is large, please send an email to scosar@lincoln.ac.uk for instructions on how to download it.

Software

The physiological monitoring software developed by L-CAS is available on GitHub as a ROS node implementation.

License

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.
Copyright (c) 2016 Zhi Yan, Serhan Cosar, and Nicola Bellotto.

Funding

This work was funded in part by the EU Horizon 2020 project ENRICHME, H2020-ICT-2014-1, Grant agreement no.: 643691.