Outdoor Action/Intention Recognition Dataset (RASberry)

The dataset was recorded outdoors, on grassland, under varying lighting conditions (sunny, cloudy, morning to afternoon) and at distances ranging from 5m to 50m, at 5m intervals. Recording at different distances allows us to determine the performance of sensors and algorithms over the interaction range that the robot will encounter in the field.

We recorded 10 actors, each performing every activity once at each distance. Behaviours were performed from the front, back, and side to provide basic coverage of different orientations.

The gestures were chosen for their relevance to basic human-robot communication; the activities were chosen as a sample of behaviours displayed by human fruit pickers.

The following list gives a short overview of dataset features:

  • Distances: 5m to 50m at 5m intervals
  • Actors: 10 different actors, recorded individually
  • Sensors: ZED stereo camera (RGB video and depth video), Optris thermal camera (thermal video), Velodyne VLP-16 (3D point clouds)
  • Gestures: Waving, beckoning, indicating to stop, shooing, thumb up, thumb down, lower arm up, lower arm down, pointing
  • Activities: Walking*, turning*, crouching down, standing up; with and without crate in hands
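
The exact file layout of the release is not documented here, so the sketch below is only an illustration of the actor × distance × sensor structure implied by the list above; DATA_ROOT, the directory names, and the sensor labels are hypothetical placeholders, not the actual release layout.

```python
from itertools import product
from pathlib import Path

# Hypothetical layout: the real directory structure and naming of the
# release may differ; adjust these placeholders accordingly.
DATA_ROOT = Path("rasberry_dataset")
DISTANCES_M = range(5, 55, 5)  # 5m to 50m at 5m intervals
ACTORS = [f"actor{i:02d}" for i in range(1, 11)]  # 10 actors
SENSORS = ["zed_rgb", "zed_depth", "optris_thermal", "velodyne_vlp16"]

def recording_dirs():
    """Yield one (actor, distance, sensor, path) tuple per expected recording."""
    for actor, dist, sensor in product(ACTORS, DISTANCES_M, SENSORS):
        yield actor, dist, sensor, DATA_ROOT / actor / f"{dist:02d}m" / sensor

# Example: report how many expected recording directories are present on disk.
present = sum(1 for _, _, _, path in recording_dirs() if path.exists())
total = len(ACTORS) * len(DISTANCES_M) * len(SENSORS)
print(f"{present} of {total} recording directories found")
```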

Due to personal data protection considerations, please contact us to request access to the dataset:

  • Alexander Gabriel: agabriel@lincoln.ac.uk
  • Paul Baxter: pbaxter@lincoln.ac.uk


Supported by the RASberry project: https://rasberryproject.com/