Detecting changes such as moved, removed or new objects is essential for numerous indoor robotics applications such as tidying up, patrolling and fetch-and-carry tasks.

We provide a dataset for evaluating methods that detect changed objects by comparing two recordings of the same environment captured at different times. Based on the labeled ground-truth objects, it is possible to differentiate between static, moved, removed and novel objects.
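The four categories can be illustrated with a minimal sketch: given object centroids from two recordings expressed in a common room frame, each object is labeled static, moved, removed or novel. The data layout and the distance threshold here are illustrative assumptions, not part of the dataset format.

```python
import math

def classify_changes(ref, cur, move_thresh=0.05):
    """Label objects as static/moved/removed/novel between two recordings.

    ref, cur: dicts mapping object name -> (x, y, z) centroid in a common
    room frame. move_thresh (metres) is an illustrative threshold for
    deciding whether an object was re-arranged.
    """
    labels = {}
    for name, pos in ref.items():
        if name not in cur:
            labels[name] = "removed"
        elif math.dist(pos, cur[name]) <= move_thresh:
            labels[name] = "static"
        else:
            labels[name] = "moved"
    for name in cur:
        if name not in ref:
            labels[name] = "novel"
    return labels

run1 = {"mustard": (0.1, 0.2, 0.8), "banana": (0.5, 0.4, 0.8)}
run2 = {"mustard": (0.1, 0.2, 0.8), "banana": (0.9, 0.1, 0.8),
        "mug": (0.3, 0.3, 0.8)}
print(classify_changes(run1, run2))
# {'mustard': 'static', 'banana': 'moved', 'mug': 'novel'}
```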


Dataset Description

The dataset was recorded with an Asus Xtion PRO Live camera mounted on the HSR robot. We provide scenes from five different rooms or parts of rooms: a big room, a small room, a living area, a kitchen counter and an office desk.
Each room is visited by the robot at least five times, and between runs a subset of objects from the YCB Object and Model Set (YCB) [1] is re-arranged in the room. In total we generated 26 recordings.
In each recording, between 3 and 17 objects are placed (219 object placements in total).
Furthermore, furniture and permanent background objects are slightly rearranged between runs. These changes are not labeled because they are not relevant for most service robot tasks.

Assuming that most objects are placed on horizontal surfaces, we extracted planes in each room in a pre-processing step (excluding the floor). For each surface, all frames of a recording in which the surface is visible are extracted and used as input for ElasticFusion [2]. This results in a total of 34 reconstructed surfaces.
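The pre-processing itself is not part of the release, but horizontal-plane extraction of this kind is commonly done with RANSAC. The following pure-Python sketch fits a single dominant plane to a point cloud; parameter values are illustrative, not those used for the dataset.

```python
import random

def fit_plane_ransac(points, dist_thresh=0.01, iters=200, seed=0):
    """Fit a plane a*x + b*y + c*z + d = 0 to 3-D points with RANSAC."""
    rng = random.Random(seed)
    best_plane, best_inliers = None, []
    for _ in range(iters):
        p1, p2, p3 = rng.sample(points, 3)
        # Plane normal = (p2 - p1) x (p3 - p1)
        u = [p2[i] - p1[i] for i in range(3)]
        v = [p3[i] - p1[i] for i in range(3)]
        n = [u[1] * v[2] - u[2] * v[1],
             u[2] * v[0] - u[0] * v[2],
             u[0] * v[1] - u[1] * v[0]]
        norm = sum(c * c for c in n) ** 0.5
        if norm < 1e-9:          # degenerate (collinear) sample
            continue
        a, b, c = (comp / norm for comp in n)
        d = -(a * p1[0] + b * p1[1] + c * p1[2])
        inliers = [p for p in points
                   if abs(a * p[0] + b * p[1] + c * p[2] + d) < dist_thresh]
        if len(inliers) > len(best_inliers):
            best_plane, best_inliers = (a, b, c, d), inliers
    return best_plane, best_inliers

# Points on a table surface at z = 0.7 m, plus a few off-plane outliers
pts = [(x * 0.1, y * 0.1, 0.7) for x in range(5) for y in range(5)]
pts += [(0.2, 0.2, 1.5), (0.4, 0.1, 0.2), (0.0, 0.3, 1.0)]
plane, inliers = fit_plane_ransac(pts)
```

In a real pipeline this would be run repeatedly, removing inliers after each fit, and near-vertical planes (walls) would be rejected by thresholding the normal's z component.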

We provide pointwise annotation of the YCB objects for each surface reconstruction from each recording.

The following images show examples of plane reconstructions.



Dataset Structure

- scene2
  - planes
    - 0
      - merged_plane_clouds_ds002.pcd
      - merged_plane_clouds_ds002.anno
      - merged_plane_clouds_ds002_GT.anno
    - 1
      - merged_plane_clouds_ds002.pcd
      - merged_plane_clouds_ds002.anno
      - merged_plane_clouds_ds002_GT.anno 
    - ...
- scene3

For each plane, the files contain the following:

- merged_plane_clouds_ds002.pcd – the point cloud reconstruction of the surface.
- merged_plane_clouds_ds002.anno – lists the YCB objects visible in the reconstruction.
- merged_plane_clouds_ds002_GT.anno – contains, for each YCB object, its name and the point indices of the reconstruction belonging to it. The last element for each object is a boolean flag indicating whether the object is on the floor (and was reconstructed by chance).
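The GT annotation can be read with a short parser sketch. The exact line layout assumed here (one whitespace-separated record per object: name, point indices, on-floor flag) is inferred from the description above and should be verified against an actual file.

```python
def parse_gt_anno(text):
    """Parse a *_GT.anno file into {object_name: {indices, on_floor}}.

    Assumed layout (verify against a real file):
        <object_name> <idx_0> <idx_1> ... <idx_n> <on_floor 0/1>
    """
    objects = {}
    for line in text.splitlines():
        fields = line.split()
        if len(fields) < 3:      # skip blank or malformed lines
            continue
        name = fields[0]
        on_floor = bool(int(fields[-1]))
        indices = [int(f) for f in fields[1:-1]]
        objects[name] = {"indices": indices, "on_floor": on_floor}
    return objects

sample = "006_mustard_bottle 17 18 19 42 0\n011_banana 3 4 5 1\n"
objs = parse_gt_anno(sample)
# objs["011_banana"] -> {'indices': [3, 4, 5], 'on_floor': True}
```

The returned indices can then be used to select the labeled object points from the point cloud loaded from the corresponding .pcd file.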

The file table.txt lists, for each detected plane, its centroid, height, convex hull points and plane coefficients.


You may also be interested in the Object Change Detection Dataset of Indoor Environments. It uses the same input data, but its ground truth annotation is based on a full room reconstruction instead of individual planes.




Research paper

If you find our dataset useful, please cite the following paper:

@article{langer_where_does_it_belong,
  title={Where Does It Belong? Autonomous Object Mapping in Open-World Settings},
  author={Langer, Edith and Patten, Timothy and Vincze, Markus},
  journal={Frontiers in Robotics and AI}
}



For any questions or issues with the dataset, feel free to contact the author:

  • Edith Langer – email: langer@acin.tuwien.ac.at



[1] B. Calli, A. Singh, J. Bruce, A. Walsman, K. Konolige, S. Srinivasa, P. Abbeel and A. M. Dollar, Yale-CMU-Berkeley dataset for robotic manipulation research, The International Journal of Robotics Research, vol. 36, no. 3, pp. 261–268, April 2017.

[2] T. Whelan, S. Leutenegger, R. Salas-Moreno, B. Glocker and A. Davison, ElasticFusion: Dense SLAM without a pose graph, Proceedings of Robotics: Science and Systems, July 2015.