The 360 dataset provides 360° color images of indoor scenes along with their corresponding ground truth depth annotations. It is composed of renders of other publicly available textured 3D datasets of indoor scenes: two Computer Generated (CG) datasets, SunCG and SceneNet, and two realistic ones acquired by scanning indoor buildings, Stanford2D3D and Matterport3D. The 360° renders are produced with a path-tracing renderer, placing a spherical camera and a uniform light source at the same position in the scene.
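Since the panoramas are captured by a spherical camera, each depth map is an equirectangular image whose pixels map to longitude/latitude directions. The sketch below, a minimal NumPy example, shows how such a depth map could be back-projected to a 3D point cloud; the assumption that depth stores radial distance from the camera center (rather than, say, a planar z-value) is hypothetical, so check the dataset's depth convention before relying on it.

```python
import numpy as np

def equirect_depth_to_points(depth):
    """Back-project an equirectangular depth map of shape (H, W) to 3D points.

    Assumes a full 360x180 degree panorama and that each pixel stores the
    radial distance from the spherical camera center (an assumption; verify
    against the dataset's actual depth convention).
    """
    h, w = depth.shape
    # Pixel-center angles: longitude in [-pi, pi), latitude from +pi/2 (top)
    # down to -pi/2 (bottom).
    lon = (np.arange(w) + 0.5) / w * 2.0 * np.pi - np.pi
    lat = np.pi / 2.0 - (np.arange(h) + 0.5) / h * np.pi
    lon, lat = np.meshgrid(lon, lat)
    # Spherical-to-Cartesian conversion scaled by the radial depth.
    x = depth * np.cos(lat) * np.sin(lon)
    y = depth * np.sin(lat)
    z = depth * np.cos(lat) * np.cos(lon)
    return np.stack([x, y, z], axis=-1)

# Synthetic check: a constant depth of 1 should yield points on a unit sphere.
depth = np.ones((256, 512), dtype=np.float32)
points = equirect_depth_to_points(depth)
print(points.shape)  # (256, 512, 3)
```

For real use, the depth map would be loaded from the dataset's files instead of being synthesized; the image dimensions above are illustrative, not the dataset's native resolution.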
More information can be found at http://vcl.iti.gr/360-dataset/
N. Zioulis, A. Karakottas, D. Zarpalas, and P. Daras, "OmniDepth: Dense Depth Estimation for Indoors Spherical Panoramas," in European Conference on Computer Vision (ECCV), Munich, Germany, September 8–14, 2018.