VR-Together

Funding Organization: European Commission
Funding Programme: Horizon H2020
Funding Instrument: Innovation Action
Start Date:
Duration: 36 months
Total Budget: 3,929,937 EUR
ITI Budget: 509,000 EUR

An end-to-end system for the production and delivery of photorealistic and social virtual reality experiences (2017: H2020 IA)

VR-Together will offer ground-breaking virtual reality experiences based on social, photorealistic immersive content that can be experienced together with friends, and will demonstrate its use for domestic VR consumption. For this purpose, it will develop and assemble an end-to-end pipeline integrating state-of-the-art technologies and off-the-shelf components. Immersive media production and delivery will be achieved through innovative capture, encoding, delivery and rendering technologies.

The challenge of VR-Together is to create truly social, photorealistic virtual reality experiences in a cost-effective manner. Social VR experiences will be delivered through the orchestration of innovative media formats (video, blended video, point-cloud representations and 3D mesh interpolation). The production and delivery of such experiences will be demonstrated and analysed through dedicated real-world trials. Furthermore, the scalability of the approach to producing and delivering immersive content will also be demonstrated, with a large community of users consuming the content in everyday living rooms, using off-the-shelf (OTS) hardware components and scalable cloud services.

Finally, the project will introduce new methods for social VR evaluation and quantitative platform benchmarking for both live and interactive content production, thus providing production and delivery solutions with significant commercial value.

Publications:
  1. S. Thermos, G. T. Papadopoulos, P. Daras and G. Potamianos, "Deep Sensorimotor Learning for RGB-D Object Recognition", Computer Vision and Image Understanding (2020), 190, 102844. DOI: https://doi.org/10.1016/j.cviu.2019.102844
  2. V. Sterzentsenko, L. Saroglou, A. Chatzitofis, S. Thermos, N. Zioulis, A. Doumanoglou, D. Zarpalas, P. Daras, "Self-Supervised Deep Depth Denoising", In Proceedings of the International Conference on Computer Vision, Seoul, Republic of Korea, October 27 - November 2, 2019.
  3. K. Christaki, K. Apostolakis, A. Doumanoglou, N. Zioulis, D. Zarpalas, P. Daras, "Space Wars: An AugmentedVR Game", 25th International Conference on MultiMedia Modeling (MMM), Thessaloniki, Greece, January 8-11, 2019.
  4. K. Christaki, E. Christakis, P. Drakoulis, A. Doumanoglou, N. Zioulis, D. Zarpalas, P. Daras, "Subjective Visual Quality Assessment of Immersive 3D Media Compressed by Open-Source Static 3D Mesh Codecs", 25th International Conference on MultiMedia Modeling (MMM), Thessaloniki, Greece, January 8-11, 2019.
  5. A. Chatzitofis, D. Zarpalas, S. Kollias, P. Daras, "DeepMoCap: Deep Optical Motion Capture Using Multiple Depth Sensors and Retro-Reflectors", Sensors (2019), 19(2), 282, Special Issue: Depth Sensors and 3D Vision. DOI: https://doi.org/10.3390/s19020282
  6. A. Karakottas, A. Papachristou, A. Doumanoglou, N. Zioulis, D. Zarpalas, P. Daras, "Augmented VR", IEEE VR, Reutlingen, Germany, March 18-22, 2018, https://www.youtube.com/watch?v=7O_TrhtmP5Q
Contact:
Dr. Petros Daras
Visit the website: vrtogether.eu
Visual Computing Lab

The focus of the Visual Computing Laboratory is to develop new algorithms and architectures for applications in the areas of 3D processing, image/video processing, computer vision, pattern recognition, bioinformatics and medical imaging.

Contact Information

Dr. Petros Daras, Principal Researcher Grade A
1st km Thermi – Panorama, 57001, Thessaloniki, Greece
P.O.Box: 60361
Tel.: +30 2310 464160 (ext. 156)
Fax: +30 2310 464164
Email: daras@iti.gr