Multi-party TELE-IMMERSION (TI) refers to an emerging technology that supports realistic inter-personal communication, allowing remote users to share an activity. This is achieved by generating in real-time the actual 3D “replicants” (realistic 3D reconstructions) of multiple geographically distributed users and placing them inside a common virtual space, where users can interact with each other and with the virtual world. Distant users are able to see their peers’ replicants in 3D and feel like they are together in the same place. 3D Tele-immersion is a Future Internet application that could alter the way people communicate and interact, removing the barrier of physical location and distance and creating new pathways in the industries of entertainment, gaming, advertisement, broadcasting, health, learning, etc.
In real-time applications such as TI, both realistic replication of the users’ appearance and natural interaction among geographically remote users are required. This highlights the need for research towards both the real-time, realistic 3D reconstruction of users and the efficient compression of their 3D representations, in order to scale the interaction up to a large number of users and support such exciting applications.
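To give a concrete flavour of the kind of compression involved, the sketch below quantizes the floating-point vertex positions of a reconstructed mesh to 16-bit fixed-point coordinates inside their bounding box, halving the per-vertex payload relative to 32-bit floats. This is only a minimal illustrative example, not the codec developed in our work; all function names are hypothetical.

```python
import numpy as np

def quantize_vertices(verts, bits=16):
    """Quantize float vertex positions to fixed-point integers.

    Returns the integer grid coordinates plus the bounding-box origin
    and scale needed to invert the mapping on the receiver side.
    """
    lo, hi = verts.min(axis=0), verts.max(axis=0)
    scale = (2 ** bits - 1) / np.maximum(hi - lo, 1e-9)  # guard flat axes
    q = np.round((verts - lo) * scale).astype(np.uint16)
    return q, lo, scale

def dequantize_vertices(q, lo, scale):
    """Recover approximate float positions from the quantized grid."""
    return q.astype(np.float64) / scale + lo
```

The reconstruction error is bounded by half a quantization step per axis, which is typically well below the noise floor of consumer depth sensors.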
Tele-Immersion ski competition among users around Europe (Oct. 2014)
Our tele-immersion prototype allowed one indoor skier (on a ski simulator) in Greece, another indoor skier in Germany and an outdoor skier in Schladming, Austria to ski together and share their experience in real-time. The user in Greece was captured and reconstructed in 3D, so that the user in Germany was able to see the performance of his 3D “replicant” (either in a CAVE or in augmented-reality glasses).
BBC covered the 3DLive platform’s ski scenario during the project’s final experiments! “A prototype sports simulator which allows users to race against real skiers in real-time is being developed.” You can watch it here.
Tele-immersion 3D games: “SpaceWars” (May 2015)
“SpaceWars” is a multi-player on-line 3D game prototype, developed by our lab, where remote players participate via their 3D reconstructions (“replicants”) and play using their bodies. The players participate in a Capture-The-Flag race, battling each other via Fireballs!
Immersive-reality games: “Castle in the forest” (Sep. 2013)
“Castle in the Forest” is a 3D game prototype, developed by our lab, showing that merging the real with the virtual can provide an interesting and entertaining experience for the end user. The player is able to see her/his 3D “replicant” inside the game and interact with the virtual environment, a deep forest, with the goal of finding the path towards a castle. The player navigates inside the virtual world using special hand gestures.
The game was presented at the European Researchers’ Night (@NOESIS, 27th of September 2013), a mega event taking place every year on a single September night in about 300 cities all over Europe. More than 1000 kids and adults tried our applications! Read the story here.
Real-time 3D reconstruction
The problem of real-time, robust and realistic 3D reconstruction of humans from multiple cameras is an important and challenging task for TI applications. Although several accurate 3D reconstruction methods from passive RGB cameras can be found in the literature, they are not applicable to TI applications, since they require a processing time of several minutes per frame. Most state-of-the-art approaches that target real-time immersive-reality applications mainly address the problem of synthesizing intermediate views for given view-points, rather than generating a single complete 3D surface that can be rendered using standard computer graphics, thus enabling 360-degree Free-View-Point navigation. We work on 3D reconstruction using multiple consumer-grade RGB-Depth cameras, such as Microsoft Kinect sensors. The objective is the per-frame creation of accurate, realistic, full 3D reconstructions of moving humans, referred to as “replicants”, to be exploited in real-time applications.
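To illustrate the basic geometry behind multi-camera reconstruction, the sketch below back-projects calibrated depth maps from several sensors into a single world-space point cloud; a full pipeline would then mesh and texture-map the result. This is a minimal illustration under a pinhole camera model with known intrinsics and extrinsics, not the actual reconstruction method of the papers listed below; the function names are hypothetical.

```python
import numpy as np

def backproject(depth, K, T):
    """Back-project a depth map (in meters) to world-space 3D points.

    depth: (H, W) depth image; K: 3x3 pinhole intrinsics;
    T: 4x4 camera-to-world extrinsics from calibration.
    """
    H, W = depth.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    z = depth
    x = (u - K[0, 2]) * z / K[0, 0]   # pinhole model: x = (u - cx) * z / fx
    y = (v - K[1, 2]) * z / K[1, 1]   #                y = (v - cy) * z / fy
    pts = np.stack([x, y, z, np.ones_like(z)], axis=-1).reshape(-1, 4)
    pts = pts[pts[:, 2] > 0]          # drop invalid (zero-depth) pixels
    return (pts @ T.T)[:, :3]         # transform into the common world frame

def fuse(views):
    """Fuse the clouds of all sensors; views is a list of (depth, K, T)."""
    return np.concatenate([backproject(d, K, T) for d, K, T in views])
```

In practice the fused cloud must still be denoised, registered precisely and turned into a watertight textured surface, which is where the real research effort lies.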
D. Alexiadis, D. Zarpalas, P. Daras, “Real-time, full 3-D reconstruction of moving foreground objects from multiple consumer depth cameras”, IEEE Transactions on Multimedia, vol. 15, no. 2, pp. 339-358, Feb. 2013. IEEE Distinguished Paper, IEEE MMTC R-Letter, vol. 4, no. 3, June 2013.
A. Doumanoglou, D. Alexiadis, D. Zarpalas, P. Daras, “Towards Real-Time and Efficient Compression of Human Time-Varying-Meshes”, IEEE Transactions on Circuits and Systems for Video Technology, vol. 24, no. 12, Dec. 2014.
D. Alexiadis, G. Kordelas, K. Apostolakis, J. Agapito, J. Vegas, E. Izquierdo, P. Daras, “Reconstruction for 3D Immersive Virtual Worlds”, WIAMIS 2012: The 13th International Workshop on Image Analysis for Multimedia Interactive Services, Dublin City University, Ireland, 23-25 May 2012.
R. Mekuria, D. Alexiadis, P. Daras, P. Cesar, “Real-time encoding of live reconstructed mesh sequences for 3D Tele-Immersion”, IEEE International Conference on Multimedia and Expo (ICME 2013), San Jose, California, USA, 2013.
D. Alexiadis, D. Zarpalas, P. Daras, “Real-time, Realistic Full-body 3D Reconstruction and Texture Mapping from Multiple Kinects”, 11th IEEE IVMSP Workshop: 3D Image/Video Technologies and Applications, Yonsei University, Seoul, Korea, 10-12 June 2013. Best Paper Award.
D. Alexiadis, D. Zarpalas, P. Daras, “Fast and smooth 3D reconstruction using multiple RGB-Depth sensors”, IEEE International Conference on Visual Communications and Image Processing (VCIP 2014), Valletta, Malta, 7-10 Dec. 2014.
D. Alexiadis, A. Doumanoglou, D. Zarpalas, P. Daras, “A case study for tele-immersion communication applications: from 3D capturing to rendering”, IEEE International Conference on Visual Communications and Image Processing (VCIP 2014), Valletta, Malta, 7-10 Dec. 2014.
S. Crowle, A. Doumanoglou, B. Poussard, M. Boniface, D. Zarpalas, P. Daras, “Dynamic Adaptive Mesh Streaming for Real-time 3D Teleimmersion”, 20th International Conference on Web3D Technology, Heraklion, Crete, Greece, 18-21 June 2015.