This work showcases the bulk of our research towards immersive user interaction within a networked virtual environment. Presented in the context of an online collaborative virtual dance masterclass application, the milestones achieved towards seamlessly blending real and virtual information include: automatic, real-time online reconstruction of users within the shared virtual scene; virtual character embodiment and full-body control via skeleton tracking; and environmental reaction and feedback to user interventions in the form of a dance-routine evaluation process. Along these lines, this research has also addressed real-time rendering of 3D content with standard web 3D graphics technology, capturing and recording RGB-D data over the web, auto-stereoscopic post-processing effects for glasses-free 3D on specialized displays, and spatial 3D sound. With the integrated demo developed for the 3DLife project, we demonstrate how research in the aforementioned fields has matured and how, combined with consumer-grade hardware such as the Microsoft Kinect sensor, it can enable widespread deployment of web-based 3D Tele-Immersion systems.