What if you, being here, could be a virtual presence anywhere?
Project Anywhere is an attempt to breach the limits of physical human presence in space by replacing kinaesthetic, visual, and auditory perception with artificial sensory experiences, in a fully interactive virtual environment.
Blending aspects of both actuality and virtuality, the project creates the first vivid presence in an augmented-reality space. The ubiquity of processors and the advent of cloud computing have made possible the decentralised aggregation of independent node systems in a common network, bringing together people from the most remote places to coexist in the same virtual space. Project Anywhere furthermore presents the potential of the digital as a final spatial, architectural object in itself, rather than as an intermediary form or interface, while concurrently demonstrating the capacity to design such spaces with an embedded dimension of time.

The project's title, "Anywhere", refers to the generic nature of the digital, and thus of the virtual environment that can be produced. The digital landscape, or meeting place, can be made to simulate any physical context: real or imaginary, related or not to the subjects' physical surroundings.
The project's aim is to experiment with heretical notions of space and time [heterotopy, heterochrony], and with the sensory experiences that can be produced by replacing actual senses with designed ones.
On the technical side, the core of the project is a mobile application. It functions both as a node of the decentralized network and as each player's portal into the virtual environment. Mounted in a mask, the phone provides a stereoscopic viewport for the subject, while simultaneously receiving, synchronizing, and processing an array of data from a cloud service in real time. Subjects can join from any device connected to the internet.
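The synchronization each node performs could be sketched as follows. This is a minimal illustration under assumptions, not the project's actual code: the names (`WorldState`, `apply_update`) are hypothetical, and a real node would receive these updates over the network and render them, whereas here the merge logic is shown in isolation.

```python
# Hypothetical sketch: each node keeps a local copy of the shared world
# state and merges pose updates streamed from the cloud. A last-writer-wins
# rule per subject keeps all nodes converging on the same state.

class WorldState:
    """Latest known pose of every subject, keyed by subject id."""

    def __init__(self):
        self.subjects = {}  # subject id -> (timestamp, pose)

    def apply_update(self, subject_id, timestamp, pose):
        """Keep only the newest update per subject; drop stale packets."""
        current = self.subjects.get(subject_id)
        if current is None or timestamp > current[0]:
            self.subjects[subject_id] = (timestamp, pose)

world = WorldState()
world.apply_update("alice", 1.0, {"head": (0.0, 1.6, 0.0)})
world.apply_update("alice", 0.5, {"head": (0.0, 1.5, 0.0)})  # stale, ignored
print(world.subjects["alice"][1]["head"])  # → (0.0, 1.6, 0.0)
```

Discarding out-of-order packets rather than reordering them is a common choice for real-time pose streams, where only the freshest sample matters.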
In parallel, a sensor layer quantifies physical human presence in space. Omnitracker, software developed for this purpose, performs wireless, real-time tracking of the body skeleton and hand gestures, capturing a total of 83 degrees of freedom from a local sensor layer that includes infrared space-scanning sensors and the Inteliglove system.
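One way to represent such a tracked frame is as a flat vector of floats, one per degree of freedom, with a channel map naming each entry. The sketch below is an illustrative assumption: the joint names, the three rotation axes per joint, and the helper `make_channel_map` are invented for this example, and the project does not publish its actual 83-DOF breakdown.

```python
# Hypothetical sketch of a tracked frame: one float per degree of
# freedom, addressed by a generated channel name such as "elbow_l.rx".

def make_channel_map(joints, dofs_per_joint=3):
    """Build an ordered list of channel names, one per degree of freedom."""
    axes = ["rx", "ry", "rz"]
    return [f"{joint}.{axis}" for joint in joints
            for axis in axes[:dofs_per_joint]]

channels = make_channel_map(["head", "neck", "elbow_l"])
frame = dict.fromkeys(channels, 0.0)
frame["head.ry"] = 15.0  # e.g. subject turns their head 15 degrees
print(len(channels), frame["head.ry"])  # → 9 15.0
```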
These 83 degrees of freedom, together with 3 more from the subject's phone, are fused to animate a digital avatar. Any movement of the subject in physical space, such as a slight head tilt, a finger movement, or walking, corresponds to a correlated movement of the avatar in a 1:1 relationship in virtual space, in real time. Furthermore, hand gestures can be programmed to perform a variety of actions in the virtual context, proving that a virtual presence can also be an active one.
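The programmable gestures could be wired up with a simple dispatch table mapping a recognized gesture name to an action in the virtual environment. The gesture names and actions below are illustrative assumptions, not the project's actual vocabulary:

```python
# Hypothetical sketch: registering actions for recognized hand gestures.

actions = {}

def on_gesture(name):
    """Decorator that registers a callback for a named gesture."""
    def register(fn):
        actions[name] = fn
        return fn
    return register

@on_gesture("pinch")
def grab_object():
    return "grabbed"

def handle(gesture):
    """Dispatch a recognized gesture; unknown gestures are ignored."""
    handler = actions.get(gesture)
    return handler() if handler else None

print(handle("pinch"))  # → grabbed
```

A table like this keeps the recognizer decoupled from the virtual environment: new gestures can be bound to new actions without touching the tracking layer.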
Reuters / Video interview / Project Anywhere takes virtual reality gaming to new level, 2/2015
Concept and development: Constantinos Miltiadis
Developed at the Chair for CAAD, Prof. Ludger Hovestadt, 2014