Integrated Space
Summary
Integrated Space research explores the use of real-time computer vision techniques and a pair of standard computer cameras to provide 3D human body awareness in an inexpensive, immersive environment system.
Project Team
Faculty:
Alan Price, Department of Design
Student:
Paulo Gotardo, Department of Electrical Engineering
Project Description
Integrated Space research explores the use of real-time computer vision techniques and a pair of standard computer cameras to provide 3D human body awareness in an inexpensive, immersive environment system. This project combines stereo vision and stereo projection to allow both the user and the virtual scene to become aware of each other’s 3D presence as part of a single, integrated 3D space. The goal is to enhance the user experience of immersion in a virtual scene displayed on a 3D screen. The focus is on enabling authoring applications based on the direct manipulation of virtual objects, with users interacting from a first-person perspective. This emphasis contrasts with the avatar-based, mostly reactive focus often employed in the design of computer game interfaces.
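As a rough illustration of how a pair of standard cameras can recover 3D body information, the sketch below computes a disparity map from a rectified stereo pair with OpenCV's block matcher and reprojects it to 3D points. The file names, matcher parameters, focal length, and baseline are illustrative assumptions only; they are not the project's actual calibration or pipeline.

# Minimal sketch: depth from a rectified stereo pair with OpenCV.
# File names, matcher settings, and the Q matrix below are assumed
# placeholders, not values from the Integrated Space system.
import numpy as np
import cv2

# Rectified left/right frames from two ordinary cameras (hypothetical files).
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block-matching stereo correspondence: disparity is inversely
# proportional to depth, so nearby body parts yield large disparities.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point to pixels

# The reprojection matrix Q normally comes from stereo calibration
# (cv2.stereoRectify); this placeholder assumes a 500 px focal length,
# a 6 cm baseline, and a 640x480 image centered at (320, 240).
Q = np.float32([[1, 0, 0, -320],
                [0, 1, 0, -240],
                [0, 0, 0, 500],
                [0, 0, 1.0 / 0.06, 0]])

# Per-pixel 3D coordinates in the camera frame; invalid disparities are masked.
points_3d = cv2.reprojectImageTo3D(disparity, Q)
valid = disparity > 0
print("Recovered", int(valid.sum()), "3D points")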
This work is part of an effort to develop low-cost solutions that give interaction designers the means to prototype ideas easily, in anticipation of similar technology reaching the mainstream of HCI applications.
Integrated Space is a prototype system resulting from investigations into authoring applications in immersive environments. It requires no tracking markers or special control devices, allowing passers-by to begin interacting with applications immediately. Only polarized glasses are required to perceive the 3D output. The system is built from inexpensive, off-the-shelf hardware and software, making it reproducible, reconfigurable, and expandable to accommodate new technological advances as they become readily available.
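The markerless interaction described above implies some way of separating the user from the rest of the scene. One simple possibility, shown below as a hypothetical follow-on to the previous sketch and not necessarily the project's actual method, is to threshold the recovered depth so that only pixels inside an assumed interaction volume in front of the screen count as the user.

# Hypothetical markerless segmentation: keep only pixels whose depth
# falls inside an assumed interaction volume in front of the display.
# Illustrative technique only, not Integrated Space's documented method.
import numpy as np
import cv2

def segment_user(points_3d, disparity, near=0.5, far=2.5):
    """Return a binary mask of pixels whose depth lies between near and far."""
    z = points_3d[:, :, 2]
    mask = (disparity > 0) & (z > near) & (z < far)
    mask = mask.astype(np.uint8) * 255
    # Morphological opening removes small speckles so the silhouette
    # stays stable from frame to frame.
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

The resulting silhouette and its associated 3D points could then be tested against virtual objects on the stereo screen, which is one way a system of this kind might support direct manipulation without handheld devices.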
Despite current limitations in real-time computation, the system shows that computer vision, combined with 3D projection, can significantly enhance interaction, visualization, and the user experience of immersion. Future work will address hand-gesture interpretation as well as multi-user interaction and collaboration.
Completed in July 2010.