Synthesizing reality for realistic physical behavior of virtual objects in augmented reality applications for smart-phones (Conference Paper)

abstract

  • This paper presents a framework for augmented reality applications that runs on smart mobile phones and enables realistic physical behavior of virtual objects in the real world. The mobile phone used is equipped with two cameras and provides stereo images and live video. The two images are used to reconstruct a 3D representation of the real world that is good enough to enable physical interaction between the virtual and the real-world objects, but not fine enough to synthesize the real-world view for back projection. The visual synthesis of the real-world view is instead done through the video stream. The constructed 3D representation is registered in the real-world view and used to place the virtual objects, determine their physical behavior, and detect collisions with objects in the real-world view. The 3D reconstruction is not performed at every frame, but only when necessary based on the positions of the dynamic objects. The pose is estimated from the movement of the mobile phone (gyroscope and accelerometer) and the viewed images. A physics engine, which utilizes the gravity vector obtained from the accelerometer sensor of the mobile device, is integrated into the framework and equips the virtual objects with realistic physical behavior.
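  As an illustration of the last point, the sketch below shows how an accelerometer-derived gravity vector can drive a simple physics step for a point-mass virtual object, with collision against a coarse reconstructed surface simplified here to a flat ground plane. This is not the paper's implementation; the function name, the world-frame convention for the gravity vector, and the restitution model are all assumptions for illustration:

  ```python
  import numpy as np

  def step_physics(pos, vel, gravity, dt, ground_height=0.0, restitution=0.5):
      """Advance a point-mass virtual object by one time step.

      gravity: 3-vector (m/s^2) from the device accelerometer, assumed
      already rotated into the world frame via the estimated pose.
      The plane z = ground_height stands in for the coarse 3D
      reconstruction of the real scene used for collision detection.
      """
      vel = vel + gravity * dt            # integrate acceleration
      pos = pos + vel * dt                # integrate velocity
      if pos[2] < ground_height:          # collision with reconstructed surface
          pos[2] = ground_height
          vel[2] = -restitution * vel[2]  # bounce with energy loss
      return pos, vel

  # Drop a virtual ball from 1 m above a real table surface.
  pos = np.array([0.0, 0.0, 1.0])
  vel = np.zeros(3)
  gravity = np.array([0.0, 0.0, -9.81])   # accelerometer reading, world frame
  for _ in range(200):
      pos, vel = step_physics(pos, vel, gravity, dt=0.01)
  ```

  Because the gravity vector comes from the accelerometer rather than being hard-coded, the virtual object falls "down" relative to the real scene regardless of how the phone is held.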

publication date

  • March 18, 2013