TY - GEN
T1 - Magic Bench - A multi-user & multi-sensory AR/MR platform
AU - McIntosh, Kyna
AU - Mars, John
AU - Krahe, James
AU - McCann, Jim
AU - Rivera, Alexander
AU - Marsico, Jake
AU - Israr, Ali
AU - Lawson, Shawn
AU - Mahler, Moshe
N1 - Publisher Copyright: © 2017 Copyright held by the owner/author(s).
PY - 2017/7/30
Y1 - 2017/7/30
N2 - Mixed Reality (MR) and Augmented Reality (AR) create exciting opportunities to engage users in immersive experiences, resulting in natural human-computer interaction. Many MR interactions are generated around a first-person Point of View (POV). In these cases, the user's view of the environment is digitally displayed either through a head-mounted display or a handheld computing device. One drawback of such conventional AR/MR platforms is that the experience is user-specific. Moreover, these platforms require the user to wear and/or hold an expensive device, which can be cumbersome and alter interaction techniques. We create a solution for multi-user interactions in AR/MR, where a group can share the same augmented environment with any computer-generated (CG) asset and interact in a shared story sequence through a third-person POV. Our approach is to instrument the environment, leaving the user unburdened of any equipment and creating a seamless walk-up-and-play experience. We demonstrate this technology in a series of vignettes featuring humanoid animals. Participants can not only see and hear these characters, they can also feel them on the bench through haptic feedback. Many of the characters also interact with users directly, either through speech or touch. In one vignette, an elephant hands a participant a glowing orb. This demonstrates HCI in its simplest form: a person walks up to a computer, and the computer hands the person an object.
AB - Mixed Reality (MR) and Augmented Reality (AR) create exciting opportunities to engage users in immersive experiences, resulting in natural human-computer interaction. Many MR interactions are generated around a first-person Point of View (POV). In these cases, the user's view of the environment is digitally displayed either through a head-mounted display or a handheld computing device. One drawback of such conventional AR/MR platforms is that the experience is user-specific. Moreover, these platforms require the user to wear and/or hold an expensive device, which can be cumbersome and alter interaction techniques. We create a solution for multi-user interactions in AR/MR, where a group can share the same augmented environment with any computer-generated (CG) asset and interact in a shared story sequence through a third-person POV. Our approach is to instrument the environment, leaving the user unburdened of any equipment and creating a seamless walk-up-and-play experience. We demonstrate this technology in a series of vignettes featuring humanoid animals. Participants can not only see and hear these characters, they can also feel them on the bench through haptic feedback. Many of the characters also interact with users directly, either through speech or touch. In one vignette, an elephant hands a participant a glowing orb. This demonstrates HCI in its simplest form: a person walks up to a computer, and the computer hands the person an object.
KW - Augmented Reality
KW - Haptics
KW - Immersive Experiences
KW - Mixed Reality
KW - Real-Time Compositing
UR - http://www.scopus.com/inward/record.url?scp=85033410051&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85033410051&partnerID=8YFLogxK
U2 - 10.1145/3089269.3089272
DO - 10.1145/3089269.3089272
M3 - Conference contribution
T3 - ACM SIGGRAPH 2017 VR Village, SIGGRAPH 2017
BT - ACM SIGGRAPH 2017 VR Village, SIGGRAPH 2017
PB - Association for Computing Machinery, Inc
T2 - ACM SIGGRAPH 2017 VR Village - International Conference on Computer Graphics and Interactive Techniques, SIGGRAPH 2017
Y2 - 30 July 2017 through 3 August 2017
ER -