Workshop on Augmented and Mixed Reality: Demos

Meta 2’s game-changing breakthroughs in optics include a full 90-degree field of view and 2560 x 1440 high-dpi display. The see-through headset makes everything below your eyebrows completely transparent and unobstructed so you can easily make eye contact with others. You can even wear the headset while wearing glasses. The unique neuroscience-driven interface design principles allow you to access, manipulate and share digital information easily and naturally. We call this The Neural Path of Least Resistance™, a new zero-learning-curve approach to computing.
Nod Labs is a Silicon Valley technology startup with deep domain knowledge in motion tracking and computer vision. They have developed many innovative products, including the Nod Ring, Backspin and, more recently, Project Goa, which enables fully immersive, 6 degrees of freedom (6DoF) tracking for virtual and augmented reality (VR/AR). Besides unsurpassed positional accuracy, Nod's tracking technology is easy to set up, scalable and very cost competitive. On the software side, Nod provides SDKs and plug-ins that enable VR/AR content developers to create compelling experiences for end users.
Project Alloy is an all-in-one virtual reality solution leveraging Intel RealSense technology, and it will be offered as an open platform. The Alloy HMD is an example of how Intel’s suite of sensing and computing technologies, such as Intel RealSense technology, is being made available to developers, makers and inventors to deliver the future of immersive experiences.
Tango is a set of sensors that enables augmented reality (AR) experiences on your smartphone, including navigation, utilities, and gaming. All you have to do is look through your phone, and you’ll see objects and information overlaid on the real world—from directional arrows inside a store or museum, to real-time measurements of physical objects, to hordes of attacking zombies. Through its motion tracking, depth perception and area learning capabilities, Tango enables three main smartphone AR experiences: gaming, retail and spatial utilities.
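As a rough illustration of how a depth-sensing phone can measure a physical object (illustrative math only, not the Tango API): two tapped pixels are back-projected through the camera intrinsics using the depth map, and the Euclidean distance between the resulting 3D points gives the measurement. The intrinsics, pixel coordinates and depth values below are placeholder assumptions.

# Illustrative back-projection measurement from a depth map
# (placeholder intrinsics and taps; not Tango API calls).
import numpy as np

def backproject(u, v, depth_m, fx, fy, cx, cy):
    """Pixel (u, v) with depth in metres -> 3D point in the camera frame."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

# Assumed intrinsics and two user-tapped pixels with depths read from the depth map.
fx, fy, cx, cy = 500.0, 500.0, 320.0, 240.0
p1 = backproject(200, 250, 1.42, fx, fy, cx, cy)
p2 = backproject(430, 260, 1.45, fx, fy, cx, cy)
print(f"measured length: {np.linalg.norm(p2 - p1):.3f} m")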
uSens Inc. creates intuitive 3D HCI solutions for AR/VR. As a domain expert in 3D graphics, computer vision and artificial intelligence, uSens is one of the first companies to provide inside-out 26-DOF hand tracking, 6-DOF head tracking, and mixed reality for both mobile and tethered AR/VR systems, allowing end users to immerse themselves in the most naturally interactive digital experience. www.usens.com
Epson is providing a new way to see the world through the Moverio BT-300 Smartglasses. The groundbreaking Si-OLED display enhances AR experiences by delivering unparalleled levels of transparency. The binocular display allows you to view side-by-side 3D content or comfortably view 2D content in the center of your field of view. Powered by an Intel Atom chipset and Android 5.1, the BT-300 is a powerful platform for your AR applications.
Human performance enhancement and training/education are widely considered ideal use cases for augmented reality. Learning to play the piano involves not only coordinated actions and an awareness of spatial orientation, but also repeated drilling at increasing difficulty levels to acquire skill over time. This application represents the new frontier for mixed reality: content creation. This demo will show off the browser-based RealityFlow editor using the smartphone-based Seebright Ripple 2 HMD.
TruLife Optics offers a variety of full-colour holographic optical elements for mixed-reality near-to-eye optics and eye tracking. Our holograms are clear and thin, and so can be laminated onto spectacle lenses to offer AR alongside eyesight correction. We will be demonstrating two new setups at Stanford: first, direct retinal projection in full colour using a free-space hologram and a Microvision projector; second, a holographic version of a diffractive-optics AR setup with a holographic 45° mirror and a holographic spherical mirror. We will soon have a mass-production capability as well as design, modelling and mastering of HOEs.
All commercial headsets available today share one fundamental flaw: the vergence-accommodation conflict, which may cause eye strain, headaches, and visual discomfort. At Lemnis Technologies, we restore the natural accommodative response in the user's eyes by using a computational display with adjustable focus. We build platform-independent software and hardware modules and demonstrate the technology integrated in a prototype headset.
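As background on why an adjustable-focus display addresses this conflict (a hedged sketch, not Lemnis's published method): the accommodation demand of a point at distance d metres is roughly 1/d diopters, and a varifocal display drives its optics so the virtual image sits at the same dioptric distance as the current vergence point, instead of at one fixed image plane. The sketch below illustrates that arithmetic; the 2 m fixed plane, the gaze-distance input and the 0.25 D step size are assumptions for illustration.

# Minimal sketch of varifocal focus selection (illustrative assumptions,
# not Lemnis Technologies' implementation).

def accommodation_demand_diopters(distance_m: float) -> float:
    """Dioptric demand of a point at `distance_m` metres (1 D = 1/m)."""
    return 1.0 / distance_m

def varifocal_setting(gaze_distance_m: float,
                      fixed_plane_m: float = 2.0,
                      step_d: float = 0.25):
    """Return (mismatch against a fixed image plane, focus setting quantised to step_d)."""
    demand = accommodation_demand_diopters(gaze_distance_m)
    fixed = accommodation_demand_diopters(fixed_plane_m)
    conflict = abs(demand - fixed)             # residual mismatch on a conventional HMD
    setting = round(demand / step_d) * step_d  # target for an adjustable-focus display
    return conflict, setting

# Example: the user converges on a virtual object 40 cm away.
conflict, setting = varifocal_setting(0.4)
print(f"fixed-focus mismatch: {conflict:.2f} D, varifocal target: {setting:.2f} D")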
Each year, over 170,000 women undergo lumpectomy, but in 25% of cases the surgery fails and must be repeated. We have developed a mixed-reality system that reveals the location and extent of breast cancers inside the actual patient. Our goal is for the surgeon to use this system to plan the excision and ultimately make lumpectomy a far more definitive surgery. Our demonstration will feature actual patient data and the software that is currently undergoing initial validation studies by surgeons in the operating room, in patients with palpable breast cancers.
Create|AR will be demonstrating two experiences, both of which leverage their patent-pending intuitive user-interface systems. The first demonstration is a contextual learning application in which you learn and interact in AR. The second is an example of the world's first global point-of-sale augmented reality experience. With high-fidelity visuals and a strong attention to detail, the experiences are sure to leave a lasting impression and inspire thought around AR applications.
Occipital will be demonstrating Bridge, which uses a Structure Sensor and a mobile device to generate positionally tracked mixed-reality experiences from a dense 3D reconstruction of the user’s environment.
Augmented Pixels, Inc., based in Palo Alto, California, provides simultaneous localization and mapping (SLAM) software for augmented reality (AR/VR glasses) and autonomous navigation (robots/drones).
http://augmentedpixels.com
Optotune designs and manufactures variable focus liquid lenses and compact 2D tiltable mirrors. Our tunable lenses benefit AR/VR applications by elegantly solving the accommodation/vergence conflict and also provide a compact solution for spherical refraction correction (for users wearing prescription glasses). Our 2D mirrors are compact, fast, accurate and yet low cost. Their benefit for AR/VR is the possibility to expand the perceived FOV by projecting light towards the eye’s pupil (in combination with eye tracking). Optotune will demonstrate a selection of lenses and mirrors along with optical design concepts for their integration.
We demonstrate a real-time panoramic streaming system. Using optical-flow-based stitching, the system produces 4k×3k panoramic 3D video on a single Nvidia card with very low latency, while combining a dynamic scene with a static background, such as a 3D model or map, to create an immersive experience.
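To make the optical-flow-based stitching idea concrete (a minimal CPU sketch under assumed inputs, not the presenters' real-time GPU pipeline): dense flow computed between the overlapping regions of two pre-warped camera tiles can warp one tile toward the other before blending, which hides parallax seams. OpenCV's Farneback flow and remap are used here for brevity; the tile file names are hypothetical.

# Illustrative optical-flow-assisted seam blending for two overlapping,
# pre-warped, same-sized panorama tiles (assumed inputs; not the demo system).
import cv2
import numpy as np

def flow_blend(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Warp `right` toward `left` using dense optical flow, then average the overlap."""
    gray_l = cv2.cvtColor(left, cv2.COLOR_BGR2GRAY)
    gray_r = cv2.cvtColor(right, cv2.COLOR_BGR2GRAY)
    # Dense Farneback flow from the right tile toward the left tile.
    flow = cv2.calcOpticalFlowFarneback(gray_r, gray_l, None,
                                        0.5, 3, 21, 3, 5, 1.2, 0)
    h, w = gray_r.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + flow[..., 0]).astype(np.float32)
    map_y = (grid_y + flow[..., 1]).astype(np.float32)
    warped = cv2.remap(right, map_x, map_y, cv2.INTER_LINEAR)
    return cv2.addWeighted(left, 0.5, warped, 0.5, 0)

# Usage (hypothetical file names): two tiles already projected into the panorama frame.
# pano_strip = flow_blend(cv2.imread("tile_left.png"), cv2.imread("tile_right.png"))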
The PhaseSpace Smoke VR/AR viewer was developed for the US Navy in 2010, demonstrated at SIGGRAPH and IEEE VR 2012, and shown on international TV by UNC Professor Henry Fuchs, generating excitement for its simple design and excellent performance. Low-cost (under $100) viewers will be available this summer for researchers and augmented reality software developers.