First and Final Round of Judges Announced For Vision VR/AR Awards 2016
Finalists for the Vision VR/AR Awards 2016 were announced on January 12, and this week the Vision Summit team hand-picked a group of judges. These judges come from every corner of the virtual and augmented reality industry, from venture capitalists to developers to hardware manufacturers. They are in the VR/AR trenches every day, so they are uniquely qualified to identify the best content that the development community has to offer.
First Round Judges: This panel of judges was responsible for narrowing the nearly 250 submissions we received down to the 21 announced finalists. Their expertise was invaluable in determining which submissions would go on to the final round.
Final Judges: The second panel of judges was charged with the formidable task of selecting the winners from the 21 impressive finalists. Fortunately, these industry veterans were up to the task – tune in to the Vision VR/AR Awards Ceremony on the evening of Wednesday, February 10th to learn which projects they chose as winners.
Vision Summit 2016 Advisory Board
The Vision Summit 2016 Advisory Board is a distinguished group of VR/AR industry leaders whose expertise and guidance steer the conference program. The Advisory Board is responsible for ensuring that the content being presented at Vision Summit is relevant and engaging to the conference attendees and worldwide audience, and that it represents the latest in industry news and technology. We appreciate their dedication and commitment to guaranteeing the quality of content at Vision Summit 2016.
Additional Agenda @ Vision Summit
Augmenting Space Exploration with VR/AR by Jeff Norris and Victor Luo (NASA JPL)
Virtual and augmented reality promise to transport us to places that we can only imagine. When joined with spacecraft and robots, these technologies will extend humanity’s presence to real destinations that are equally fantastic.
NASA’s Operations Laboratory at JPL is spearheading several ambitious projects applying virtual and augmented reality to the challenges of space exploration. Through partnerships with multiple VR and AR companies, scientists on the Curiosity Mars Rover mission are exploring the martian terrain, engineers are experimenting with new ways to control the Robonaut humanoid, and astronauts on the International Space Station are preparing to perform their work more efficiently than ever before.
The leads of these projects at NASA will share their progress so far, the challenges that lie ahead, and their vision for the future of VR and AR in space exploration.
Build Architectural and Gaming Environments That Create Presence in VR by Carl Callewaert (Unity Technologies)
This presentation begins by briefly explaining how the human brain perceives 3D and virtual reality. With this knowledge as a foundation, we will then apply it to building architectural and gaming environments that create a sense of presence and immersion. Each topic will focus on examples based on real and virtual worlds.
After the presentation you will be able to use this knowledge to create a strong sense of presence within room-scale VR environments, enabling you to boost the “Genius Loci” of your virtual worlds.
Building Better Worlds: Leap Motion Co-founder/CTO on the Arrival of VR by David Holz (Leap Motion)
The human hand is the original user interface for the physical universe, and soon it will be the fundamental interface for countless virtual worlds. Leap Motion’s hand tracking technology has made some massive advances with the dawn of VR. CTO and Co-Founder David Holz will speak on the cutting edge of VR input, including some next-gen software developments.
The challenge, as we’ve learned since launching Cardboard in 2014, is that “everyone” includes an awful lot of people, and not all of them own the latest and greatest phone. This talk will discuss the techniques we’ve developed to deliver compelling VR experiences across a wide range of mobile hardware, including single-pass distortion correction, real-time 360 video compositing, drift correction, and believable binaural audio, while giving attendees specific instructions on how to apply them to their own Unity VR projects.
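Of the techniques listed above, drift correction is the easiest to illustrate in isolation. The sketch below is a hypothetical complementary filter that pulls an integrated gyroscope heading back toward an absolute reference (such as a magnetometer reading); the function name, blend factor, and drift figures are invented for illustration and are not the talk's actual implementation.

```python
# Hypothetical complementary-filter drift correction sketch.
# Integrating gyroscope readings drifts over time; blending in a small
# fraction of an absolute heading reference slowly cancels that drift.

def correct_drift(gyro_heading, reference_heading, alpha=0.98):
    """Blend an integrated gyro heading with an absolute reference.

    alpha close to 1.0 trusts the smooth-but-drifting gyro; the
    remaining weight gently corrects toward the reference heading.
    """
    # Wrap the error into [-180, 180) so correction takes the short way round.
    error = (reference_heading - gyro_heading + 180.0) % 360.0 - 180.0
    return gyro_heading + (1.0 - alpha) * error

# Simulate a gyro that drifts +0.1 degrees per frame while the
# true heading stays fixed at 0 degrees.
heading = 0.0
for _ in range(1000):
    heading += 0.1                       # accumulated gyro drift this frame
    heading = correct_drift(heading, 0.0)

print(round(heading, 2))  # drift settles near a small steady-state offset
```

With these numbers the heading converges to a small steady-state offset (about 4.9 degrees) instead of drifting without bound; a tighter blend factor trades smoothness for a smaller offset.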
The main focus is on Bravemind, a clinical, interactive, virtual reality (VR)-based exposure therapy tool used to assess and treat post-traumatic stress disorder (PTSD). The assessment and treatment of PTSD is a major concern to the military because stressful experiences in today’s war-fighting environments have resulted in a significant number of soldiers returning from combat deployments at risk of developing PTSD.
Bravemind was developed to address this challenge by offering a means to overcome the natural avoidance tendency of trauma sufferers. The potential of using VR for the treatment of PTSD is supported by previous reports in which patients with PTSD who were unresponsive to imaginal Prolonged Exposure (PE) therapy went on to respond successfully to Virtual Reality Exposure Therapy (VRET).
Bravemind allows clinicians to gradually immerse patients into virtual environments representative of their traumatic experiences in a controlled, stepwise fashion by providing the capability to control multi-sensory emotional stimuli and monitor the intensity of the patients’ stress responses via advanced psychophysiological assessment techniques.
The use of VR technology offers unique capabilities for the treatment of PTSD not only because it allows interactive, multisensory, immersive environments to be readily created that can be tailored to a patient’s needs, but also because it provides the ability for clinicians to control, document, and measure stimuli and patient responses, offering clinical assessment, treatment and research options that are not available via traditional methods.
As such, Bravemind not only provides a tool for clinicians in the treatment of PTSD patients but also allows them to measure, document, and learn from the results in order to better understand the brain and biological factors that serve to inform the prevention, assessment and treatment of PTSD.
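The controlled, stepwise immersion described above can be caricatured as a simple feedback loop: stimulus intensity advances one step at a time, but only while monitored arousal stays below a clinician-set ceiling. The sketch below is purely illustrative; every name, threshold, and reading is invented, and it stands in for the general control idea rather than anything in Bravemind itself.

```python
# Hypothetical sketch of stepwise exposure control: intensity increases
# gradually, but backs off whenever the measured stress response crosses
# a clinician-set ceiling. All names and thresholds are invented.

def next_intensity(current, stress, ceiling=0.7, step=0.1):
    """Return the next stimulus intensity level, clamped to [0.0, 1.0]."""
    if stress >= ceiling:
        # Back off when monitored arousal crosses the ceiling.
        return max(0.0, current - step)
    # Otherwise advance one controlled step.
    return min(1.0, current + step)

# Example session: a sequence of normalized stress readings, as might
# come from psychophysiological sensors.
intensity = 0.0
for stress in [0.2, 0.3, 0.5, 0.8, 0.6, 0.4]:
    intensity = next_intensity(intensity, stress)

print(round(intensity, 1))  # prints 0.4
```

The point of the sketch is the shape of the loop, not the numbers: intensity ratchets up while the patient tolerates it and retreats the moment the monitored response says otherwise, keeping the clinician's ceiling in charge throughout.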
Pillars of Presence: Amplifying VR Immersion by Vincent Hamm and Ben Padget (Oculus)
The mission of Oculus is to enable people to experience anything, anywhere. While building Rift, the team discovered several key factors that amplify the phenomenon of “presence” – that feeling as if you’re truly there inside a virtual world.
In this developer talk, attendees will hear about the important issues in graphics, audio, input, video, and collaboration that will maximize presence and engagement in VR using Unity. The presenters will cover each topic at a high level and introduce tools within Unity and the Oculus platform with concrete examples to help developers get started.
In this talk, we’ll explain the features of the NVIDIA GameWorks VR SDK, including VR SLI, multi-resolution shading, context priorities, and direct mode. We’ll discuss the motivation for these features, how they work, and how developers can use GameWorks VR in their engines to improve the VR experience on Oculus Rift, HTC Vive, and other VR headsets.
Using AR Effects to get creative with reality by Robert McCain (Sony Mobile)
In this session, Robert McCain (Senior Developer Program Engineer from Sony Mobile) discusses the opportunity that AR presents every time you launch your camera. Find out how you can help users create a new reality using their camera and AR Effects. You’ll get an introduction to AR Effects, an overview of the available APIs, and a look at potential use cases.