Goldsmiths XR Masters
😍17th May 2021 - back on Campus! 😍
23rd July XR Game Night - AR Competition
Award Jury Panel
Dream Reality Interactive
Director, UK & Ireland
Sachet of Weather
John - Your Smoking Buddy
Head of Immersive Technology
Dream Reality Interactive
Studio Creative Director
Group Work Entries
“Enchanting Fables” is a narrative, gamified first-person immersive VR experience for children. It brings the tradition and culture of sharing folktales into the digital age through a virtual experience. Although designed for children, “Enchanting Fables” is also an embodiment experience for adults who may have heard these stories, or versions of them, in their own cultures and can now become protagonists in the tales, reliving the valuable moral lessons and legends they heard in childhood.
The prototype has been designed around a Chinese folktale of a child who is given a magic paintbrush in return for an act of kindness. The entire experience is designed to keep the user engaged in the story through a crisp User Experience flow that divides the narrative into distinct scenes, immersing the child in the world. The controls are kept simple and easy to use with an interactive hand menu.
Our biggest challenge was reducing the impact of simulator sickness without the user having to take off the headset and without breaking the narrative flow. A resting tree has been woven into the narrative, allowing the user to ‘rest’ or ‘pause’ without exiting the experience.
“Enchanting Fables” has great potential to be scaled up to include further folktales with moral or educational content and to deepen the absorption of their messages. The stories can be customised for language, regional and cultural nuances, carrying forward the universal nature of the lessons embedded in these tales.
Markus Sauerbeck, Priyadharshni Krishnan, Miia Remahl, Majid Alturki
CPRVR is a health training virtual reality app that aims to teach the basics of performing CPR (Cardiopulmonary Resuscitation) in a playful and accessible environment.
Markus Sauerbeck & Miia Remahl
The past year was challenging for many people in different ways. Not only were we separated from loved ones by various restrictions, but there were also limited opportunities to free the mind after a stressful day and to stimulate yourself with a new experience. The Virtual Reality experience “Acid mind” brings new visual and audio stimuli into your everyday life. Explore a world of colours in an interactive and relaxing way – safe, legal and without side effects – immersed in the virtual world.
The application uses the latest hand-tracking technology and combines different techniques in shaders, VFX and sound into a playful experience. It can be seen as a virtual exhibition of interactive art pieces, manipulated by the user and generated with the Unity engine.
We learned to combine Virtual Reality techniques: hand tracking, shaders and VFX, as well as sound-reactive shaders and VFX. For us it was a huge success, which we measure by our ability to collaborate on the project via Teams and in person. We followed an MVP (minimum viable product) approach, combining individually developed pieces into one scene. As soon as one teammate discovered an exciting approach, such as a sound-reactive shader, we combined it with techniques from the other teammate.
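The sound-reactive part can be sketched independently of the engine: take the loudness of the current audio buffer and map it, with smoothing, onto a shader parameter. A minimal Python sketch of that mapping (function names and constants are my own illustrations, not from the project; in Unity this logic would live in C#, feeding a material property):

```python
import math

def rms_amplitude(samples):
    """Root-mean-square loudness of one audio buffer (values in [-1, 1])."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def audio_to_shader_param(samples, smoothing=0.8, previous=0.0):
    """Map loudness to a 0..1 shader parameter, smoothed to avoid flicker."""
    level = min(1.0, rms_amplitude(samples) * 4.0)  # boost quiet signals
    # exponential smoothing so the visual reacts without jittering
    return smoothing * previous + (1.0 - smoothing) * level
```

Called once per frame with the latest audio buffer, the returned value can drive any shader property such as emission strength or displacement amplitude.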
Kevin Kuhn Agnes, Sasha Jiang, Yiru Yu, Ye Fu, Sachin Kodati
Ascend VR is a meditation application designed for a single HMD user that runs for approximately 10 minutes. It takes the user on a guided meditation, immersing the user in responsive virtual environments with 360 graphics, spatial sounds and haptic feedback.
Throughout the experience, the user is guided by a voice narration that delivers breathing and visualisation exercises. The user is guided through 4 scenes: Void, Ocean, Sky and Space. Beginning as a fish swimming to the surface of the ocean, then a seabird flying towards the horizon and finally a celestial entity in space, the journey of ascension is both physical and spiritual.
Targeting an audience who experience difficulties with traditional meditation methods, Ascend VR helps the user quickly settle into a meditative state through its seamless combination of immersive qualities and meditation techniques. Through the narrative, the user is meaningfully connected to the virtual environment and more likely to achieve mindfulness.
I want to share my childhood nightmare in this project. I watched the movie "Deep Rising" by accident when I was a little girl, and it left me with an indelible, horrifying memory. In this project I used Shader Graph, a flocking system, post-processing and swimming mechanics. Together these techniques built an underwater world, and I used audio to enhance the atmosphere and tension.
Two of the most interesting techniques I learned this time were Shader Graph and post-processing. I applied the shader to the fish: all textures and animations are created inside Unity, and their swimming speed, texture colours and flocking behaviour can all be customised. Post-processing made the game world smoother and more beautiful, which helps players feel more immersed.
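The flocking system mentioned above is typically a boids-style update combining cohesion, alignment and separation forces. A minimal Python sketch of one such update step, assuming plain 3D lists for positions and velocities (the weights and names are illustrative, not the project's actual values):

```python
def flock_step(positions, velocities, dt=0.02,
               cohesion=0.5, separation=1.5, alignment=0.3, min_dist=1.0):
    """One boids update: every fish steers toward the flock centre,
    matches the average heading, and avoids very close neighbours."""
    n = len(positions)
    center = [sum(p[i] for p in positions) / n for i in range(3)]
    avg_vel = [sum(v[i] for v in velocities) / n for i in range(3)]
    new_pos, new_vel = [], []
    for p, v in zip(positions, velocities):
        acc = [0.0, 0.0, 0.0]
        for i in range(3):
            acc[i] += cohesion * (center[i] - p[i])    # steer toward flock centre
            acc[i] += alignment * (avg_vel[i] - v[i])  # match average heading
        for q in positions:                            # push away from close neighbours
            d2 = sum((p[i] - q[i]) ** 2 for i in range(3))
            if 0 < d2 < min_dist ** 2:
                for i in range(3):
                    acc[i] += separation * (p[i] - q[i]) / d2
        nv = [v[i] + acc[i] * dt for i in range(3)]
        new_vel.append(nv)
        new_pos.append([p[i] + nv[i] * dt for i in range(3)])
    return new_pos, new_vel
```

Run once per frame, two distant fish with zero velocity drift toward each other under the cohesion term.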
‘Non-Euclidean Room Tour’ is a physically walked, controller-free VR experience prototype that performs a time shift and a space transition simultaneously. Users can find any living-room-sized space and interact with their bodies, playing instruments with their own hands and walking on their own feet.
Hands are the principal means of interaction, and also the most difficult part to work out. Hand tracking and gesture detection were implemented on top of Oculus Quest’s hand-tracking package. Even though there is no physical haptic feedback, prompt visual feedback gives different testers varying degrees of tactile illusion. By introducing the concept of ‘non-Euclidean’ space, three virtual rooms were compressed to fit into one room-scale play area. The implementation is technically based on the stencil buffer. It creates the impossible-geometry illusion of the maze and also solves the space problem posed by physical walking. Functions from my previous project were redesigned: the hand menu, grabbing, drawing and portals were upgraded into more refined interactions.
The goal of this prototype is to give users a more intuitive feeling. During the process, two questions arose: Is walking a locomotion method that does not break the illusion? To what extent can walking, rather than controller movement, reduce symptoms of motion sickness? This project gives a very positive answer to both, and puts into practice the concept of a potentially infinite world illusion. In terms of haptics, a Wireality-style worn multi-string hand haptic device is potentially a good solution for the project going forward.
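Gesture detection on top of hand tracking often reduces to thresholding fingertip distances, with a hysteresis band so the gesture does not flicker when the distance hovers near the threshold. A small Python sketch of pinch detection under that assumption (the class name and thresholds are hypothetical, not taken from the Oculus Quest package):

```python
class PinchDetector:
    """Pinch detection with hysteresis: the pinch engages below
    `press_dist` and only releases above the larger `release_dist`."""

    def __init__(self, press_dist=0.02, release_dist=0.04):
        self.press_dist = press_dist      # metres, engage threshold
        self.release_dist = release_dist  # metres, release threshold
        self.pinching = False

    def update(self, thumb_tip, index_tip):
        """Feed per-frame fingertip positions; returns current pinch state."""
        d = sum((a - b) ** 2 for a, b in zip(thumb_tip, index_tip)) ** 0.5
        if self.pinching:
            self.pinching = d < self.release_dist
        else:
            self.pinching = d < self.press_dist
        return self.pinching
```

The gap between the two thresholds is what prevents a half-open pinch from rapidly toggling grab/release on a held object.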
The aim was to create an interactive experience that lets people connect with classical art pieces, such as Vincent van Gogh’s “Bedroom in Arles”. With all the opportunities the technology offers on both the creative and technical sides, I delivered a piece of far-reaching, innovative storytelling through the immersive VR medium.
Experimenting with gaze interaction proved positive during development; it was used for a “magical” scene transition between the gallery space and the visited painting. The sound and environment reacting to the player’s input create the sense of releasing time that had been frozen in the original 2D painting.
Through small-scale user testing with my housemates, I collected valuable feedback and measured the impact of the experience on first-time VR users, which showed promising results for further development of deep place and plausibility illusion.
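Gaze-triggered scene transitions like the one described are commonly driven by a dwell timer: the transition fires once after the gaze has rested on a target for a fixed time, and resets when the gaze leaves. A minimal Python sketch of such a trigger (the class name and dwell time are my own assumptions, not the project's code):

```python
class GazeDwellTrigger:
    """Fires exactly once when the user has gazed at a target
    continuously for `dwell_time` seconds."""

    def __init__(self, dwell_time=2.0):
        self.dwell_time = dwell_time
        self.elapsed = 0.0
        self.fired = False

    def update(self, gazing_at_target, dt):
        """Call every frame with the gaze-hit test result and frame time."""
        if gazing_at_target:
            self.elapsed += dt
        else:                     # looking away resets the timer and re-arms
            self.elapsed = 0.0
            self.fired = False
        if not self.fired and self.elapsed >= self.dwell_time:
            self.fired = True
            return True           # start the scene transition exactly once
        return False
```

The one-shot `fired` flag keeps a long stare from re-triggering the transition every frame.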
The Temple of Illusions
The Temple of Three Illusions is an exploration of optical illusions within virtual reality. Set in a Greek temple enclosed within a mystical forest, the audience is invited to observe and experience three optical illusions. The idea stems from the notion that virtual reality is itself an illusion (of place, plausibility and presence) made possible by our sensory organs and computer graphics. I’m interested in seeing how optical illusions from reality transfer to virtual reality, and especially in the moments when certain physiological illusions experienced there cannot be digitally documented. There is a philosophical pondering of the duality of realness: what you have experienced is undoubtedly real, but has no place in reality – even simulated ones. As we strive for increasingly realistic simulations, it is worth taking the occasional step back to consider the perceptual nature of our experiences in virtual reality.
Save the Dogs
House Party VR
Meet at your home – worldwide.
Covid kept us from spending time with our loved ones. With House Party VR you can invite your friends to your digital – yet real – home and bring back a feeling you may have missed for a long time. This application is a social-VR multiplayer concept. Users scan in their real environment via photogrammetry, and within seconds they can invite friends from all over the world into it. By placing games and interactive activities, users can benefit from all the affordances of Virtual Reality. So: let’s start the party and connect!
For this application I used a game engine that was entirely new to me (Unreal Engine), which was a huge benefit in terms of graphic quality and development speed. For coding I learned node-based Blueprint programming, which felt much faster compared to Unity development. I tried out different available photogrammetry apps that work with the LiDAR scanner and obtained low-poly models as well as highly detailed output (e.g. with Reality Capture). I assume that this proposed workflow of bringing a real environment into your own virtual space will soon become a standard for social AR/VR (especially with further developments in LiDAR scanner software).
4Data was initially a collection of demo scenes exploring various data visualisations and narratives in VR. After further development, the project ended up focusing on one specific visualisation technique: scatterplots. The experience in 4Data allows the user to explore a dataset studying the effect of temperatures and wind speeds on ozone levels.
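Placing a scatterplot in VR typically means normalising each data column into a fixed spatial range around the user so the point cloud fits the room. A small Python sketch of that mapping, assuming rows of (temperature, wind speed, ozone) values (the function and bounds are illustrative, not 4Data's actual code):

```python
def to_scatter_points(rows, bounds=2.0):
    """Map numeric rows into a cube of side 2*bounds centred on the origin,
    normalising each column independently to [-bounds, +bounds]."""
    cols = list(zip(*rows))                       # column-wise view of the data
    spans = [(min(c), max(c)) for c in cols]      # per-axis min/max
    points = []
    for row in rows:
        points.append(tuple(
            -bounds + 2 * bounds * (v - lo) / (hi - lo) if hi > lo else 0.0
            for v, (lo, hi) in zip(row, spans)))
    return points
```

Each returned tuple can then be used directly as a 3D position for one data point, with temperature, wind and ozone on the x, y and z axes.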
This project aims to help people release their real-life pressures. The basic idea behind building an immersive, dream-like scene is that people can feel fully relaxed in a dream, and so can release pressure without any concern. In the scene, the player can smash the texts and other objects using the trigger button on the front of the controller (to shoot) and the grip button on its side (to throw). Also, to make sure the controls themselves add no unexpected pressure, the speed and angle of the bullet are calculated automatically, so the player does not need to aim at objects with great accuracy.
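Pre-calculating the bullet's speed and angle so the player need not aim precisely amounts to solving simple ballistics: pick a flight time, then derive the initial velocity that lands the projectile on the target under gravity. A Python sketch of that calculation (names and default values are my own, not the project's):

```python
def launch_velocity(start, target, flight_time=1.0, gravity=-9.81):
    """Initial velocity (vx, vy, vz) so a projectile launched from `start`
    reaches `target` after `flight_time` seconds under constant gravity."""
    vx = (target[0] - start[0]) / flight_time
    vz = (target[2] - start[2]) / flight_time
    # y(t) = y0 + vy*t + 0.5*g*t^2  ->  solve for vy at t = flight_time
    vy = (target[1] - start[1] - 0.5 * gravity * flight_time ** 2) / flight_time
    return (vx, vy, vz)
```

Setting the projectile's velocity from this function guarantees a hit regardless of how roughly the player points, which matches the project's goal of removing aiming pressure.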
Every object in the scene has its own meaning: the 3D texts represent the individual pressures of real life; the cat represents yourself in the past or future; the heart, shown at the end, represents your determination; and the broken screen indicates the end of the dream, when you should wake up and face real life.
The main technology in this project is custom-made shaders: the ripple effect on the wall and ground when you touch them, the reflection effect, the outer glow effect, the smashing effect and the broken screen (post-processing after camera rendering). The ripple and smashing effects also involve interaction with the player.
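A touch-driven ripple shader typically evaluates, per vertex, an expanding ring wave attenuated by distance from the touch point and by elapsed time. A Python sketch of the kind of height function such a shader computes (all parameter values are illustrative; the project's actual effect is a Unity shader):

```python
import math

def ripple_height(pos, touch, time, speed=2.0, wavelength=0.5,
                  amplitude=0.05, decay=1.5):
    """Vertex height offset for a ring ripple centred on `touch`,
    expanding at `speed` and fading with distance and time."""
    dist = math.dist(pos, touch)
    # phase of the travelling wave at this vertex
    phase = (dist - speed * time) * (2 * math.pi / wavelength)
    # fade out with distance from the touch and with elapsed time
    attenuation = math.exp(-decay * dist) * math.exp(-0.5 * time)
    return amplitude * math.cos(phase) * attenuation
```

In the real shader the same expression runs on the GPU with the touch position and start time passed in as material properties.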
Plastizoic VR Experience
This experience aims to raise awareness among the audience. Plastic pollution is one of society's biggest problems. I believe that by using the VR medium and unconventional storytelling, the issue can become more approachable and understandable to the audience. Data visualisation in dimensional space is a powerful way to show abstract ideas and create awareness of what stands behind the numbers. With non-linear storytelling, the user is free to explore the space, interact with the objects and live within the narration, rather than being a passive recipient. From the technical point of view, the precision of on-trigger events was very important, so that the user can understand the message clearly without disruption. I experimented with and implemented Nvidia FleX physics (beta) for real-time interaction with the particle-based simulation of the "plastic ocean" bubbles, which intensifies the subliminal message.