Goldsmiths XR Masters
😍17th May 2021 - back on Campus! 😍
23rd July XR Game Night - AR Competition
Award Jury Panel
Dream Reality Interactive
Director, UK & Ireland
Yiru Yu, Zihan Chen, Pablo de Miguel, Eden Chahal, Tian shi Xie
Our project, ‘Pocket Theatre’, is based on the ‘Augmented Book’ theme. We chose ‘Hamlet’ as the background story and created three versions of it – ‘Old’, ‘Contemporary’ and ‘Future’. All versions include two main character models – Hamlet and the Ghost – a random weather system and a menu of draggable objects such as trees, eagles and rocks.
The ‘Old’ one is the traditional version of ‘Hamlet’, with traditional dress and dialogue. The physical scene is built like a traditional theatre stage using pop-up paper craft.
The ‘Contemporary’ one is a modern take on ‘Hamlet’. Unlike the traditional version, its characters dress and speak like modern people. The physical scene is again made with pop-up paper craft but in a modern style – a tall building, with the characters shown on its roof.
The ‘Future’ version, as its name suggests, aims to show a futuristic scene, so the trackable image is a digital image with ultra-modern graphics. The character models match: Hamlet is dressed as an astronaut and the Ghost is rendered as a transparent shape.
Sachet of Weather
Sachet of Weather is an AR mobile app. It aims to help people find the different poetic meanings of different weathers in real life through AR.
First, the user records the current weather and their mood with a simple doodle on an emotion card. Then, when the player scans the card with their phone, they can interact with a virtual character called May, who appears on the card plane.
Each time the player scribbles down a different combination of weather and mood, they meet a different-looking May and receive a classic ancient Chinese poem from her.
John - Your Smoking Buddy
Chun-Jou Yu, Jagoda Wrobel, Markus Sauerbeck
“John“ is a subconscious anti-smoking campaign in Augmented Reality. Your smoking buddy John appears on a cigarette box as a virtual character when you hover over the package with an AR-capable device. He seems gruff and curt, but he is actually a very kind person – one with a heavy smoking problem.
After telling you a bad joke or a story about his smoking, John asks you for another cigarette. The user can press a button to comply. Each cigarette instantly worsens John’s health, which is reflected in a health bar at the top and in his increasingly heavy cough. After several bad jokes and several cigarettes, giving John ”just one other cigarette“ finally kills him.
Catarina Rodrigues, Ethan Rutherford, Fang Ma, Yu Liao, Yishuai Zhang
‘Sentient Aura’ is an interactive AR facial-masking project that experiments with the relationship between human emotions, conveyed by facial expressions, and computer-generated 3D sculptures worn as facial masks. In the modern age, the relentless acceleration and expansion of our social sphere has gradually deprived us of the ability to concentrate on, care for, or even notice other people’s emotions. This project creates a series of filters that amplify our emotions and so arouse empathy during face-to-face communication – as if, in a post-human world, empathy had become the most important element of any communication. ‘Sentient Aura’ is therefore a tentative, speculative approach to a futuristic form of face-to-face communication.
We encoded several overarching emotions – happiness, anger, surprise, sadness and indifference – into the animated facial masks. The dynamic flow of emotional energy is reflected by three specially sculpted post-humanist masks, animated simultaneously with different facial expressions, partially veiling the real face while exposing our emotions in externalised forms. The 3D masks are inspired by oriental warriors’ masks and by animals such as lizards, octopuses and alien-like critters. In the post-humanist era, cybernetic symbiosis with plants and other animals has become mainstream.
AR enables a real-time masking process that vividly overlays a 3D sculpted mask on a real human face, using the technology as an intermediary to bridge an imaginary post-human world and current reality.
Majid Alturki, Kevin Kuhn Agnes, Miia Remahl, Noa Geras
The whole history of humanity is an enormous accumulation of data; information has been stored for thousands of years. Data has become an integral part of history, and today it is also an integral part of our social lives. Nowadays, databases are readily available to the public – and to the highest bidder.
Data processing is becoming more widely used, with more institutions claiming to be "data-driven" by the day. According to the International Data Corporation, 70% of organizations already purchase external data, a figure that was expected to reach 100% by the beginning of 2019. Data volumes have grown exponentially: between 2013 and 2020, worldwide data increased from 4.4 to 44 zettabytes. Such rapid expansion raises numerous ethical concerns and technical challenges. There aren't sufficient protections to ensure that this data won't be misused, and there is enormous potential to misuse individuals' data and cause real harm. People are more easily manipulated than we like to admit, and the unprecedented volume of personal data being collected makes it possible for governments to manipulate us in ways that are difficult to predict or even detect.
New technologies enter our lives very easily, and Augmented Reality is no exception. Given how data is currently treated, who knows what a data-embedded AR future holds. That is why we created Datadiver: a prototype that aims to raise awareness of how invasive personal data collection and use could become in an augmented future.
Head of Immersive Technology
Dream Reality Interactive
Studio Creative Director
Group Work Entries
Himani Kumar, Sijia Zhai, Fang Ma, Kexin Xie
“Enchanting Fables” is a narrative, gamified first-person immersive VR experience for children. It brings the tradition and cultures of sharing folktales into the digital age through a virtual experience. Although designed for children, “Enchanting Fables” also offers adults an embodied experience: those who heard these stories, or versions of them, in their own cultures can now become protagonists in the tales and revisit the moral lessons and legends of their childhood.
The prototype is built around a Chinese folktale of a child given a magic paintbrush in return for an act of kindness. The experience keeps the user engaged in the story through a crisp user-experience flow that divides the narrative into distinct scenes, immersing the child in the world. The controls are simple and easy to use, with an interactive hand menu.
Our biggest challenge was reducing simulator sickness without the user having to take off the headset and without breaking the narrative flow. An innovative ‘resting tree’ was designed into the narrative, letting the user rest or pause without exiting the experience.
“Enchanting Fables” has great potential to be scaled up to include folktales with moral or educational contexts, deepening how their messages are absorbed. The stories can be customised for language, regional and cultural nuances, preserving the universal nature of the lessons embedded in these tales.
Markus Sauerbeck, Priyadharshni Krishnan, Miia Remahl, Majid Alturki
CPRVR is a health training virtual reality app that aims to teach the basics of performing CPR (Cardiopulmonary Resuscitation) in a playful and accessible environment.
Markus Sauerbeck & Miia Remahl
Built (with instructions): https://miumay.itch.io/acidmind1?password=goldsmiths
The past year was challenging for many people in different ways: not only being separated from loved ones due to various restrictions, but also having limited opportunities to free your mind after a stressful day and stimulate yourself with a new experience. The Virtual Reality experience “Acid Mind” brings new visual and audio stimuli into your everyday life. Explore a world of colours in an interactive and relaxing way – safe, legal and free of side effects – immersed in the virtual world.
The application uses the latest hand-tracking technology and combines shader, VFX and sound techniques into a playful experience. It can be seen as a virtual exhibition of interactive art pieces, manipulated by the user and generated with the Unity engine.
We learned to combine Virtual Reality, hand tracking, shaders and VFX, as well as sound-reactive shaders and VFX. We measure our success by our ability to collaborate on the project both via Teams and in person. We followed an MVP (minimum viable product) approach, combining individually developed pieces into one scene; as soon as one teammate discovered an exciting approach, such as a sound-reactive shader, we combined it with techniques from the other teammate.
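The core of a sound-reactive shader like the one the team describes is extracting a loudness value from the live audio buffer and smoothing it before feeding it to a material parameter. A minimal sketch of that audio-to-parameter step (function names are hypothetical, not from the project):

```python
import math

def rms_amplitude(samples):
    """Root-mean-square loudness of one audio buffer (floats in [-1, 1])."""
    if not samples:
        return 0.0
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def loudness_to_param(samples, smoothing=0.8, previous=0.0):
    """Smooth the raw loudness so the driven visual parameter doesn't flicker.

    The returned value would be passed to the shader each frame
    (in Unity, e.g. via Material.SetFloat).
    """
    target = rms_amplitude(samples)
    return smoothing * previous + (1.0 - smoothing) * target
```

The exponential smoothing is the important design choice: raw per-buffer loudness jumps around too quickly to look good when mapped directly onto colour or displacement.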
Kevin Kuhn Agnes, Sasha Jiang, Yiru Yu, Ye Fu, Sachin Kodati
Ascend VR is a meditation application designed for a single HMD user that runs for approximately 10 minutes. It takes the user on a guided meditation, immersing the user in responsive virtual environments with 360 graphics, spatial sounds and haptic feedback.
Throughout the experience, a voice narration guides the user through breathing and visualisation exercises across four scenes: Void, Ocean, Sky and Space. The user begins as a fish swimming to the ocean surface, then becomes a seabird flying towards the horizon and finally a celestial entity in space – a journey of ascension that is both physical and spiritual.
Targeting an audience who find traditional meditation methods difficult, Ascend VR helps the user settle quickly into a meditative state through its seamless combination of immersive qualities and meditation techniques. Through the narrative, the user is meaningfully connected to the virtual environment and more likely to achieve mindfulness.
I want to share my childhood nightmare in this project. I accidentally watched the movie "Deep Rising" when I was a little girl, and it left me with an indelibly horrifying memory. In this project I used Shader Graph, a flocking system, post-processing and swimming mechanics. These techniques build an underwater world, and I used audio to heighten the atmosphere and tension.
Two of the most interesting techniques I learned this time were Shader Graph and post-processing. I applied shaders to the fish; all textures and animations are created inside Unity, and the swimming speed, texture colours and flocking behaviour can all be customised. Post-processing makes the game world smoother and more beautiful, which helps players feel more immersed.
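A flocking system like the one used for the fish is typically a variant of the classic boids model: each agent blends three steering rules – cohesion, separation and alignment. A minimal 2D sketch under that assumption (the project's actual parameters and structure are not published):

```python
import math

class Boid:
    def __init__(self, x, y, vx, vy):
        self.x, self.y, self.vx, self.vy = x, y, vx, vy

def flock_step(boids, cohesion=0.01, separation=0.05,
               alignment=0.1, min_dist=1.0):
    """One update of the three classic boids rules (needs >= 2 boids)."""
    updates = []
    for b in boids:
        others = [o for o in boids if o is not b]
        # Cohesion: steer toward the centre of the other boids.
        cx = sum(o.x for o in others) / len(others)
        cy = sum(o.y for o in others) / len(others)
        vx = b.vx + (cx - b.x) * cohesion
        vy = b.vy + (cy - b.y) * cohesion
        # Separation: push away from boids that are too close.
        for o in others:
            if math.hypot(o.x - b.x, o.y - b.y) < min_dist:
                vx += (b.x - o.x) * separation
                vy += (b.y - o.y) * separation
        # Alignment: match the average heading of the flock.
        avx = sum(o.vx for o in others) / len(others)
        avy = sum(o.vy for o in others) / len(others)
        vx += (avx - b.vx) * alignment
        vy += (avy - b.vy) * alignment
        updates.append((vx, vy))
    # Apply all updates at once so every boid sees the same snapshot.
    for b, (vx, vy) in zip(boids, updates):
        b.vx, b.vy = vx, vy
        b.x += vx
        b.y += vy
```

Tuning the three weights is what gives different "species" their character – tighter cohesion reads as schooling fish, stronger separation as wary predators.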
‘Non-Euclidean Room Tour’ is a physical-walking, controller-free VR experience prototype that performs a time shift and a space transition simultaneously. Users can find any living-room-sized space and interact with their body – playing instruments with their own hands and moving around on foot.
Hands are the principal interaction and also the hardest part to get right. Hand tracking and gesture detection were implemented on top of Oculus Quest’s hand-tracking package. Even without physical haptics, prompt feedback gives different testers different levels of tactile illusion. By introducing the concept of the ‘non-Euclidean’, three virtual rooms were compressed to fit into one room-scale experience; the implementation is based on the stencil buffer. This creates impossible-geometry illusions in the maze and also solves the space problem posed by real walking. Functions from my previous project were redesigned: the hand menu, grabbing, drawing and portals were upgraded into more refined interactions.
The goal of this prototype is to give users a more intuitive feeling. Two questions arose during the process: is walking a locomotion method that does not break the illusion, and to what extent does walking, rather than controller movement, reduce symptoms of motion sickness? This project gives a very positive answer, putting into practice the illusion of a potentially infinite world. For haptics, a Wireality-style worn multi-string hand-haptics device could be a good solution for future iterations.
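The stencil rendering aside, the spatial bookkeeping behind compressing several virtual rooms into one physical space can be reduced to a per-room offset: every tracked position is drawn at physical position plus the active room's offset, and crossing a doorway plane swaps the active room. A minimal sketch of that remapping logic (names and values are hypothetical, not from the project):

```python
# Three virtual rooms share one physical play space; each room is a
# region of virtual space reached by adding a fixed offset (in metres).
ROOM_OFFSETS = {0: (0.0, 0.0), 1: (20.0, 0.0), 2: (40.0, 0.0)}

def virtual_position(physical_pos, room):
    """Map a tracked physical (x, z) position into the active virtual room."""
    ox, oz = ROOM_OFFSETS[room]
    return (physical_pos[0] + ox, physical_pos[1] + oz)

def update_room(room, prev_pos, new_pos, doorway_x=2.0):
    """Advance to the next room when the player walks through the
    doorway plane at x = doorway_x (crossing in the +x direction)."""
    if prev_pos[0] < doorway_x <= new_pos[0]:
        return (room + 1) % len(ROOM_OFFSETS)
    return room
```

The stencil buffer then handles the visual side of the same trick: through the doorway the next room's geometry is rendered, so the swap is invisible even though the player never leaves the physical room.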
The aim was to create an interactive experience that lets people connect with classical art pieces, such as Vincent van Gogh’s “Bedroom in Arles”. With all the opportunities the technology offers, both creative and technical, I delivered a piece with far-reaching, innovative storytelling through the immersive VR medium.
Experimenting with gaze interaction proved positive during development; it was used for a “magical” scene transformation between the gallery space and the visited painting. The sound and environment reacting to the player’s input create the sense of releasing time that had been frozen in the original 2D painting.
Through small-scale user testing with my housemates, I collected valuable feedback and measured the impact of the experience on first-time VR users, which showed promising results for further development of the place and plausibility illusions.
The Temple of Illusions
The Temple of Three Illusions is an exploration of optical illusions within virtual reality. Set in a Greek temple enclosed within a mystical forest, the audience is invited to observe and experience three optical illusions. The idea stems from the notion that virtual reality is itself an illusion (of place, plausibility and presence) made possible by our sensory organs and computer graphics. I’m interested in seeing how optical illusions transfer from reality to virtual reality, and especially the moments when certain physiological illusions cannot be digitally documented. There is a philosophical puzzle in the duality of realness: what you have experienced is undoubtedly real, yet has no place in reality – even simulated ones. As we strive for increasingly realistic simulations, it is worth taking the occasional step back to consider the perceptual nature of our experiences in virtual reality.
Save the Dogs
Video about the Walking Pad: https://youtu.be/D7eeEsnUzK0
In a western town, a red devil has imprisoned all the dogs – and you are the chosen one to save them!
In this game, the player is encouraged to explore the large environment using a walking pad, a device that detects the player's walking-in-place foot gestures. Town villagers are spread across the map to engage and support the player's quest. From the knight sheriff, the player receives a watch and a compass that points to the dogs' location. When the player reaches the prison, they must fight the boss to free the dogs.
Using their feet to walk and their hands to interact, the player moves through the environment, talks to villagers, punches enemies, pulls gates and pets rescued dogs.
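A common way to turn walking-in-place foot gestures into locomotion, as a walking pad does, is to count threshold crossings in the tracked foot-height signal: a step is registered when the foot rises above a lift threshold and completes when it drops back down. A minimal sketch of that detector (thresholds are illustrative, not the device's actual values):

```python
def count_steps(heights, lift_threshold=0.05):
    """Count walking-in-place steps from a foot-height signal (metres).

    A step is one rising crossing of the lift threshold; the foot must
    come back below half the threshold (hysteresis) before the next
    step can register, which filters out sensor jitter near the line.
    """
    steps = 0
    lifted = False
    for h in heights:
        if not lifted and h > lift_threshold:
            lifted = True
            steps += 1
        elif lifted and h < lift_threshold * 0.5:
            lifted = False
    return steps
```

The step rate would then drive the player's forward speed each frame, so faster stepping in place means faster movement through the town.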
House Party VR
Meet at your home – worldwide.
Covid kept us from spending time with our loved ones. With House Party VR you can invite your friends into your digital-yet-real home and bring back a feeling you may have missed for a long time. The application is a social-VR multiplayer concept: the user scans their real environment with photogrammetry and, within seconds, can invite friends from all over the world into it. By placing games and interactive elements, the user can benefit from all the affordances of Virtual Reality. So: let’s start the party and connect!
For this application I used a game engine entirely new to me (Unreal Engine), which was a huge benefit in terms of graphic quality and development speed. For coding I learned node-based Blueprint scripting, which felt much faster than development in Unity. I tried out different photogrammetry apps that work with the LiDAR scanner and obtained both low-poly models and highly detailed output (e.g. with RealityCapture). I expect this workflow of bringing a real environment into your own virtual space to soon become a standard for social AR/VR, especially as LiDAR scanning software develops further.
4Data began as a collection of demo scenes exploring various data visualizations and narratives in VR. After further development, the project focused on one specific visualization technique: scatterplots. 4Data lets the user explore a dataset studying the relationship between temperatures, wind speeds and the ozone layer.
This project aims to help people release real-life pressure. The idea behind building an immersive, dream-like scene is that people can feel fully relaxed in a dream, so they can release pressure without any concern. In the scene, the player can smash the texts and other objects using the trigger and grip buttons on the controller: press the trigger (the button at the front of the controller) to shoot, and the grip (the button on the side) to throw. To avoid the controls adding any unexpected pressure of their own, the bullet's speed and angle are pre-calculated, so the player does not need to aim very accurately.
Every object in the scene has a meaning: the 3D texts represent each of your real-life pressures; the cat represents yourself in the past or future; the heart that appears at the end represents your determination; and the broken screen signals the end of the dream – time to wake up and face real life.
The main technology in this project is custom-made shaders: the ripple effect on the wall and ground when you touch them, the reflection effect, the outer glow effect, the smashing effect and the broken screen (post-processing after camera rendering). The ripple and smashing effects also respond to the player's interaction.
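Pre-calculating a bullet's speed and angle so the player always hits, as described above, usually means solving the ballistic equation for the initial velocity given a chosen flight time: v = (target − origin)/t − g·t/2. A minimal sketch of that aim-assist calculation (the function name and flight time are illustrative, not from the project):

```python
def aim_velocity(origin, target, flight_time=0.5,
                 gravity=(0.0, -9.81, 0.0)):
    """Initial velocity that carries a projectile from origin to target
    in flight_time seconds under constant gravity.

    Derived from target = origin + v*t + 0.5*g*t**2, solved for v:
        v = (target - origin)/t - 0.5*g*t
    """
    return tuple((t - o) / flight_time - 0.5 * g * flight_time
                 for o, t, g in zip(origin, target, gravity))
```

Fixing the flight time rather than the launch speed is the design choice that makes this "forgiving": whatever the player roughly points at, the projectile arrives on target after the same half-second arc.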
Plastizoic VR Experience
This experience aims to raise awareness among the audience: plastic pollution is one of society's biggest problems. I believe that the VR medium and unconventional storytelling can make the issue more approachable and understandable. Data visualization in three-dimensional space is a powerful way to present abstract ideas and create awareness of what stands behind the numbers. With non-linear storytelling the user is free to explore the space, interact with objects and live within the narration, rather than being a passive recipient. From a technical point of view, the precision of on-trigger events was very important, so that the user understands the message clearly without disruption. I experimented with and implemented the NVIDIA FleX physics beta for real-time, particle-based simulation of the "plastic ocean" bubbles, which intensify the subliminal message.