Event Planning VR

Event Planning VR was the research piece created for my final year dissertation project of my BSc in Games Programming. It was developed to test and compare the differences and nuances between a standard PC learning environment and a fully tracked virtual reality environment.

The base scenario was planning and building an arena for a concert, with the user scored on their testing performance, equipment usage, and objective completion. Each user was recorded while using the system, allowing for later playback and analysis of their experience.

The system was developed within Unity 5.5, with the VR version built for the HTC Vive headset. To help with this, the VR Toolkit (VRTK) was used to bridge interactions between SteamVR and the system's codebase. Three complete systems were developed for the project: the VR version, the PC version, and a separate playback version for the recorded data.

Creation of the project was split into four main segments: research, development, testing, and analysis.

To gain a better understanding of the task at hand, research was undertaken into a number of key areas. A large portion of this fell into the category of gamification of learning: discovering what had already been achieved within the field and how previous systems had been implemented. A look into the current VR market gave insight into people's interests, with the public showing greater interest in learning material than in games. Market predictions also indicated the scale of adoption that both the present and the future will hold, with an estimated 130 million yearly shipments of VR devices by 2025. The research showed that there is clear appeal and desire for VR content, with many people wanting to achieve things that would be more costly or far more logistically challenging in normal scenarios.

The development stage was primarily focused on the VR version, which took considerably more time to develop than the others. A few key principles were adhered to when developing the system to ensure successful testing and completion.

Firstly, both testing systems had to have exactly the same content throughout the entire process, meaning all features of one had to be replicated in the other as closely as possible. All explanations, tools, and equipment had to operate in the same way and have an equivalent version within the other system.

Secondly, any information the user received in one system had to be echoed in the other. Underlying workings, objective goals, and general explanations were all explained to the user in terms of the relevant usage methods for their system.

Finally, the interaction for both systems had to be implemented as similarly as possible. This meant that if one button or trigger press was used in VR, then a single button press or click should be required on the PC side.

Ensuring the user received a similar experience each time was paramount to the testing, which led to the development of a three-stage system. Each user would first receive a fully guided learning scenario in which they were taught how the equipment works and how to follow their objectives. This was explained via a fully voiced in-game character, allowing the user to focus on their task rather than having to read text. During the tutorial the character would listen and wait for inputs to ensure the user had understood what to do.

After the tutorial stage the user would be presented with the planning stage, where they would place all their equipment and complete their objectives. The user was required to complete a number of simple objectives, but had the freedom to use as much or as little equipment as they liked, as long as the objectives were met.

Finally, the user would undertake the testing scenario, where they would be given a song to play through. Their lighting and sound levels needed to match those provided on their switchboard, meaning they were required to be relatively active during this stage.
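As a rough illustration of this flow, here is a minimal sketch of a stage controller in Unity C#; the class, names, and advancement logic are hypothetical, not taken from the project's actual codebase:

```csharp
using UnityEngine;

// Hypothetical sketch of the three-stage session flow described above:
// each participant moves linearly from the guided tutorial into the
// planning stage and then the final testing scenario.
public class SessionController : MonoBehaviour
{
    public enum Stage { Tutorial, Planning, Testing, Complete }

    public Stage Current { get; private set; }

    void Start()
    {
        Current = Stage.Tutorial;
    }

    // Called when the current stage's requirements are met, e.g. the
    // tutorial character has received all expected inputs, or every
    // objective on the clipboard has been completed.
    public void AdvanceStage()
    {
        switch (Current)
        {
            case Stage.Tutorial: Current = Stage.Planning; break;
            case Stage.Planning: Current = Stage.Testing;  break;
            case Stage.Testing:  Current = Stage.Complete; break;
        }
        Debug.Log("Session advanced to " + Current);
    }
}
```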

The user's actions and positions were recorded automatically during the planning and testing stages, with the system sending small messages whenever certain actions were performed. This made the analysis stage both considerably easier and richer in data: being able to analyse users' performances after the fact, with all of their actions on record, allowed for much deeper comparisons.
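A message-based recorder of this kind might look something like the sketch below. All class, field, and method names here are illustrative assumptions rather than the project's real implementation:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch of the action recorder: gameplay code sends small
// messages ("equipment placed", "volume changed", etc.) which are
// timestamped and stored for later playback and analysis.
public class ActionRecorder : MonoBehaviour
{
    [System.Serializable]
    public struct RecordedAction
    {
        public float time;      // seconds since the stage began
        public string action;   // e.g. "PlaceEquipment"
        public string subject;  // e.g. "SpotLight_03"
        public Vector3 position;
    }

    private readonly List<RecordedAction> log = new List<RecordedAction>();
    private float stageStartTime;

    public void BeginStage()
    {
        log.Clear();
        stageStartTime = Time.time;
    }

    // Called by gameplay code whenever a notable action occurs.
    public void Record(string action, string subject, Vector3 position)
    {
        log.Add(new RecordedAction
        {
            time = Time.time - stageStartTime,
            action = action,
            subject = subject,
            position = position
        });
    }

    // Serialise the session so it can be replayed in the playback build.
    // JsonUtility cannot serialise a bare list, hence the wrapper struct.
    public string ToJson()
    {
        return JsonUtility.ToJson(new Wrapper { actions = log.ToArray() });
    }

    [System.Serializable]
    private struct Wrapper { public RecordedAction[] actions; }
}
```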

An arena was created for the system that consisted of three main areas: the arena floor, the stage, and the equipment balcony. The stage was directly affected by the lighting equipment the user placed, with different lights providing stronger or weaker effects. Lighting the stage meant covering all of its parts, which required the user to fulfil the requirements of both sufficient light power and sufficient stage coverage. During the testing stage, a crowd entered the arena floor to act as the test for the sound coverage. As with the lights, the sound levels were calculated from both the crowd coverage and the overall power of the speakers.
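A two-part check like this could be evaluated roughly as in the following sketch, which samples the stage at a set of points; the names and the exact scoring rule are assumptions for illustration, not the project's real formula:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch of the two-part lighting check described above:
// the stage is sampled at a set of points, and the score combines how
// much of the stage is covered with how much total light power is used.
public static class StageLightingScore
{
    public static float Evaluate(IList<Light> lights, IList<Vector3> stagePoints,
                                 float requiredPower)
    {
        int covered = 0;
        float totalPower = 0f;

        foreach (Light light in lights)
            totalPower += light.intensity;

        foreach (Vector3 point in stagePoints)
        {
            foreach (Light light in lights)
            {
                // A point counts as lit if it falls inside any light's range.
                if (Vector3.Distance(light.transform.position, point) <= light.range)
                {
                    covered++;
                    break;
                }
            }
        }

        float coverage = (float)covered / stagePoints.Count;
        float power = Mathf.Clamp01(totalPower / requiredPower);

        // Both coverage and power must be satisfied, so the score is
        // limited by the weaker of the two.
        return Mathf.Min(coverage, power);
    }
}
```

The same shape of calculation would apply to the sound test, swapping stage sample points for crowd positions and light ranges for speaker ranges.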

The VR version made use of: a physical toolbox (for equipment), a physical clipboard (for objectives), a physical switchboard (for changing volume and light levels), and direct object interaction (picking up and moving objects with the hands/controllers). Both the toolbox and clipboard were holstered at the player's waist, ensuring simple and easy usage.

Each controller housed a set of options on its touchpad: the left controller offered options for movement and teleportation, while the right housed arena-based tools. When the user touched a piece of equipment with either controller, the menus would change to the options available for that equipment; for example, the lights could have their colours changed. If the user wanted to remove equipment, they simply dragged it into the bin on the side of the toolbox.
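The context-sensitive menu swap could be approximated with plain Unity trigger events, as sketched below; this is a simplified illustration rather than the actual VRTK-based implementation, and the tag and field names are assumptions:

```csharp
using UnityEngine;

// Hypothetical sketch of the context-sensitive controller menus: while a
// controller is touching a piece of equipment, its touchpad menu swaps
// to the options for that equipment (e.g. light colour), then reverts.
public class ControllerMenu : MonoBehaviour
{
    public GameObject defaultMenu;    // movement/teleport or arena tools
    public GameObject equipmentMenu;  // options for the touched equipment

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Equipment"))
        {
            defaultMenu.SetActive(false);
            equipmentMenu.SetActive(true);
        }
    }

    void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Equipment"))
        {
            equipmentMenu.SetActive(false);
            defaultMenu.SetActive(true);
        }
    }
}
```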

Comparatively, the PC version had the majority of these items replaced with UI elements. The toolbox, switchboard, and objectives were all replicated as on-screen UI elements, though just as in the VR version they could all be kept open or visible at all times. Equipment interaction was replicated with left clicks for movement and right clicks for options.
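The mouse side of that mapping might look like the sketch below, which raycasts into the scene on each click; the class, layer setup, and drag behaviour are illustrative assumptions:

```csharp
using UnityEngine;

// Hypothetical sketch of the PC-side equipment interaction: a left click
// on a piece of equipment begins dragging it, and a right click opens its
// options, mirroring the two touch interactions in the VR version.
public class PcEquipmentInput : MonoBehaviour
{
    public Camera viewCamera;
    public LayerMask equipmentLayer;

    private Transform dragged;

    void Update()
    {
        if (Input.GetMouseButtonDown(0)) // left click: start moving
        {
            RaycastHit hit;
            Ray ray = viewCamera.ScreenPointToRay(Input.mousePosition);
            if (Physics.Raycast(ray, out hit, 100f, equipmentLayer))
                dragged = hit.transform;
        }

        if (Input.GetMouseButtonUp(0))
            dragged = null;

        if (dragged != null)
        {
            // Follow the mouse along the floor plane (y = 0) while dragging.
            Ray ray = viewCamera.ScreenPointToRay(Input.mousePosition);
            Plane floor = new Plane(Vector3.up, Vector3.zero);
            float distance;
            if (floor.Raycast(ray, out distance))
                dragged.position = ray.GetPoint(distance);
        }

        if (Input.GetMouseButtonDown(1)) // right click: open options
        {
            RaycastHit hit;
            Ray ray = viewCamera.ScreenPointToRay(Input.mousePosition);
            if (Physics.Raycast(ray, out hit, 100f, equipmentLayer))
                Debug.Log("Open options for " + hit.transform.name);
        }
    }
}
```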

Testing of the system was completed over the course of a month, with the PC testing completed remotely by participants. Each user had one full playthrough of the system and was then asked to complete a survey based on their experience. The survey was almost identical for both versions, with slight changes tailored towards each system's specific interactions. All users were also asked about any changes or modifications they might make to the systems, giving an insight into a more personalised aspect of their experience.

Analysis was completed after both versions had finished testing, with the playback version of the build being developed during this stage. Due to the nature of the original versions of the system, the playback version was incredibly easy to create (same engine, same environment, and the majority of interaction removed). Surveys for both versions were compared, along with the statistics and patterns acquired from the recorded data.
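Continuing the hypothetical recorder sketch from earlier, a playback component might simply step through the recorded actions in time order and re-apply them to the same scene; again, all names here are illustrative assumptions:

```csharp
using UnityEngine;

// Hypothetical sketch of the playback build: recorded actions are stepped
// through in time order and re-applied to the same arena scene, with the
// original interaction code stripped out.
public class SessionPlayback : MonoBehaviour
{
    public TextAsset recordingJson;   // a saved session, as produced above

    private ActionRecorder.RecordedAction[] actions;
    private int next;
    private float startTime;

    void Start()
    {
        actions = JsonUtility.FromJson<SessionWrapper>(recordingJson.text).actions;
        startTime = Time.time;
    }

    void Update()
    {
        float elapsed = Time.time - startTime;
        while (next < actions.Length && actions[next].time <= elapsed)
        {
            Apply(actions[next]);
            next++;
        }
    }

    void Apply(ActionRecorder.RecordedAction action)
    {
        // In the real build this would move equipment, change light colours,
        // and so on; here we just log the event for illustration.
        Debug.Log(action.time + "s: " + action.action + " -> " + action.subject);
    }

    // Must match the JSON shape written by the recorder sketch.
    [System.Serializable]
    private struct SessionWrapper { public ActionRecorder.RecordedAction[] actions; }
}
```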

It was found that VR users spent considerably more time adjusting, repositioning, changing, and customising their setups. This correlated with VR users generally finding the controls and equipment interaction easier than the PC users did. PC users also found general movement slightly more challenging than the VR users, mostly due to the barriers implemented between the upper and lower areas of the arena. Since VR users were not able to fly around freely, and so could not place lights or speakers in irregular locations, the barriers were put in place to ensure consistency between versions.

VR users tended to find the planning and testing scenarios more enjoyable than the tutorial stage, as the hands-on engagement provided a more stimulating experience. The majority of comments about the VR experience noted that the simulation actually gave them the feeling of being in the arena, acting as a proper DJ or crew member. In contrast, the PC users did not feel quite as immersed, but still generally enjoyed the objectives and the experience. There were a number of comments about lacklustre controls from certain users, although others felt they were implemented perfectly.

Overall, the final project created an engaging and interesting experience for the users, with high-quality results and data produced for both groups. The process helped improve my overall system development skills, while also giving me an insight into translating real-world items and tools into the virtual reality space. Throughout the project lifecycle I also had the experience of presenting my work both in London and at Beijing Normal University, Zhuhai, as part of the Bournemouth University Global Festival of Learning in China. I achieved an overall first grade for the project.