
2020 Innovation Grant Retrospective
The year was 2020. While the COVID pandemic raged across the globe, most Americans retreated into their homes to bake sourdough bread, cultivate houseplants, and buy exercise equipment. We watched shows about tiger kings, space bounty hunters smuggling orphans, British bake-offs, and the Rose family's new life in a small town, all while we tried to convince ourselves we really didn’t enjoy traveling anyway. Thankfully, I was awarded one of the 2020 Vituity Innovation Grants and had a project to occupy plenty of my time in quarantine. My idea was to create a virtual reality ER orientation treasure hunt for the Oculus Quest VR headset. By leveraging the immersion that VR provides, I proposed that new employees could become oriented to their new workspace before ever setting foot there in real life. So, with my shifts reduced and the world shut down, I settled in to write my initial design document in August of 2020...


All that was left to do was model the ER in 3D, import it into the Unity game engine, and build the game! With a deep breath, I dove in. One of my residents who used to be an architect (convenient!) helped me with an initial 3D sketch of the department that I was able to tweak in Blender and bring into Unity.

My Trello project started with broad goals (e.g., “Level Design Completed”), and I quickly found myself breaking each task down into five or more smaller, bite-sized accomplishments (walls completed, doors in place, chairs placed, etc.) that would reassure me I was making progress even when I only had an hour or two here and there to work. Once I got bored with level design, I moved on to prototyping the treasure hunt functionality.
September 2020

My main goal in the design was to make the system flexible across different items and locations, easily allowing, for example, a physician orientation and a nursing orientation within the same environment but with different collections of items. I created systems to track overall game time and per-item times. I also got a waypoint system working: if you don’t know where an object is located, you can trigger a waypoint to appear. A custom shader lets the waypoint be seen through and over walls so you can follow it from anywhere, and it disappears when you reach the destination.
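The project itself is written in C# for Unity, but the timer and waypoint logic described above is engine-agnostic. Here is a minimal Python sketch of that logic — the class, its method names, and the arrival radius are all hypothetical, purely for illustration:

```python
import math

class TreasureHunt:
    """Sketch of the game clock, per-item timers, and a waypoint
    that disappears once the player reaches the target item."""

    ARRIVAL_RADIUS = 1.5  # meters from the item at which the waypoint hides

    def __init__(self, items):
        # items: {item_name: (x, y, z) location in the department}
        self.items = dict(items)
        self.item_times = {}         # seconds spent finding each item
        self.active_waypoint = None  # item the waypoint currently points to
        self.game_time = 0.0
        self._last_found = 0.0

    def tick(self, dt):
        """Advance the overall game clock by dt seconds each frame."""
        self.game_time += dt

    def toggle_waypoint(self, name):
        """Show the see-through-walls waypoint for a lost item."""
        self.active_waypoint = name

    def update_player(self, position):
        """Hide the waypoint once the player is close enough to the target."""
        if self.active_waypoint is not None:
            target = self.items[self.active_waypoint]
            if math.dist(position, target) <= self.ARRIVAL_RADIUS:
                self.active_waypoint = None

    def found_item(self, name):
        """Record how long this item took, measured from the previous find."""
        self.item_times[name] = self.game_time - self._last_found
        self._last_found = self.game_time
```

In Unity, `tick` would be driven by `Update()` and the waypoint hide/show by collider or distance checks, but the bookkeeping is the same idea.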
I integrated a save system keyed to the user’s name to record high scores, along with a small event-based achievement system to track whether the user passed the tutorial (completing it under the time limit and without using the waypoints), and I even managed to auto-upload the scores to a Google Doc upon each completion of the game.
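The original doesn't describe exactly how the achievement check or the upload is wired up, so the following Python sketch is hypothetical: the time limit, payload fields, and endpoint (e.g., a web app bound to a spreadsheet) are assumptions, not the project's actual values.

```python
import json
import urllib.request

TUTORIAL_TIME_LIMIT = 300.0  # seconds; hypothetical threshold

def passed_tutorial(total_time, waypoints_used):
    """Achievement check: finish under the time limit without waypoints."""
    return total_time <= TUTORIAL_TIME_LIMIT and waypoints_used == 0

def upload_score(endpoint, name, total_time, item_times):
    """POST a completed run to a score-collection web endpoint.

    The endpoint URL and JSON schema are illustrative only; any
    service that accepts an HTTP POST (such as a script that appends
    rows to a shared document) would work the same way.
    """
    payload = json.dumps({
        "name": name,
        "total_time": total_time,
        "item_times": item_times,
    }).encode("utf-8")
    req = urllib.request.Request(
        endpoint,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Firing the upload from a single "game completed" event keeps the data collection invisible to the user, which is what makes it so handy for research.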
November 2020

Earlier this year I improved much of the user interface, added a pause menu and player settings, built a separate optional tutorial room to introduce the user to VR and the treasure hunt functionality, and, of course, did lots and lots of optimization.
January 2021

Finally, on April 30th, 2021 (right on schedule!), after hundreds of hours of modeling, programming, optimizing, and beta testing, I threw in the creative towel and declared the project complete.
April 2021

Random Thoughts
Little to nothing happens in game development without intention, and most things will take twice as long as you thought. This is reinforced for me every time I work on a new project. The creativity this allows and encourages is highly rewarding, but it can make timelines and expectations difficult to gauge. Developing a VR experience introduces many new design challenges. Given the immediate immersion achieved with VR, we must emphasize intuitive interactions, as the user will expect the game world to work as it does in real life. Furthermore, on an enterprise VR project, the potential user could easily be a complete novice to gaming in general, much less VR. To be an adequate widespread solution, it’s crucial to have safeguards in place that minimize VR-related motion sickness as the product is deployed across a larger, more diverse workforce.
Sometime in January 2021 I had a rude awakening when I finally installed the current test build on the Oculus Quest headset I intended to use for the final project. Prior to this I had been running the game on my powerful PC hardware, where it ran great. Of course, putting it on the limited Quest hardware quickly showed me I was in for a crash course in optimization. Though frame drops and poor performance can be easily dismissed in a traditional gaming experience, in VR they can quickly lead to motion sickness and discomfort. Subsequently, through a steady diet of YouTube videos, endless forum posts, a host of plug-ins, numerous Discord servers, and a few of my own dances with motion sickness, I learned to harness the Unity optimization tools — frustum and occlusion culling; baked lighting, textures, and meshes; texture atlases; and script improvements — to achieve acceptable performance even on a stand-alone device.
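Most of those techniques are engine settings rather than code, but the core idea of frustum culling — skip rendering anything entirely outside the camera's view volume — can be sketched in a few lines. This is an illustrative, engine-agnostic Python sketch (Unity does this internally); the bounding-sphere test and all names are hypothetical:

```python
def sphere_outside_plane(center, radius, plane):
    """plane = (nx, ny, nz, d) with the normal pointing into the frustum.
    A bounding sphere is outside if it lies entirely on the negative side."""
    nx, ny, nz, d = plane
    signed_dist = nx * center[0] + ny * center[1] + nz * center[2] + d
    return signed_dist < -radius

def frustum_cull(objects, planes):
    """Keep only objects whose bounding sphere might intersect the frustum.

    objects: list of (name, center, radius); planes: the frustum's
    inward-facing planes (a real camera frustum has six).
    """
    visible = []
    for name, center, radius in objects:
        if not any(sphere_outside_plane(center, radius, p) for p in planes):
            visible.append(name)
    return visible
```

Occlusion culling goes a step further and also skips objects hidden behind walls — which is exactly why it pays off in a department full of rooms and corridors.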
Proof-of-Concept
During initial beta testing I had a couple of residents try out the program, and on their first attempts they were able to complete the entire treasure hunt with little to no assistance – I had successfully made the environment close enough to real life for them to be familiar and comfortable navigating the department they already knew. My expectation (and hypothesis) is that this feeling can be obtained in the opposite direction as well, allowing people who have never been in the space to feel comfortable on their first visit.
Virtual Doors
I will summarize days of keyboard smashing by saying that I hope never to have to make a physics-based door work in VR again, and I wouldn’t wish it upon my worst enemies. Godspeed, full-time VR developers.
The Little Things
In the end, I appreciate the little things I took detours to incorporate (yes, the doors included). Users log in with their name, which then appears on the in-game badge they use to sign into computers and open doors. There is background music in the game, and if you find the radio you can play, pause, or change the song. Having the scores auto-upload makes data collection for research incredibly easy, and I will certainly continue to incorporate this functionality across my other personal projects. I even managed to incorporate comfort options for VR, letting the user toggle a field-of-view vignette and alternate between continuous and teleportation locomotion (the latter causes less motion sickness) from a dynamic pause menu. These “extra” tasks gave me a boost to keep going when I was tired and ended up making the project feel much more complete.
Thank You!!!
Finally, I absolutely have to thank the Vituity Innovation Grant team for supporting this project. The grant funding gave me the flexibility to focus on this project and accelerate its development. I am excited to demo it with my incoming residents and traveling nurses over the coming year and to collect data on how it helps their comfort and orientation during their first days and weeks in the department.
