EmbraceVR was an immersive art piece that used emerging VR / AR tech (including the Teslasuit, ZED camera, Oculus Rift and more) to teach players the importance of connecting with their fellow humans through the power of hugs!

Fellow programmer Zachary Wilken and I started out with only a basic concept for what the game should be about, and were told to make something that used all the tech we were given. The premise was simple: depressed NPCs walk around, visibly decaying from loneliness as they murmur tales of woe, and the player must walk up and hug them (after asking for permission first). They reciprocate the kindness by hugging back, becoming cheerful and reversing the decay. The world itself is slowly falling apart as well, but visibly brightens and becomes jubilant as more people are saved from solitude. It's up to you to save the world with hugs!










Most of the gameplay was prototyped in a day and the rest fleshed out over the first week. The design was simple: walk up to someone, literally ask them for a hug (with actual voice recognition via the microphone built into the Vive), hug them when they respond positively until they hug back, and repeat until the world is saved. All of it ran with placeholder art assets standing in for the final models and the Teslasuit-wearing hugger. Integrating VR was also a cinch, as I'd worked with SteamVR before and was already familiar with using the Vive in Unity projects.
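That core loop boils down to a small per-NPC state machine. The sketch below is a hypothetical reconstruction (the actual project was a Unity/C# build; names like `on_voice_request` are illustrative, not from the real codebase), shown in Python just to make the logic concrete:

```python
from enum import Enum, auto

class NPCState(Enum):
    LONELY = auto()    # decaying, murmuring tales of woe
    ASKED = auto()     # player asked for a hug and got a "yes"
    HUGGING = auto()   # player is embracing, waiting for the hug back
    CHEERFUL = auto()  # saved; decay reverses

class NPC:
    def __init__(self):
        self.state = NPCState.LONELY

    def on_voice_request(self, accepted: bool):
        # fired when voice recognition hears the player ask for a hug
        if self.state is NPCState.LONELY and accepted:
            self.state = NPCState.ASKED

    def on_embrace(self):
        # fired when the player's arms close around the NPC
        if self.state is NPCState.ASKED:
            self.state = NPCState.HUGGING

    def on_hug_back(self):
        # the NPC reciprocates and becomes cheerful
        if self.state is NPCState.HUGGING:
            self.state = NPCState.CHEERFUL

def world_saved(npcs):
    # the world is saved once every NPC has been hugged back to cheer
    return all(n.state is NPCState.CHEERFUL for n in npcs)
```

Gating each transition on the previous state is what makes "ask first, then hug, then wait for the hug back" enforceable even with noisy voice and pose input.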

Upgrading from placeholder boxes to people was done with the help of fellow animation interns Wasan Hayajneh and Lumi Sume. We were sponsored by RenderPeople.com and allowed to use a select few of their standing person models, but those lacked the specific hugging / turning / idle sequences we envisioned, and fluid, realistic movement is absolutely critical for immersion in any VR experience. Enter Wasan and Lumi, who were recruited practically overnight and quickly made us everything we needed and more, with very efficient minor edits when specific models needed them (one was a child, men and women needed different idle poses, etc.). Zach and I went through a crash course in rigging animations to set them up in-game, after which we sat in a holding pattern waiting for the critical equipment to arrive.


The Teslasuit arrived a few weeks later, and true to its name it was a suit made for shocking. Its initial default power level was enough to completely numb my arm and leave lasting red marks where the sensors touched; after some testing, about 5% of the default starting voltage made for a 'comfortable' shock. Combined with some tinkering with the suit's vibration function, we ended up with a vaguely hug-like sensation as long as the suit was in direct contact with the user's skin, meaning players would have to remove their shirts before putting it on for the full effect.
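The intensity tuning amounts to clamping and scaling whatever the game wants to send before it ever reaches the hardware. This is a minimal sketch of that idea, assuming a made-up unit scale; none of these names come from the actual Teslasuit SDK:

```python
DEFAULT_LEVEL = 100.0  # the suit's factory default level (arbitrary units)
COMFORT_SCALE = 0.05   # ~5% of default was the 'comfortable' shock we found

def hug_intensity(closeness: float) -> float:
    """Map hug closeness in [0, 1] to a haptic level, hard-capped
    at the comfortable fraction of the suit's default output."""
    closeness = max(0.0, min(1.0, closeness))  # never exceed the cap
    return DEFAULT_LEVEL * COMFORT_SCALE * closeness
```

Clamping the input before scaling means a gameplay bug can never ask the suit for more than the tested-safe level, which matters a lot when the failure mode is a numb arm.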

Syncing suit movements in the game was much less straightforward, as the mocap sensors on the suit were glitchy when they worked at all. Arms would flail indiscriminately while you stood still, movements would randomly be mirrored in the opposite direction, and standing still to calibrate in a standard T-pose looked like a random-yoga-pose generator on screen. The suit's SDK was made available to our team and we manually debugged and placeholder-fixed as much as we could, but plenty of hardware issues required working with (and patches from) the Europe-based creators. Most of the issues were solved literally on the last day, when they pushed a firmware update based directly on our feedback.













Near the end of production, the other pieces of equipment slowly started to arrive. ZED cameras were clipped onto the front of the Vives, letting us capture the real world while the player looked through the headset and superimpose our VR characters where they belonged on the Vive's screen. Projectors let us display imagery on the real-life walls instead of on virtual in-game walls, making the game noticeable to onlookers. Wireless support for the Vive arrived that 'would' have helped expand the limited space we were allowed to walk around in, but the ZED camera still needed its own power cables running from the top of the Vive, so the difference was minimal.

After some fiddling on-site, the display was working and patrons were lining up to test it out. Most players didn't take off their shirts for the full hugging effect of the Teslasuit, but they appreciated the interactive sections, and for many it was their first experience with virtual reality. Spectators could glimpse what the players were seeing through our developer console, but were more keen on watching the players jump around on stage, exclaiming when they felt the characters hugging back, or watching someone they'd ignored dissolve; the display itself constantly shifted from grayscale to colorful via the projectors as players progressed through the game. The players enjoyed it, the show managers enjoyed the novelty of it, our head manager Skye was pleased with it, and despite the bumps in development it was a cute, concise VR experience.
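The grayscale-to-color progression described above is, at its core, a desaturation blend driven by how many NPCs have been saved. A minimal sketch of that effect, assuming standard luminance weights and hypothetical function names (the real version would live in a Unity shader or material parameter):

```python
def world_saturation(saved: int, total: int) -> float:
    """0.0 = fully grayscale world, 1.0 = fully colorful."""
    return 0.0 if total == 0 else min(1.0, saved / total)

def tint(rgb, saturation):
    """Blend a color toward its grayscale luminance as saturation drops."""
    r, g, b = rgb
    # Rec. 601 luminance weights for the grayscale value
    gray = 0.299 * r + 0.587 * g + 0.114 * b
    lerp = lambda a, b, t: a + (b - a) * t
    return tuple(lerp(gray, c, saturation) for c in (r, g, b))
```

Driving every surface's tint from one `world_saturation` scalar is what makes the whole scene (and the projector output) brighten in lockstep as players save NPCs.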