For his innovation time, our Web Team Lead, Kevin, brought the agency into the metaverse. Read our latest blog to see how he did it...
Time for the latest and greatest to come out of Freeish Time! Just to refresh your memory, “Freeish Time” is the opportunity for our Cubes to focus on their passion projects outside of their work commitments. They have an entire week to identify a problem, build and create a solution, then present it to the rest of the team during our Friday Happy Half Hour meeting.
This has become an integral part of 3 Sided Cube. The encouragement, and the time, to work purely on whatever their heart desires, all while getting a little reprieve from the hustle and bustle of the normal workload, is a welcome treat. Whether they opt to go the “innovation” or “education” route, that invaluable time to focus purely on whatever it is that intrigues them is pretty rad.
The sky’s the limit when it comes to Freeish Time. We got to sit down with Kevin, our RIDICULOUSLY talented Web Team Lead, to chat about his exciting innovation bringing 3 Sided Cube into the Metaverse…
I’ve always been super excited about new tech, and something I’ve always wanted to explore is virtual reality. So when this time came up it seemed like the perfect opportunity to use my week of innovation to really dig in and explore VR. The fact that Cube now has VR goggles available to us made it an easy decision!
I’ve also been curious about how 3D gaming works, so merging the two made sense: building a VR experience of the office was a great way to learn how to put together a 3D world while playing with virtual reality.
I broke down the task into two parts.
Before I began I picked up Sam’s 3D version of the office that he had built with MagicaVoxel during his innovation time.
Part 1. Set up a 3D version of the office in Unity.
Unity is a popular game engine that lets developers build 2D and 3D scenes and put them together into a game.
To begin with, I imported the work that Sam had done. As you can see, the first challenge was trying to figure out why there was no colour.
To get this to work, we were missing an important part of the export from MagicaVoxel: a PNG that contained colour information, which could be applied to a material. A material is a Unity object that describes how a model is coloured (it also includes various properties, like how shiny the model should be). Importing this PNG, building a material out of it, and then applying it to the office model gives us what we need.
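In script form, building that material looks roughly like this – a minimal sketch, assuming the palette PNG has been imported as a Texture2D and the script sits on the imported office model (in practice, the same steps can be done entirely in the Unity editor):

```csharp
using UnityEngine;

// Minimal sketch: build a material from the MagicaVoxel palette PNG
// and apply it to the imported office model. Field names are placeholders.
public class ApplyVoxelPalette : MonoBehaviour
{
    // Drag the exported palette PNG in via the Inspector.
    public Texture2D palettePng;

    void Start()
    {
        // Create a material from Unity's built-in Standard shader.
        var material = new Material(Shader.Find("Standard"));

        // The palette PNG becomes the main texture, which the
        // model's UVs index into for colour.
        material.mainTexture = palettePng;

        // Apply it to every mesh in the office model.
        foreach (var meshRenderer in GetComponentsInChildren<Renderer>())
        {
            meshRenderer.material = material;
        }
    }
}
```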
And voila! We have colour…
The next step was to add some first-person controls so the user can move around the model with the keyboard and mouse.
I followed a good tutorial on the process.
In short, I created what is called a “Rig” that includes a body and camera – the camera essentially being the user’s eyes.
The body and camera listen to the mouse and keyboard inputs and move accordingly.
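As a rough sketch, the controller script on such a rig looks something like this – simplified, and assuming a CharacterController on the body with the camera as a child (the field names are my own):

```csharp
using UnityEngine;

// Simplified first-person controller sketch: the body moves with WASD,
// the camera (a child of the body) pitches with the mouse.
[RequireComponent(typeof(CharacterController))]
public class FirstPersonRig : MonoBehaviour
{
    public Transform cameraTransform; // the user's "eyes"
    public float moveSpeed = 4f;
    public float lookSensitivity = 2f;

    private CharacterController controller;
    private float pitch; // accumulated up/down look angle

    void Start()
    {
        controller = GetComponent<CharacterController>();
        Cursor.lockState = CursorLockMode.Locked; // keep the cursor captured
    }

    void Update()
    {
        // Mouse: yaw rotates the whole body, pitch only tilts the camera.
        float yaw = Input.GetAxis("Mouse X") * lookSensitivity;
        pitch = Mathf.Clamp(pitch - Input.GetAxis("Mouse Y") * lookSensitivity, -89f, 89f);
        transform.Rotate(0f, yaw, 0f);
        cameraTransform.localEulerAngles = new Vector3(pitch, 0f, 0f);

        // Keyboard: WASD maps to the built-in Horizontal/Vertical axes.
        Vector3 move = transform.forward * Input.GetAxis("Vertical")
                     + transform.right * Input.GetAxis("Horizontal");
        controller.SimpleMove(move * moveSpeed); // SimpleMove applies gravity
    }
}
```

Yaw goes on the body and pitch on the camera so the character turns as a whole, but only the view tilts up and down.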
At this point I was wondering why the colours didn’t look right. The textures and lighting looked different from the rendered model in MagicaVoxel, because this rendering information is not compatible with Unity – it has to be replicated within Unity itself.
Adding lighting was an easy addition here. To achieve it, I applied a technique called emissive lighting, where light is not a direct spotlight but essentially a glow. A great tutorial taught me how to apply that.
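On Unity’s Standard shader, that glow comes from the material’s emission settings. A minimal sketch – the intensity value here is an assumption to tune by eye:

```csharp
using UnityEngine;

// Sketch: make a material glow via emissive lighting on the Standard shader.
public class MakeEmissive : MonoBehaviour
{
    public Color glowColour = Color.white;
    public float intensity = 1.5f; // tune by eye

    void Start()
    {
        var material = GetComponent<Renderer>().material;

        // Enable the emission keyword so the Standard shader evaluates it,
        // then set the emission colour scaled by intensity.
        material.EnableKeyword("_EMISSION");
        material.SetColor("_EmissionColor", glowColour * intensity);
    }
}
```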
Unfortunately, I couldn’t adjust separate parts of the model, as they were all one object. One learning I took from this was that the model must be broken down – for instance, the glass doors should be a separate object that I can then make semi-transparent! This will be a great thing to explore in the next iteration of this project!
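For the record, once the doors are their own object, making them semi-transparent on the Standard shader would look something like this sketch (the alpha value is illustrative):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Sketch for the next iteration: switch a (separated) glass-door material
// to transparent blending on the Standard shader.
public class MakeGlassTransparent : MonoBehaviour
{
    [Range(0f, 1f)] public float alpha = 0.4f; // illustrative value

    void Start()
    {
        var material = GetComponent<Renderer>().material;

        // Standard-shader recipe for "Fade" style transparency.
        material.SetOverrideTag("RenderType", "Transparent");
        material.SetInt("_SrcBlend", (int)BlendMode.SrcAlpha);
        material.SetInt("_DstBlend", (int)BlendMode.OneMinusSrcAlpha);
        material.SetInt("_ZWrite", 0);
        material.EnableKeyword("_ALPHABLEND_ON");
        material.renderQueue = (int)RenderQueue.Transparent;

        // Drop the colour's alpha so the doors read as glass.
        var colour = material.color;
        colour.a = alpha;
        material.color = colour;
    }
}
```

This roughly mirrors what Unity does when you flip the Standard shader’s Rendering Mode to “Fade” in the Inspector.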
Throwing the movement and lighting together gave me this…
Part 2. The tech
The next step in my VR adventure was to create what is called an “XR” rig.
XR is an umbrella term that covers virtual reality (VR), augmented reality (AR), and mixed reality (MR) applications.
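At its simplest, an XR rig is a root object with a camera whose pose is driven by the headset. Here’s a hand-rolled sketch of that idea using Unity’s XR input API – in practice, the XR toolkit builds a far more complete rig for you:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Minimal hand-rolled XR rig sketch: drive the camera from the headset pose.
public class SimpleXRRig : MonoBehaviour
{
    public Transform head; // the camera, a child of this rig

    void Update()
    {
        var device = InputDevices.GetDeviceAtXRNode(XRNode.Head);

        // Copy the headset's tracked position and rotation onto the camera.
        if (device.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 position))
            head.localPosition = position;
        if (device.TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion rotation))
            head.localRotation = rotation;
    }
}
```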
The great news was that Unity has been working hard to make it easy for developers to work with XR (you can check out some of the documentation here). They have built an entire framework around it, with plenty of tools and implementations of different VR kits, to make it as universal as possible.
The VR headset we chose to develop with was the HTC Vive Focus 3. It’s an impressive bit of kit with some serious stats.
Stats aside, I did have a few issues getting the XR toolkit and Unity to communicate with the Vive headset (it would seem that the favourite choice amongst developers is the Oculus range of headsets).
Eventually, after doing some research, I figured out that I had to install some extra plugins in Unity to get the Vive to communicate with the XR toolkit.
Putting the two together
The XR toolkit really does a lot of the heavy lifting here (once I got the headset working with it) – the toolkit essentially maps the inputs from the headset into a set of universal inputs and outputs, much like how the letter “W” was previously mapped to going forwards.
The XR toolkit allowed me to apply what’s called “continuous movement” to my existing rig, letting the user move around with the joystick and look around with the headset.
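Under the hood, continuous movement boils down to reading the joystick as a universal 2D-axis input and translating the rig every frame. Here’s a hand-rolled sketch of roughly what the toolkit’s move provider does for you:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch of continuous movement: read the left controller's joystick as the
// universal "primary 2D axis" and move the rig relative to where the user looks.
public class ContinuousMove : MonoBehaviour
{
    public Transform head; // the headset-driven camera
    public float moveSpeed = 2f;

    void Update()
    {
        var controller = InputDevices.GetDeviceAtXRNode(XRNode.LeftHand);

        if (controller.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 axis))
        {
            // Project the head's facing onto the floor plane so pushing
            // "forward" moves where the user is looking, not into the sky.
            Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
            Vector3 right = Vector3.ProjectOnPlane(head.right, Vector3.up).normalized;

            Vector3 move = (forward * axis.y + right * axis.x) * moveSpeed * Time.deltaTime;
            transform.position += move;
        }
    }
}
```

Projecting the head’s direction onto the floor plane stops the joystick from flying the user into the ceiling whenever they look up.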
The main challenge was getting the Vive headset to communicate with Unity’s XR toolkit. It seemed that the headset wasn’t particularly well supported, and it required a bit of hacking to get it to work!
Definitely creating virtual spaces where people can communicate with each other – especially in a hybrid working environment, that seems to be the greatest opportunity here. Imagine having a virtual office where everyone can join a meeting?!
It could also be a great opportunity for users to interact with data visualisations in a way that isn’t stuck on a laptop screen – the user could walk around them!
Published on March 28, 2022, last updated on March 31, 2022