Given the incredible success of my teapot-stacking AR app, I was hungry for more - especially with all of ARKit's improvements following on from WWDC 2018! So here's how I built an AR multiplayer Jenga game. In a week.
Before getting into the how, we should cast our minds back to the start of June and the WWDC keynote. Just like the previous year, it was captivating, and I left with countless ideas about how we could improve our applications with the newly introduced features. Apple focused on improving performance, especially for older devices (hey Android, take note!). Still, they also announced new functionality for Siri, notifications, and an improved Stocks app! It’s hard not to be excited about that…
Although news about the Stocks app is incredibly thrilling, it was the new functionality introduced in ARKit 2.0 that really caught my attention. Outside of apps, it’s now possible to take a ‘quick look’ at 3D objects (a great way to make e-commerce more engaging), measure objects with your phone (leave those measuring tapes at home!), and tongue tracking allows for new interactions with Animojis… whatever they are. But, the headline for me was the sharing of AR experiences in real time across multiple devices.
Wouldn’t that just be great for a teapot-stacking AR game?!
In the end, I decided against stacking teapots and instead took on a new project – giant Jenga! I also decided to fully immerse myself in the beta, so I parked Unity and dived head first into Swift! What could possibly go wrong…
What I got up to:
Before I made a start on any code, I took the opportunity to check out Apple’s sample game SwiftShot – a multiplayer game for up to six players, in which two teams battle to knock over each other’s slingshots. Example projects like these help to expand on the technical documentation and press releases – showing the new technology in action and how Apple sees it being used.
After I stopped playing around with the game, I made notes about the various architectural patterns Apple used and created my own project. By the end of Monday, I had added horizontal plane detection to the project and I was able to detect surfaces around the office!
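Plane detection itself needs surprisingly little code. A minimal sketch of the idea, assuming an `ARSCNView`-based view controller (the class and outlet names here are mine, not the actual project’s):

```swift
import ARKit
import SceneKit
import UIKit

class GameViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Ask ARKit to look for horizontal surfaces (desks, floors, meeting-room tables...)
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.delegate = self
        sceneView.session.run(configuration)
    }

    // Called whenever ARKit anchors a newly detected plane
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        print("Detected a plane of extent \(planeAnchor.extent)")
    }
}
```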
Unity vs SceneKit:
At this point, SceneKit really shone. It was incredibly easy to get going – I didn’t need to do anything beyond creating the project and adding a few lines of code. Admittedly, there is a bias towards SceneKit here (seeing as I’ve been an iOS developer far longer than I’ve been using Unity!), but so far I was having a better experience with SceneKit and Swift. At this point, anyway…
What I got up to:
After sleeping on the idea, I decided to ‘borrow’ the SwiftShot project, and add the Jenga game into that.
“But, why?!”, I hear you ask. Well, Apple had obviously invested a lot of time in perfecting this project. It’s nicely architected, clean to read and navigate, and it makes use of various technologies. I realised that one week would not have been enough to replicate all of this from scratch – I would have had to handle physics, networking, scene mapping, and interfaces myself. Why reinvent the wheel when it’s already been so nicely made?
With that in mind, I began to strip the SwiftShot project of all unnecessary functionality – leaving a base project I could add my game into.
Well, about that. The software was still in beta, and I was slowed down by a number of crashes. Attempting to edit some objects led to crashes, unresponsive apps, and some strange rendering issues in the 3D editor.
By the end of Tuesday, I had a stripped-down project ready to build on come Wednesday!
Unity vs SceneKit:
At this point, I was starting to wish I were using Unity. As an editor built specifically for this kind of work, it seemed far more mature and reliable. Furthermore, much of the functionality I needed from SwiftShot is already provided out of the box in Unity!
What I got up to:
Powered by blind optimism, I took the project I cleared out on Tuesday and built the Jenga tower. After this, I began making each block interactive – what’d be the point in a Jenga game that couldn’t be played?!
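Building the tower is mostly arithmetic: classic Jenga is 18 levels of three blocks, with each level rotated 90° from the one below. A rough sketch of the layout maths – the names and block dimensions here are illustrative, not the actual project code:

```swift
// A nominal Jenga block, in metres: 7.5cm long, 1.5cm tall, 2.5cm wide
let blockSize = SIMD3<Float>(0.075, 0.015, 0.025)

struct BlockPlacement {
    let position: SIMD3<Float>  // block centre, relative to the tower's base
    let rotatedY: Bool          // odd levels run at 90° to even ones
}

// Classic Jenga: 18 levels of three blocks each
func towerLayout(levels: Int = 18) -> [BlockPlacement] {
    var placements: [BlockPlacement] = []
    for level in 0..<levels {
        let rotated = level % 2 == 1
        // Stack each level directly on top of the previous one
        let y = (Float(level) + 0.5) * blockSize.y
        for slot in -1...1 {
            // Neighbouring blocks sit side by side across their width
            let offset = Float(slot) * blockSize.z
            let position = rotated
                ? SIMD3<Float>(offset, y, 0)
                : SIMD3<Float>(0, y, offset)
            placements.append(BlockPlacement(position: position, rotatedY: rotated))
        }
    }
    return placements
}
```

Each placement then becomes an `SCNNode` with an `SCNBox` geometry, with its `eulerAngles.y` set to `.pi / 2` when `rotatedY` is true.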
Then, suddenly I noticed something. The sample project came with an extra feature I hadn’t yet discovered – the AR objects can be scaled. Giant Jenga: meet SUPER GIANT JENGA!
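Scaling like this is typically driven by a pinch gesture. Stripped right down, the idea looks something like the sketch below – the controller and node names are my own, not SwiftShot’s:

```swift
import SceneKit
import UIKit

class TowerViewController: UIViewController {
    let towerNode = SCNNode()  // parent node holding the whole tower

    override func viewDidLoad() {
        super.viewDidLoad()
        let pinch = UIPinchGestureRecognizer(target: self, action: #selector(handlePinch))
        view.addGestureRecognizer(pinch)
    }

    @objc func handlePinch(_ gesture: UIPinchGestureRecognizer) {
        guard gesture.state == .changed else { return }
        // Apply the incremental pinch factor, then reset it so the
        // next callback delivers only the change since this one
        let factor = Float(gesture.scale)
        towerNode.simdScale *= factor
        gesture.scale = 1
    }
}
```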
Unity vs SceneKit:
For this stage, I do feel that Unity would have been the better tool to use. Again, it feels more mature, and this task would have taken far less time.
What I got up to:
I continued fully configuring the blocks so they could be interacted with.
This took a while longer than I had anticipated – partly due to my somewhat lacking grasp of maths (I should have paid more attention at college!), but also due to the way the project was architected. I had inherited the sample project’s structure, and was therefore heavily limited in which parts I could expand on. For example, interaction events on a block were passed down through multiple layers, and by the time they arrived, a significant amount of data had been lost. The sample project didn’t need that data, but now I did! After a great deal of rearchitecting and debugging, I finally managed to perform a raycast and detect when a block had been touched.
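The raycast itself boils down to a SceneKit hit test – casting a ray from the 2D touch point into the 3D scene. A minimal sketch of the approach, assuming block nodes were named when the tower was built (all names here are mine):

```swift
import ARKit
import SceneKit
import UIKit

class GameViewController: UIViewController {
    var sceneView: ARSCNView!  // assumed to be configured elsewhere

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        let point = touch.location(in: sceneView)
        // Cast a ray from the screen point into the scene; results are
        // ordered nearest-first along the ray
        let hits = sceneView.hitTest(point, options: nil)
        // Assume each block node was named "block-N" when the tower was built
        if let block = hits.first(where: { $0.node.name?.hasPrefix("block") == true })?.node {
            print("Touched \(block.name ?? "?")")
        }
    }
}
```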
Unity vs SceneKit:
Yet again, Unity came out on top. I feel this functionality would have been simpler to integrate there, and it is also more widely documented – the equivalent in SceneKit seemed to be barely documented at the time I was creating this project!
What I got up to:
Having got to a point where blocks could be interacted with, I had to decide how to neatly tie off the project, having spent my week on it. In the end, I decided to make a block disappear after it had been selected – while this isn’t authentic Jenga behaviour, it allowed me to bring the project to a neat close while leaving room for further expansion down the line.
I ran the project, then realised that the physics were yet to be fully configured – leaving me with a floating Jenga tower. A few lines of code later, and the tower functioned as expected!
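Those few lines amount to giving each block a dynamic physics body, and the floor a static one, so gravity has something to rest the tower on. Roughly – the dimensions, parameter values, and names here are illustrative, not the project’s actual code:

```swift
import SceneKit

// Each Jenga block: a box with a dynamic body so gravity and collisions apply
func makeBlock(size: SCNVector3) -> SCNNode {
    let geometry = SCNBox(width: CGFloat(size.x), height: CGFloat(size.y),
                          length: CGFloat(size.z), chamferRadius: 0.002)
    let node = SCNNode(geometry: geometry)
    node.physicsBody = SCNPhysicsBody(type: .dynamic,
                                      shape: SCNPhysicsShape(geometry: geometry, options: nil))
    node.physicsBody?.mass = 0.1
    node.physicsBody?.friction = 0.8  // wooden blocks grip each other
    return node
}

// The detected surface: a static body the tower can stand on,
// unaffected by gravity itself
func makeFloor() -> SCNNode {
    let node = SCNNode(geometry: SCNFloor())
    node.physicsBody = SCNPhysicsBody(type: .static, shape: nil)
    return node
}
```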
Unity vs SceneKit:
I’m starting to see a bit of a theme here: Unity would again have been a nicer tool for the job!
I began the week believing that I would prefer SceneKit, yet by the end of it, I felt that Unity delivered the required functionality in a better manner. Seeing as I’d started the week by deliberately stepping away from Unity, it felt strange to suddenly favour it! However, as a more mature platform with superb documentation and tutorials, it was easy to settle into even after just a week, and I’d be likely to choose it in the future. This doesn’t mean that SceneKit is necessarily the wrong option, though – some projects may require diving into code that Unity helpfully abstracts away, or certain features may be easier to build outside an editor.
What does this mean in the future for my other AR ideas? In the end, I can’t help but think more about Unity.
Published on July 24, 2018, last updated on March 15, 2023
We dedicate 15% of our team’s time to exploring emerging technology and working on projects they’re passionate about. So far, we’ve developed a Jenga game in augmented reality, an app that mimics the human eye, and an interactive map that tracks natural disasters in real time.