
Designing for AR with ARKit

Chris is... mainly a Product Designer at Novoda, but sometimes people see him opening pull requests on various projects, so nobody is even sure anymore.

Over the past two weeks at Novoda we’ve been playing around with Apple’s ARKit, and we’ve had tons of fun building demos with it, even if a few colleagues were wondering why we were walking around holding our phones up. Now we want to share our findings with you.

This post is part of a 4-part series on ARKit, in which we talk about setting up a basic Augmented Reality (AR) app with ARKit, designing for AR, testing AR, and using Machine Learning to identify objects. To follow certain parts of this tutorial, you will need an AR app set up so you can see the 3D models in action. You can learn how to do that in the setup post from the same series.

A rather important thing to mention is that there’s a small amount of coding involved in designing for AR. That might sound scary to some, but I believe it’s our responsibility as designers to account for the entire lifespan of our creations throughout the journey of building a product, even when parts of that journey are far from our immediate reach. I’m strongly in favour of going the extra mile and learning new working methods, tools, or even a programming language, to make sure our designs are delivered exactly as intended.

Table of contents

  1. Getting started with 3D modeling
  2. Preparing for export
  3. Lighting the model in Xcode
  4. Supporting animations

Getting started with 3D modeling

To begin creating your first 3D model you will need a 3D modeling app. There’s a plethora of options out there, some of the most popular ones being Blender, 3DS Max, and Cinema 4D (C4D). For the purpose of this post, we’ll be creating our 3D models in C4D, but you can use whichever software you prefer, as long as you’re able to export your models in Collada 1.4 format (.dae extension).

The first important thing to do when we open C4D is go to the Project Settings in the bottom-right panel and change the scale from the default centimeters to meters. ARKit’s unit of length is the meter, and unless we use the same unit, things will appear disproportionate in the real world compared to the size we created them at in C4D.
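As a side note, if you ever receive a model that was authored in centimeters and can’t change the project scale, you can compensate from the code side. A minimal sketch, where bananaNode is a hypothetical SCNNode holding the model:

// Hypothetical node authored in centimeters; scaling by 0.01
// brings its dimensions down to ARKit's meters.
bananaNode.scale = SCNVector3(0.01, 0.01, 0.01)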

Now that we got that out of the way, we can start designing the objects that we’re going to add into our world. Prior to this project I had no knowledge of modeling 3D objects at all. I spent a few days watching tutorial videos on YouTube to get familiar with C4D itself, what goes into creating an object (such as texturing and lighting), as well as the terms that are often used in 3D modeling.

For this tutorial we’ll use a couple of simple objects: this banana, which we’ll use in the AR app, and this low-poly tree, which will illustrate a few important steps in our workflow later.

TurboSquid is a good place to find paid and free 3D objects. Google also recently launched Poly, a library of 3D objects which you can download for free.

Preparing for export

At this point in the process there are a few things to keep in mind. If we were to export this object and plug it into our AR app, we would see an all-white projection of it without any kind of texture, and that’s no good. Textures need to be ‘baked’ into the model in order to be visible in the AR app. We’ll do that in a second.

Let’s set the anchor point of the object at its bottom, i.e. where it is supposed to touch the surface. If we don’t do this, the object might appear to hover over the surface as we move the device around, because it isn’t properly anchored to it. The low-poly tree will help illustrate this point better.

After positioning the anchor point correctly, let’s make sure to set the object’s coordinates to 0, 0, 0 on the X, Y, and Z axes respectively, i.e. the origin point.
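If you’re working with a model you can’t easily edit, SceneKit can also move the anchor point for you by shifting the node’s pivot. A rough sketch, where treeNode is a hypothetical SCNNode containing the model:

// Shift the pivot so the bottom of the bounding box becomes
// the point that touches the surface.
let (minBound, _) = treeNode.boundingBox
treeNode.pivot = SCNMatrix4MakeTranslation(0, minBound.y, 0)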

Another thing to keep in mind is to avoid using any sort of lighting in C4D. The best time to start thinking about lighting your scene is during development time in Xcode. Adding a light source to the scene in Xcode will allow us to have more control over its properties through ARKit’s APIs, as well as the ability to add lighting to a scene that hosts multiple objects. Just like in real life, objects do not carry a light source with them wherever they go, but are lit by external light sources. Doing this will make your project more granular, and thus easier to maintain.

Using a light source will also allow us to cast shadows onto a surface, making objects look like they belong in the real world, rather than look like stickers on your device’s screen. More on shadows later.

We are almost done preparing our model for the AR app. Now it’s time to ‘bake’ the texture so the AR app can pick it up. Hold on, what does it mean to ‘bake’ a texture? Baking essentially generates an image file, called a UVW map, that contains the texture of every polygon of an object. UVW maps are three-dimensional (hence the three coordinate identifiers U, V, and W), which allows them to wrap onto complex geometries.

In the case of our banana, this is an unnecessary step as the author already provides us with a texture, but this is something that in most cases you will have to do for your own 3D objects.

Back to the low-poly tree again! To bake a texture, we select all of our layers and group them into a single unified object, then choose Bake Object from the Objects menu. Check Single Texture and Replace Objects, pick PNG as the format, and set your desired export location. A size of 1024x1024px seemed to maintain good detail on the model when viewed through the AR app.

The model’s material now consists only of the texture we just baked, making all other materials redundant. Don’t worry if it looks blurry; that’s C4D downscaling the texture for faster inline rendering.

Before we export, it’s a good idea to clean up the file by removing any unused layers and materials. To remove unused materials, right-click on any material, then select Remove Unused Materials.

It’s now time to export our model! Generally, ARKit supports two file types: .dae, and .obj along with its .mtl file, but .dae is preferred. If you go to the File menu and then Export, you’ll notice C4D supports both of these options, but you might also notice that there are two .dae exports: COLLADA 1.4 and 1.5. The latter wasn’t working for us, so we exported all our models in COLLADA 1.4.

When exporting to .dae, double-check the options in the export window; in particular, this is where the Export animation checkbox we’ll need later lives.

The .dae file holds a reference to the UVW map created earlier, or, in our case, the texture that came with the 3D model. If you rename the texture file, you need to update the .dae file as well. You can easily do so by opening the file with any text editor. The texture is usually referenced in the first few lines, but if you can’t find it, search for something ending in .png (or whichever format you exported the UVW map to).
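For reference, the texture reference inside a COLLADA file typically looks something like this, where tree_texture.png is a made-up name standing in for your own baked texture:

<library_images>
  <image id="tree_texture" name="tree_texture">
    <!-- update this path if you rename the texture file -->
    <init_from>tree_texture.png</init_from>
  </image>
</library_images>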

Lighting the model in Xcode

For the scope of this post, we’ll assume that you already have an AR app configured. If not, you can learn how to do so by reading this blog post. Let’s start by copying the .dae file as well as the UVW map into the art.scnassets folder. If the files were exported correctly, we should see our model in Xcode, along with its texture, in a 3D environment similar to the one in C4D.
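As a quick sanity check, you can also load the scene in code. A minimal sketch, where banana.dae stands in for whatever your file is named:

// Returns nil if the file name or path is wrong
let scene = SCNScene(named: "art.scnassets/banana.dae")
self.sceneView.scene = scene ?? SCNScene()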

This is the time to add a light source to our object. In Xcode’s Object Library panel on the bottom right, you can see a few types of light sources that you can add to your scene. A Directional light shining from 80–90 degrees overhead seems to be the most realistic one. In addition to the main Directional light, you may choose to add a secondary one to fill in the parts of the object that the main light doesn’t reach, in order to avoid completely black areas. Adjust the intensity of the light sources until you get something that looks relatively realistic.
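If you prefer working in code over the scene editor, the same kind of light can be set up programmatically. A rough sketch, where the angle is illustrative and roughly matches the 80–90 degrees mentioned above:

// A directional light pointing steeply down at the scene
let lightNode = SCNNode()
lightNode.light = SCNLight()
lightNode.light?.type = .directional
lightNode.eulerAngles = SCNVector3(-Float.pi / 2.2, 0, 0)
self.sceneView.scene.rootNode.addChildNode(lightNode)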

A crucial part of making virtual objects appear as if they’re part of the real world is to include shadows. Drop shadows, i.e. shadows cast onto the surface the object is resting on, are what make it look tangible and create the illusion that it is in fact anchored to a surface.

To add a shadow, while still having the main Directional light selected, go into the Attributes Inspector panel and make sure Casts shadows is enabled, and the mode is set to Deferred.
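The code equivalent, reusing the hypothetical lightNode from the earlier sketch, would look something like this:

// Mirror the Casts shadows checkbox and the Deferred mode setting
lightNode.light?.castsShadow = true
lightNode.light?.shadowMode = .deferred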

So now that the light source can cast shadows, we need a surface on which to cast them. From the Object Library, add a Plane object to your scene, which is essentially a surface without any thickness. Let’s adjust the size and placement of the Plane to sit right under your model (or set 0, 0, 0 as its X, Y, Z coordinates), and now we should see the model’s shadow cast on the Plane. There is a very obvious problem, though: the Plane object has a texture, but the end goal is to have our model’s shadow cast on whatever real surface the model is resting on, not on a surface embedded in the model itself.
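For completeness, adding such a plane in code might look like the sketch below, with illustrative sizes; we’ll deal with its visible texture in a moment:

// SCNPlane is vertical by default, so rotate it to lie flat
// underneath the model at the scene's origin
let planeNode = SCNNode(geometry: SCNPlane(width: 1.5, height: 1.5))
planeNode.eulerAngles.x = -Float.pi / 2
planeNode.position = SCNVector3(0, 0, 0)
self.sceneView.scene.rootNode.addChildNode(planeNode)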

Fixing that is as easy as selecting the plane, going to the Material Inspector, and unchecking all color channels from the Write To Color property under the Transparency settings. In addition to that, choose Constant in the Lighting Model dropdown menu. Now we should see a shadow under the 3D object while the surface itself is invisible, which is what we were going for. The surface might also appear darkened in the editor, but this effect won’t be visible in AR.

Xcode might ask you to convert the .dae file to a .scn file. That’s okay for now, but it might prove to be a problem if your models also animate. However, there is a way to fix that. We’ll be exploring that in a bit.

The default settings cast a hard shadow on the surface, in other words a shadow with crisp edges. A more diffused shadow will look more realistic in most lighting conditions. To adjust the diffusion, go back to the Attributes Inspector while having the Directional light selected, and tweak the shadow’s sample radius and sample count until you get something that looks good. You can also lower the opacity of the shadow colour for an optimal look.
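These knobs map onto SCNLight properties as well; the values below are just starting points, again reusing the hypothetical lightNode:

// A larger radius blurs the shadow edges, more samples smooth
// the blur, and the alpha value lightens the shadow
lightNode.light?.shadowRadius = 8
lightNode.light?.shadowSampleCount = 16
lightNode.light?.shadowColor = UIColor.black.withAlphaComponent(0.5)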

The final step to achieving realism with ARKit is lighting your model in accordance with the environment. Imagine placing an object in a relatively dark room: the object would look quite dark too, right? The opposite goes for objects placed in a bright environment. This is something we have to take into account when designing for AR.

The ARKit API calculates and provides a light estimate for each frame, which includes the estimated intensity of the ambient light in lumens.

// Disable SceneKit's automatic default lighting so our own light is in control
self.sceneView.autoenablesDefaultLighting = false

// The current frame, and with it the estimate, may not be available yet
guard let estimate = self.sceneView.session.currentFrame?.lightEstimate else { return }

Now that we have an estimated intensity value for the real-world light, it’s easy to adjust the intensity of our virtual light source to match. This way we ensure that there’s consistent lighting between the model and the environment around it.

// Match the virtual light's intensity to the real-world estimate
guard let light = self.lightNodeModel?.light else { return }
light.intensity = estimate.ambientIntensity

The model is now ready to be placed in the real world!

Supporting animations

Animating 3D objects has proven to be easier than we thought: the .dae format is able to store animations made in C4D, and there are no extra steps involved to make them play. Easy as that!

Animating in C4D is similar to After Effects, so if you’re familiar with that software you won’t find it very difficult to adapt. Admittedly, C4D feels very old and clunky, and this process can turn out to be very slow and frustrating at times.

To export the animation, don’t forget to check the final “Export animation” box in the Collada 1.4 export window. ARKit will pick up the animation automatically and will play it in a loop, just like a gif.

That’s all fine and dandy… until we try to add a plane, Xcode asks us to convert the file to .scn, and as soon as we accept, the animation no longer plays. For some reason, the .scn format doesn’t retain the animations of the .dae file. So we have to find another way to add our light sources and shadows while still maintaining the animations of the model.

A fairly small amount of coding is required. Adding a light source and a plane to our .dae file won’t require us to convert the file to .scn, but altering these objects’ properties in the editor will. Who’s to say we can’t do that programmatically, though?

// Grab the plane's geometry and stop its material from drawing
// any colour, so only shadows cast onto it remain visible
let plane = self.planeNodeModel?.geometry

plane?.firstMaterial?.writesToDepthBuffer = true
plane?.firstMaterial?.colorBufferWriteMask = []
plane?.firstMaterial?.lightingModel = .constant

This piece of code is the equivalent of unchecking all the colour channels from the Write To Color property of the plane’s material and setting the Lighting Model to Constant, thus drawing only the shadows that objects cast on it. For this to work, the plane must have a material applied to it. C4D doesn’t apply a material to a new geometry by default, so make sure you apply a basic one (it doesn’t matter what it looks like, as it’s not going to be visible).

Your model is now lit properly and anchored to the surface realistically, all while doing all sorts of crazy dance moves.

Conclusion

That’s about all you need to know to get started designing for AR! We went through using a 3D modeling app to create an object, texturing it, adding lighting and shadows, animating it, and finally plugging it into the AR app.

This post is part of a 4-part series on AR, in which we talk about setting up a basic AR app, designing for AR, testing AR (coming soon), and using Machine Learning to identify objects (also coming soon). You can find the project files of these two demos here.

Got any comments or questions? Hit me up on Twitter @BashaChris
